I have to assume, based on the information provided, that sa is the standard error (uncertainty) of the intercept and sb is the standard error (uncertainty) of the slope. I do not have enough context to say what a "low enough" value of sa/sb would be, and the technical nature of this question is beyond the scope of the support that can be provided here. I can, however, share some principles and best practices. In method development you need to determine the slope (sensitivity) as well as the limit of detection (LOD) and limit of quantification (LOQ). The relative uncertainty near zero is usually large. Getting linear regression right requires covering several topics, including an understanding of residuals (the difference between an observed y value and the y value calculated from the fitted line equation) and of the regression statistics.
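As an illustration of those regression statistics, here is a minimal sketch of how sa, sb, and ICH-style LOD/LOQ estimates can be computed from a calibration data set by ordinary least squares. The concentrations and responses are made-up example values, and the 3.3/10 multipliers follow the common ICH Q2 convention of basing LOD/LOQ on the standard error of the intercept; your method may prescribe a different approach.

```python
import math

# Hypothetical calibration data: x = standard concentrations, y = instrument responses
x = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
y = [0.05, 1.98, 4.10, 5.95, 8.02, 10.01]

n = len(x)
x_mean = sum(x) / n
y_mean = sum(y) / n
sxx = sum((xi - x_mean) ** 2 for xi in x)

# Least-squares slope (b) and intercept (a)
b = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y)) / sxx
a = y_mean - b * x_mean

# Residuals and residual standard deviation s_yx (n - 2 degrees of freedom)
residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
s_yx = math.sqrt(sum(r ** 2 for r in residuals) / (n - 2))

# Standard errors of the slope (sb) and intercept (sa)
sb = s_yx / math.sqrt(sxx)
sa = s_yx * math.sqrt(sum(xi ** 2 for xi in x) / (n * sxx))

# ICH-style estimates based on the intercept's standard error (an assumed convention)
lod = 3.3 * sa / b
loq = 10 * sa / b

print(f"slope b = {b:.4f}, intercept a = {a:.4f}")
print(f"sa = {sa:.4f}, sb = {sb:.4f}")
print(f"LOD ~ {lod:.3f}, LOQ ~ {loq:.3f} (concentration units)")
```

Note that sa and sb shrink as more standards are added and as the standards span a wider range (larger sxx), which is one reason the design of the calibration set matters.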
The objective is to set up a calibration that reliably predicts analyte concentration from the instrument response through the regression equation. It is not best practice to measure near zero. It is advisable to start with five to seven standard concentration points, equally spaced and covering the range of interest. Include a standard blank, and select the range so that the majority of test samples fall near the centre of the calibration range, because that is where the uncertainty associated with the predicted concentration is lowest. Plot and examine the residuals, do not force the intercept to zero, and calculate the uncertainty (prediction interval) for test sample concentrations obtained from the calibration equation. Depending on the purpose of the method, you then need to assess how much the calibration uncertainty contributes to the overall measurement uncertainty, and whether that contribution is significant.
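The prediction-interval step above can be sketched as follows, using the standard inverse-calibration formula (as given in texts such as Miller and Miller) for the standard error of a concentration predicted from a measured response. The data, the measured response y0, and the replicate count m are assumed example values; the t critical value is hard-coded for this particular data set's degrees of freedom.

```python
import math

# Same hypothetical calibration data as before (x = concentration, y = response)
x = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
y = [0.05, 1.98, 4.10, 5.95, 8.02, 10.01]

n = len(x)
x_mean = sum(x) / n
y_mean = sum(y) / n
sxx = sum((xi - x_mean) ** 2 for xi in x)
b = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y)) / sxx
a = y_mean - b * x_mean
s_yx = math.sqrt(sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2))

# Unknown test sample: mean of m replicate responses (assumed values)
y0 = 5.2
m = 3

# Predicted concentration by inverse calibration
x0 = (y0 - a) / b

# Standard error of the predicted concentration
s_x0 = (s_yx / b) * math.sqrt(1 / m + 1 / n + (y0 - y_mean) ** 2 / (b ** 2 * sxx))

# 95% interval; t critical value for n - 2 = 4 degrees of freedom
t_crit = 2.776
print(f"x0 = {x0:.3f} +/- {t_crit * s_x0:.3f}")
```

The (y0 - y_mean)^2 term in the formula is why the interval is narrowest for samples whose response falls near the centre of the calibration range, which is the rationale for centring the expected sample concentrations there.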
Depending on the instrument and the purpose of the method, I suggest you reach out to your supplier for application guidelines.