Limits of Detection:
Earlier, we mentioned that the uncertainty in the intercept of the regression line had implications for the lowest detectable signal and corresponding concentration. Remember that there is always some error (or uncertainty) associated with any measurement, even if that measurement is taken under conditions for which no analyte is present - the blank, background, or baseline measurement.
If we could continuously monitor the raw electrical signal (voltage or current) inside an instrument during a set of measurements, we might see something like the diagram shown below: here, the signal is subject to both short-term (high frequency) and long-term (low frequency) noise, as well as some long-term drift. The red line shows the drift in the background, or baseline, signal, while the green line shows that in the sample signal. The blue line represents the average difference between the sample and background, which is the value our instrument reports to the user.
Continuous monitoring of the raw signal within an instrument
In terms of the instrument signal, therefore, we are interested in determining the smallest signal that is distinguishable from the background (baseline) noise. Various criteria have been applied in quantifying this limit, but the generally accepted rule is that the signal must be at least three times the standard deviation of the background noise.
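The three-sigma criterion can be applied directly to replicate blank measurements: the signal limit of detection is the mean blank signal plus three times its standard deviation. A minimal sketch, using hypothetical blank readings (the values and units here are illustrative, not from the text):

```python
import statistics

# Hypothetical replicate measurements of the blank (instrument signal units)
blank_signals = [0.21, 0.18, 0.24, 0.20, 0.19, 0.22, 0.17, 0.23]

mean_blank = statistics.mean(blank_signals)   # average baseline signal
s_blank = statistics.stdev(blank_signals)     # sample standard deviation = noise estimate

# Signal limit of detection: mean blank plus three times the blank noise
y_lod = mean_blank + 3 * s_blank

print(f"mean blank = {mean_blank:.3f}")
print(f"s_blank    = {s_blank:.4f}")
print(f"y_LOD      = {y_lod:.3f}")
```

Any measured signal below y_lod cannot be reliably distinguished from the baseline noise.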
When performing chemical measurements, however, we are more interested in the smallest concentration of the species being measured that is necessary to generate the minimum detectable signal. We can therefore distinguish between the signal limit of detection (y_LOD) and the concentration limit of detection (C_LOD).