
method detection limit study/calculations

Discussions about GC and other "gas phase" separation techniques.

6 posts Page 1 of 1
One of the goals set for me in 2007 has been to determine the minimum detection limit of each individual compound tested for in our finished products. This is approximately 50 compounds across two methods. How would you suggest this study be conducted?
I read that the minimum detection limit is an amount that will produce a signal distinguishable from the noise level 99% of the time. Any opinions on this?
How does everyone determine their minimum detection limits?

Thanks Again
Micheal

There are two limits that you should be concerned about:
the Lower Limit of Quantification (LLOQ), which is usually around 10x the noise, and the Limit of Detection (LOD), which is usually about 3x the noise.

You normally report actual numbers for peak areas/heights above the LLOQ, and "Not Detected" or "less than XXX" (where XXX is the LLOQ) for compounds that aren't detected.

How you report peaks with areas/heights between the LOD and the LLOQ can depend on the regulatory environment and/or company procedures. Some organisations use "less than xxx" for such peaks, with "Not Detected" reserved for areas/heights smaller than the LOD.

These days, regulators like to see actual confirmation of both limits, especially LOD, by using actual standards diluted to those concentrations. The quick and dirty method for determining LOD and LLOQ was to measure the noise, and use the 3x and 10x definitions. Saves the nausea of actually making up and running standards.
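Bruce's "quick and dirty" 3x/10x rule is easy to write down. A minimal sketch in Python, assuming you already have a baseline noise value and a response factor (signal per unit concentration) for each compound; both numbers below are invented illustrations, not real data:

```python
# Signal-to-noise estimate of detection/quantification limits, as described
# above: LOD at a signal 3x the baseline noise, LLOQ at 10x the noise.

def sn_limits(noise, response_factor):
    """Return (LOD, LLOQ) in concentration units.

    noise           -- baseline noise in signal units (peak height/area)
    response_factor -- signal per unit concentration for this compound
    """
    lod = 3 * noise / response_factor
    lloq = 10 * noise / response_factor
    return lod, lloq

# Hypothetical numbers: noise of 0.5 area units, response of 120 area
# units per ppm of analyte.
lod, lloq = sn_limits(noise=0.5, response_factor=120.0)
```

You would repeat this per compound and per instrument, since both the noise and the response factor differ between them.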

Most industries have some guidance on such matters, so a www search could be very helpful.

Bruce Hamilton

Hi,

Bruce Hamilton has already explained this in detail; I will try to add a few extra bits of info.

To distinguish signal from noise 99% of the time, the LOD is taken as 3x the noise. Random noise is deemed to follow a normal distribution, and the noise level is assumed equal to its standard deviation. From statistics, we know that for a normal (Gaussian) distribution, three standard deviations cover about 99.7% of the distribution, comfortably more than 99%.

LOQ, or to be more specific LLOQ (lower limit of quantification, since quantification also has an upper limit), leaves a safety margin for quantification. It is usually calculated as 2.5 to 3 times the LOD, or more directly as 10 times the noise, but to my knowledge it has no physical meaning.

A newer approach uses the terms "decision limit" (CCα) and "detection capability" (CCβ). An international standard, ISO 11843, explains this approach in detail. It explicitly limits both false positives and false negatives, but to be fair it's rather complicated and requires a good deal of statistics knowledge. There are also IUPAC guides for these new "decision limit" and "detection capability" concepts, one authored by Lloyd Currie and one from Olivieri (or very similar name).
A very rough summary of the approach: a representative matrix spiked with very low amounts of analyte, around the expected limit, is analyzed, and from the calibration curve statistics a standard deviation is calculated and multiplied by some coefficient.

Personally, I wouldn't choose this "decision limit" approach if it's not obligatory. An EU Commission Directive regarding residues in food and feed adopted this approach for the performance of analytical methods, and it will be obligatory for EU countries after September 2007 (five years after the directive).

Assuming there are no obligations or guides to follow in your case, I would spike my analytes (or a selected subset, if all at once is complicated) into the matrix at a very low level, around the expected limit, which you can easily find by serially diluting a standard solution, for both methods.

Then analyse multiple samples (6-10) of this spiked matrix (each going through the full sample preparation) and calculate the standard deviation for each analyte. You may then multiply this standard deviation by 3 to obtain the LOD and by 10 for the LOQ.
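The replicate-spike recipe above can be sketched in a few lines of Python; the seven replicate results below are invented, and the sample standard deviation (n-1 denominator) is used:

```python
# Replicate-spike estimate: run n spiked samples through the whole method,
# then take 3*SD as the LOD and 10*SD as the LOQ.
import statistics

def limits_from_replicates(results):
    """Return (LOD, LOQ) from replicate measurements of one spiked analyte."""
    sd = statistics.stdev(results)   # sample standard deviation (n - 1)
    return 3 * sd, 10 * sd

# Seven spiked replicates, concentrations in ppm (hypothetical data).
replicates = [0.101, 0.097, 0.104, 0.099, 0.095, 0.102, 0.098]
lod, loq = limits_from_replicates(replicates)
```

Note that some guidances, for example the US EPA MDL procedure, multiply the standard deviation by the one-tailed Student's t at 99% confidence for n-1 degrees of freedom (3.143 for seven replicates) rather than a flat 3, which is essentially where the "99% of the time" wording in the original question comes from.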

Best luck,
Bulent

How value-added is this task?

Hi Consumer Products Guy,

Not very much, I believe, at least for this new decision-limit approach. The other suggestion, spiking analytes at a low level and then finding the standard deviation, helps establish the method's limits with sample prep included.

I should admit that ISO 11843, or the EC Decision, has a point too. The EC Decision defines performance characteristics of analytical methods for residues in food and feed. They want to be sure (at least at a statistically defined level) that there are no false negatives or positives.

Thanks,
Bulent

If you're using validated methods, you should look at the validation reports. If there are no LOD or LOQ values in them, the linear regressions from the accuracy portions should give you a rough idea: look at a given system's baseline noise, take some multiple of it (a decently large one if you're only allowed to get it wrong 1% of the time), and figure out what analyte concentration corresponds to that signal. Repeat for typical-to-high noise values from each variety of instrument used.
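Getting limits out of an existing linear calibration can be sketched as below, using the common ICH Q2-style formulas LOD = 3.3*s/slope and LOQ = 10*s/slope, where s is the residual standard deviation of the regression. The calibration points here are invented for illustration:

```python
# Estimate LOD/LOQ from a linear calibration via ordinary least squares.
import statistics

def regression_limits(conc, response):
    """Return (LOD, LOQ) from calibration data using 3.3*s/slope, 10*s/slope."""
    n = len(conc)
    mx, my = statistics.fmean(conc), statistics.fmean(response)
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, response)) / sxx
    intercept = my - slope * mx
    # Residual standard deviation about the fitted line (n - 2 d.o.f.)
    ss_res = sum((y - (intercept + slope * x)) ** 2
                 for x, y in zip(conc, response))
    s = (ss_res / (n - 2)) ** 0.5
    return 3.3 * s / slope, 10 * s / slope

# Hypothetical five-point calibration: concentrations in ppm, peak areas.
conc = [0.1, 0.2, 0.5, 1.0, 2.0]
resp = [12.4, 24.1, 60.3, 119.8, 240.6]
lod, loq = regression_limits(conc, resp)
```

For this to be meaningful the calibration points should sit near the expected limit; a curve spanning much higher concentrations will inflate the residual and hence the limits.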

If there is no linear regression based accuracy mentioned in any validation report, tell your boss that you have more important goals to address!
Thanks,
DR