
Quantitation using LC/MS

Discussions about GC-MS, LC-MS, LC-FTIR, and other "coupled" analytical techniques.

For good precision and accuracy, is it a requirement to use an isotopically labeled internal standard that resembles the structure of the analyte? What about numerous analytes of unknown structure, where an isotope-labeled standard cannot be used? I am referring to simple matrices.

Dear Indium,

I think it all depends on what you call good precision and accuracy. We are doing quantitative analysis in biological matrices with LC/MS/MS and never use internal standards. Still, our validation results are in compliance with FDA/EMEA guidelines.

What accuracy and precision do you need?

Regards bert

The precision should be no worse than 10% RSD, and in some cases <2.0% RSD. The accuracy should be very high. In some cases we would be quantitating unknowns at low levels, and in other cases a known compound at high levels. I am under the impression that the degree of ionization of a compound can vary from one run to another, and especially from one instrument to another, and thus result in high RSDs.

Less than 2% RSD can be difficult in LC/MS, even with solvent standards and fairly high concentrations. If you are rigid about this requirement, you may find yourself frustrated.

To answer your question, I know of no such requirement, although it is ideal if you can obtain such an internal standard. Any ISTD, ideally as similar as possible to the compound of interest, will correct for autosampler errors. If spiked into the sample prior to preparation, it will also correct for variations in the sample prep (some people call this a "surrogate"). An isotopically labeled version of your analyte, used as ISTD, has the added bonus of correcting for ion suppression or enhancement due to endogenous matrix components.
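To make the mechanics concrete, here is a minimal sketch of internal-standard quantitation in Python (all peak areas and concentrations are invented for illustration): calibrators and samples carry the same ISTD spike, and everything is quantitated from the analyte/ISTD area ratio, so injection-to-injection variation cancels out.

import numpy as np

# Hypothetical calibration standards: known concentrations and measured peak areas
cal_conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])            # ng/mL
cal_area_analyte = np.array([980, 5100, 9900, 51200, 99500])
cal_area_istd = np.array([10000, 10300, 9800, 10100, 9950])   # same ISTD spike in every vial

# Calibrate response ratio vs. concentration (simple unweighted linear fit)
ratio = cal_area_analyte / cal_area_istd
slope, intercept = np.polyfit(cal_conc, ratio, 1)

# Quantitate an unknown from its analyte/ISTD area ratio
sample_ratio = 2550 / 10050
sample_conc = (sample_ratio - intercept) / slope
print(f"Estimated concentration: {sample_conc:.2f} ng/mL")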

A long time ago we used to call this use of an isotope-labeled ISTD "isotope dilution", and we thought quantitative work with MS was only possible that way. Thus, one of the greatest surprises to me in "recent" analytical development is that MS quantitation is done routinely with LC/MS these days.

Quoting the earlier post: "I am under the impression that the degree of ionization of a compound can vary from one run to another and especially from one instrument to another and thus result in high RSD's."
I am a little confused by the statement that the degree of ionization can vary from one instrument to another and that this is a cause of high RSDs. Even though instruments may vary in the degree of ionization, this should not affect RSDs. It doesn't matter whether you ionize 5% or 50% of the compound present; as long as you ionize the same percentage from run to run, you will have good RSDs.
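A quick toy calculation in Python (the 3% run-to-run variability and all other numbers are invented) shows why: scaling every replicate by the same ionization efficiency leaves the RSD unchanged.

import numpy as np

rng = np.random.default_rng(0)
analyte_amount = 1e6                            # amount reaching the source (arbitrary units)
run_to_run = rng.normal(1.0, 0.03, size=10)     # ~3% run-to-run variation, same for both cases

for efficiency in (0.05, 0.50):                 # ionize 5% vs. 50% of the compound
    signal = analyte_amount * efficiency * run_to_run
    rsd = 100 * signal.std(ddof=1) / signal.mean()
    print(f"ionization {efficiency:.0%}: RSD = {rsd:.1f}%")

Both cases print the same RSD; only an efficiency that changes from run to run would hurt precision.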

MG, there is a big difference between an internal standard and a surrogate to those of us with an environmental background. An internal standard is added to a sample after all preparation steps are complete. In this case the concentration of the added compound is known and can be used for quantitation. If the compound is added before sample processing, any losses are assumed to be the same as those of the analytes, and you can calculate a recovery. Since you no longer know the true concentration after those losses, it cannot be used for quantitation.

Ron, if I assume

(1) That my surrogate has the same response factor as my analyte, and
(2) That my surrogate has the same recovery as my analyte

(both reasonable assumptions for an isotope-labeled version of the analyte)

then I would argue that the surrogate can be used for quantitation, in the same manner that the internal standard (spiked after prep) is used for quantitation. In fact, this is done in some circles, and what you and I would call the "surrogate" is simply called the "internal standard" and is used as such. When speaking with someone on a new project and the term "internal standard" comes up, I am always careful to have them define it.
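For what it's worth, here is a minimal sketch of that calculation with hypothetical areas and spike level; the rrf argument makes assumption (1) explicit, and using the pre-prep spike concentration directly relies on assumption (2).

def isotope_dilution_conc(area_analyte: float,
                          area_surrogate: float,
                          surrogate_spike_conc: float,
                          rrf: float = 1.0) -> float:
    """Analyte concentration from the area ratio to a co-spiked labeled surrogate.

    rrf is the relative response factor (analyte response / surrogate response);
    assumption (1) corresponds to rrf = 1.0, and the pre-prep spike concentration
    is used directly on the strength of assumption (2).
    """
    return (area_analyte / area_surrogate) * surrogate_spike_conc / rrf

# Hypothetical sample: labeled surrogate spiked at 25 ng/mL before extraction
print(isotope_dilution_conc(area_analyte=41800,
                            area_surrogate=20500,
                            surrogate_spike_conc=25.0))     # ~51 ng/mL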

MG, the assumptions are reasonable, and you could indeed calculate concentrations by spiking the internal standard before the sample preparation. If you used a compound spiked into the samples as a surrogate, then you would have an idea of the efficiency of your sample preparation process, which is what we often did. If the recovery of the surrogate dropped below a defined level, the preparation of that sample was suspect, or there were possible interferences present. I prefer the use of an internal standard spiked after the preparation for quantitation, and the use of surrogates as quality control for sample preparation.
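As a sketch of that QC use (the 70-130% acceptance window and the numbers below are only examples, not a regulatory limit):

def surrogate_recovery(measured_conc: float, spiked_conc: float) -> float:
    """Percent recovery of the surrogate in a prepared sample."""
    return 100.0 * measured_conc / spiked_conc

def prep_acceptable(measured_conc: float, spiked_conc: float,
                    low: float = 70.0, high: float = 130.0) -> bool:
    """Flag the prep as suspect if surrogate recovery falls outside the window."""
    return low <= surrogate_recovery(measured_conc, spiked_conc) <= high

print(prep_acceptable(measured_conc=14.2, spiked_conc=25.0))  # False: ~57% recovery, prep is suspect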

There is nothing inherently wrong with adding the internal standard first and then prepping and quantitating, but I would want to perform a thorough study, using surrogates plus internal standards added after the sample prep, to verify that the whole method from preparation through analysis is robust.

