by DJtech » Fri Aug 24, 2012 1:13 pm
I experience something similar, though my setup may be slightly different: gradient 20-90 % MeOH, ammonium formate pH 4.1.
I measure compounds X and Y with X-D4 and Y-D4, respectively, as internal standards (D4 = labelled with four deuterium atoms).
(The sample matrix is a PBS-like buffer at pH 7.4: 2 mM phosphate, 140 mM NaCl, with magnesium, potassium and calcium each at ~1-2 mM.)
If I prepare a run with a system suitability test at the beginning, in 1 ml vials, the X/X-D4 and Y/Y-D4 ratios look fine (RSD of the ratio = 2.3 %).
Then comes the calibration curve (in 150 µl insert vials placed inside the 1 ml vials):
3 injections of QC3, 3 injections of QC2, 3 injections of QC1 (150 µl vials)
calibration curve measurement 2
injection of QC3, QC2, QC1
calibration curve measurement 3
injection of QC3, QC2, QC1
calibration curve measurement 4
injection of QC3, QC2, QC1
Each QC is back-calculated against the calibration curve run just before it. Only after the final calibration curve (~injection 50) do the QCs come within the 15 % accuracy limit, and from injection 60 onward they fall within 5-10 % accuracy.
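(For anyone who wants to follow the bookkeeping: here is a minimal Python sketch of how the QCs are back-calculated against the preceding curve, assuming a simple unweighted linear fit. All concentrations and area ratios below are made-up illustrative numbers, not my actual data.)

```python
import numpy as np

# Hypothetical calibrators: nominal concentration vs. analyte/IS area ratio.
cal_conc  = np.array([1.0, 2.0, 5.0, 10.0])       # ng/ml (invented values)
cal_ratio = np.array([0.11, 0.21, 0.52, 1.02])    # X / X-D4 area ratio

# Unweighted linear fit of the curve run just before the QCs.
slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

# QCs measured after that curve (again, invented values).
qc_nominal = np.array([2.0, 5.0, 8.0])            # QC1..QC3 nominal, ng/ml
qc_ratio   = np.array([0.20, 0.54, 0.86])         # measured area ratios

# Back-calculate found concentration and express accuracy as % of nominal.
qc_found = (qc_ratio - intercept) / slope
accuracy = 100 * qc_found / qc_nominal
print(accuracy)   # values within 85-115 % pass a 15 % accuracy limit
```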
When the ratio of compound X to compound Y is used throughout the same analysis run (back-calculation is then not possible, because X and Y are at the same concentrations in the standard solutions and QCs), the overall RSDs are: QCs 2.3 %, calibration curves (1-10 ng/ml) 4.0 %, whereas in the setup above, with the deuterated internal standards, they run to more than 20 %.
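(Again purely illustrative: the sketch below shows why the two RSDs can differ so much. If X and Y lose signal together over the run while X-D4 does not track that drift, the X/X-D4 ratio varies a lot but the X/Y ratio stays flat. All peak areas are invented numbers.)

```python
import numpy as np

# Made-up peak areas over five injections.
area_X   = np.array([100.0, 92.0, 85.0, 78.0, 72.0])    # drifting analyte
area_XD4 = np.array([100.0, 99.0, 101.0, 98.0, 100.0])  # stable IS signal
area_Y   = np.array([200.0, 185.0, 171.0, 157.0, 145.0])  # drifts like X

def rsd(values):
    """Relative standard deviation in percent (sample std)."""
    return 100 * np.std(values, ddof=1) / np.mean(values)

print(rsd(area_X / area_XD4))  # large RSD: the IS does not track the drift
print(rsd(area_X / area_Y))    # small RSD: X and Y drift together
```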
So, depending on which internal standard is used, the corrected signal is either highly variable and decreasing, or very stable.