Purge & Trap GC-MS internal standard increase, new observations
Posted: Fri Jun 12, 2015 2:33 pm
				
The subject line isn't really long enough to fully explain the topic, but here it is.
There have been discussions here in the past about a problem we have all experienced at some point: the internal standard response increases along with the calibration standard levels. It is usually most prominent with 1,4-Dichlorobenzene-d4 and the later-eluting analytes. The problem is that as the concentration of the calibration standards in the curve increases, the internal standard response increases instead of remaining steady as it should. For a long time it has been postulated that this comes from the amount of methanol added to the standards, with more methanol giving a higher internal standard response. EPA and others have suggested combating the problem by using no more than 20 ul of methanol per 100 ml of water when preparing a calibration curve, which means you need multiple stock standards to cover the normal calibration range, and that in itself can introduce more points of variability into the calibration.
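(To put numbers on why that guidance forces multiple stocks: for the 0.05 to 40 ppb curve described below, the high standard needs 800 times the spike volume of the low one, so a single stock that delivers the low point with a measurable 1 ul spike needs 800 ul at the top, forty times the 20 ul guideline.)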
Over the years I have tried everything: using multiple stock standards, adding extra methanol so that every standard contains the same amount, running higher split ratios to see if water was the problem, and lowering the EM volts as one paper suggested. Some days it would seem to work, and the next day the problem would return while doing exactly the same thing.
Recently I have been working on an EPA 8260 SIM method to try to calibrate down to 0.05 ppb for a project we have. The calibration curve covered 0.05, 0.25, 0.5, 1.0, 2.5, 5.0, 10, 20, and 40 ppb with a 5 ml purge volume and internal standard/surrogate concentrations of 5 ppb. Through about 1.0 ppb the internal standards all gave consistent responses, but above that they increased rapidly, with the 1,4-Dichlorobenzene-d4 area counts in the 40 ppb standard nearly 4x those in the 0.05 ppb standard. I tried the multiple stock standard approach but got exactly the same results as with a single low-concentration stock. The single stock required 1 ul to 800 ul per 100 ml across the curve, while the multiple stocks needed only 1 ul per 100 ml for the low calibrator up to 60 ul for the high calibrator, and both gave similar results. That would seem to rule out methanol as the culprit, so I also ran blanks spiked with 800 ul of methanol per 100 ml; they gave the same internal standard responses as the methanol-free blanks run before the calibration standards, ruling out methanol altogether.
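For what it's worth, those spike volumes are consistent with roughly a 5 ug/ml working stock; that concentration is my back-calculation, not something stated above. A quick sketch of the arithmetic, assuming 100 ml of water per standard:

# Back-calculate spike volumes for the 0.05-40 ppb curve, assuming a single
# 5 ug/ml working stock (my assumption) and 100 ml of water per standard.
levels_ppb = [0.05, 0.25, 0.5, 1.0, 2.5, 5.0, 10, 20, 40]
stock_ug_per_ml = 5.0   # assumed working stock strength, back-calculated from the 1 ul low spike
water_volume_l = 0.1    # 100 ml of water per standard

for level in levels_ppb:
    mass_ug = level * water_volume_l      # ug/l * l = ug of each analyte needed
    spike_ml = mass_ug / stock_ug_per_ml  # ml of stock that delivers that mass
    print(f"{level:>5} ppb -> {spike_ml * 1000:.0f} ul of stock per 100 ml")

Run as written it reproduces the 1 ul low and 800 ul high spikes quoted above, and it shows that everything above the 1.0 ppb level already exceeds the 20 ul methanol guideline.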
This got me thinking: what is the only other major variable in the standards? The concentration of the analytes. In this calibration the total analyte concentration of the 0.05 ppb standard is 4.35 ppb and that of the 40 ppb standard is 3480 ppb (87 analytes x concentration level). Add 7 internal standard/surrogate compounds at a constant 5 ppb and the totals range from 39.35 ppb for the 0.05 standard to 3515 ppb for the 40 standard, almost a 9000% spread in total concentration over the curve. If you increase the internal standard concentration to 50 ppb, the totals range from 354.35 ppb for the 0.05 standard to 3830 ppb for the 40 standard, a spread of only about 1080% across the entire calibration range. With this change the increase in 1,4-Dichlorobenzene-d4 response across the calibration range was only about 30%, instead of the 200% seen before.
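For anyone who wants to check the totals, here is a minimal sketch of that arithmetic; the 87 analyte and 7 IS/surrogate counts and the concentrations are just the numbers quoted above:

n_analytes, n_is = 87, 7      # 87 target analytes, 7 IS/surrogate compounds
low_ppb, high_ppb = 0.05, 40  # ends of the calibration curve

for is_ppb in (5.0, 50.0):    # current IS level vs. the higher one tried
    low_total = n_analytes * low_ppb + n_is * is_ppb
    high_total = n_analytes * high_ppb + n_is * is_ppb
    print(f"IS at {is_ppb:g} ppb: {low_total:.2f} ppb total at the low end, "
          f"{high_total:.0f} ppb at the high end ({high_total / low_total:.1f}x spread)")

The 89x and 10.8x spreads it prints are the "almost 9000%" and "1080%" figures above.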
I am going to run this a few more times, see whether it holds up and isn't a one-time fluke, and try to write the findings up a little more formally. The question now becomes: what causes this effect? Since the analytes elute at different times, the mass spec does not see them all at once, so the total mass of analytes in the standard should not affect the response of an internal standard that is at a constant concentration and elutes separately from the other compounds. If it is not the MS, then is it the column, the inlet, the transfer line, the trap, the sparge tube, or a combination of all of the above?
Currently the fix would be to calibrate only from 0.05 ppb to 1.0 ppb, which would mean diluting most of the samples I run, since many analytes come in above 1.0 ppb while I still need the low detection limit for the others. The other fix is to use a very high internal standard concentration, which does not match up with the concentrations of the analytes quantified against it; that really isn't how a good method should work.
So, does anyone out there have any good theories as to what is actually going on with these purge and trap systems?
It was only a minor annoyance before, when the shift in response was maybe 20-30% over a calibration curve run at higher concentrations, but this ultra-trace work really amplifies the problem.
Back then I could calibrate from 1 ppb to 1000 ppb on a 5970 and everything but the ketones would be dead linear, with average response factor RSDs under 10% and internal standard areas varying less than 10%. I haven't seen that since we went to the split inlet and 5973s.