by mdevay » Wed Jun 08, 2016 7:44 pm
This was a recurring issue in our 524.2 analysis. We calibrated 74 compounds (including internal standards and surrogates) from 0.5 ppb to 100 ppb (we're lazy and hate dilutions for xylenes). The problem was far more pronounced on our more sensitive 5975Cs and 5977s than on our 5973s.
Cleaning the source and performing inlet maintenance improved performance for a short while, but within about six weeks the higher standards would start to creep up to the point that the 10 ppb and 20 ppb CCV standards we ran would fail surrogate recovery. This was coupled with a gradual decrease in our IS/SS recoveries and a corresponding increase in recovery (though a nearly stable response) for most of our lighter analytes, along with a stable recovery but lowered response for our ring-based analytes, particularly the aromatics. Calibrations would go from extremely linear to grossly quadratic for our heavier analytes, and responses for these became heavily dependent on the total concentration of heavier compounds in the sample. As a result, the recoveries and responses for anything aromatic in our samples could be wildly different from what they should have been.
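To illustrate the linearity symptom, here is a minimal sketch of how you might compare a linear vs. a quadratic calibration fit and see the quadratic term take over. The concentration/response numbers are hypothetical (a curve that droops at the high end, adsorption-style), not our actual 524.2 data.

```python
# Sketch: comparing linear vs. quadratic calibration fits.
# The response data below is hypothetical, not real instrument output.
import numpy as np

def fit_quality(conc, resp, degree):
    """Fit a polynomial calibration of the given degree and return its R^2."""
    coeffs = np.polyfit(conc, resp, degree)
    pred = np.polyval(coeffs, conc)
    ss_res = np.sum((resp - pred) ** 2)
    ss_tot = np.sum((resp - np.mean(resp)) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical calibration levels (ppb) with a response that rolls off
# at the high end, as you'd expect from preferential adsorption:
conc = np.array([0.5, 1, 2, 5, 10, 20, 50, 100])
resp = 1000 * conc - 2.5 * conc**2

r2_linear = fit_quality(conc, resp, 1)
r2_quad = fit_quality(conc, resp, 2)
print(f"linear R^2 = {r2_linear:.4f}, quadratic R^2 = {r2_quad:.4f}")
```

With data like this, the quadratic fit is near-perfect while the linear fit degrades, which is exactly why a drooping high end pushes the software toward a quadratic curve type.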
We ended up concluding that preferential adsorption was happening throughout the system for our aromatics. Lowering the split ratio and increasing the amount of internal standard mitigated the problem significantly; however, this lets more water into the system, which hurts the repeatability of our more polar compounds (e.g., ketones, THF). We ended up biting the bullet and dropping our calibration high point from 100 ppb to 20 ppb, which improved linearity and slowed the problem down somewhat, but didn't entirely solve it. Using inert Silcosteel unions for our transfer line also helped a bit, but the problem seemed to be systemic, and the only way to get rid of it entirely was a complete maintenance cycle every six weeks or so (source cleaning, liner, gold seal, column trim). Another interesting observation was that our gold seals would develop a large amount of black deposits.
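For readers less familiar with why the internal-standard amount matters here: quantitation in methods like 524.2 is based on relative response factors, so analyte concentrations are computed from the analyte/IS area ratio. A minimal sketch, with hypothetical areas and amounts (not our data):

```python
# Sketch of internal-standard (relative response factor) quantitation.
# All areas and concentrations below are hypothetical illustrations.

def rrf(area_analyte, conc_analyte, area_is, conc_is):
    """Relative response factor from a single calibration standard."""
    return (area_analyte * conc_is) / (area_is * conc_analyte)

def quantitate(area_analyte, area_is, conc_is, mean_rrf):
    """Back-calculate an analyte concentration in a sample from the IS ratio."""
    return (area_analyte * conc_is) / (area_is * mean_rrf)

# Hypothetical 10 ppb standard: analyte area 50000, IS area 40000 at 10 ppb IS.
f = rrf(50000, 10.0, 40000, 10.0)
# A sample showing the same area ratio back-calculates to 10 ppb.
print(quantitate(50000, 40000, 10.0, f))  # → 10.0
```

Because the result depends on the area ratio, anything that suppresses the analyte and the IS proportionally cancels out, which is part of why adding more internal standard helped mask the adsorption losses.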
Last I heard, they were considering experimenting with changing the inlet and transfer line temperatures, but I have since left state employment for better opportunities and couldn't tell you if it worked. If the inlet temperature was too high and was gradually degrading the more fragile compounds after purging, sooty deposits could definitely cause adsorption at the inlet. Our sources never got so dirty that a semivolatiles analyst would think twice about them, but the same kind of issue may have been happening at the back end of the column and the opening to the source itself, since the MS interface temperature is very high as well.