I've been doing GCMS - largely wastewater analysis for VOC content - for the better part of a decade, but this is the first time I have been involved in this part of the job. Up to this point it's all been routine, with some maintenance (GC and MS side) and troubleshooting from time to time.
My understanding, from reading up on my target analytes, is that the plasticizers and some PAHs are surprisingly sensitive to heat and light. So, to optimize my recoveries I'll want to avoid any excess of those things in the process. Also, avoid contact at any step of extraction with anything that might contain the target analytes (which, for the plasticizers, might be more an exercise in Whack-a-mole than for the rest). Run checks on new lots of solvents to make sure there's no contamination, and in general document everything, because of course. On to the GCMS side.
The general process of optimizing the GCMS hardware is one of making sure that the column is properly selected (length, stationary phase, internal diameter, and film thickness), the injection port is optimized (liner composition, liner geometry, septum composition), and the MS hardware is up to spec (source cleaned, electron multiplier in good working order, etc). Maintaining all of that is a big day-to-day thing, but so is finding the ideal starting point for method development. That part, at least, is fairly straightforward.
On the software (I'm using that term rather loosely) side, we have things like the GC temperature program and MS parameters. That's where things get a bit tricky for me, as I've never been involved in temperature programming at all.
Right now the method operates on a total run time of about 38 min with a 45 C starting point, a fast ramp to 160 C, a slow ramp to 300 C, and a 10 minute hold at the end. That 10 minute hold SEEMS excessive, since the last peak elutes at 28 min and my baseline creeps up past 27 min or so. Also, starting things at 45 C SEEMS low, since the lowest-boiling solvent (EtOAc) boils at ~77 C. The first peak of interest comes out at 5.5ish minutes, which is 1.5 min past the solvent delay.
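To make sure I understand where the time goes, here's a quick back-of-the-envelope sketch. The ramp rates are hypothetical numbers I picked so that the segments add up to roughly the 38 min total; they're not the actual method values:

```python
# Rough timing sketch of the current oven program. Ramp rates are
# ASSUMED -- chosen only so the segments sum to roughly the stated
# ~38 min total run time.
segments = [
    ("initial hold at 45 C",    2.0),                 # min (assumed)
    ("fast ramp 45 -> 160 C",   (160 - 45) / 25.0),   # 25 C/min (assumed)
    ("slow ramp 160 -> 300 C",  (300 - 160) / 6.5),   # 6.5 C/min (assumed)
    ("final hold at 300 C",     10.0),                # from the method
]

total = 0.0
for name, minutes in segments:
    total += minutes
    print(f"{name:26s} {minutes:5.1f} min  (cumulative {total:5.1f} min)")
```

With those assumed rates the oven only reaches 300 C at ~28 min, i.e. right around when the last peak elutes, which is what makes the 10 min hold at 300 C look like pure bake-out time.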
Is that thought process where I ought to be?
Operating on the notion that it is, I adjusted the temperature program so that it started at 65 C and shortened the hold at the end to 2.5 min. After I did, the average responses for basically everything dropped by a factor of ~2 (from ~450K to ~225K for the largest non-target peak, but consistent throughout).
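To convince myself the drop was uniform rather than driven by one compound, I ratio'd old vs. new responses. All the values below are hypothetical placeholders except the ~450K to ~225K pair I quoted above:

```python
# Quick ratio check on the response drop. Values are hypothetical
# placeholders except the ~450K -> ~225K pair quoted above.
responses = {
    #  peak                 (old,     new)
    "largest non-target": (450_000, 225_000),
    "peak A":             (120_000,  61_000),   # placeholder
    "peak B":             ( 80_000,  38_000),   # placeholder
}

for name, (old, new) in responses.items():
    print(f"{name:20s} old/new = {old / new:.2f}")
# A roughly constant ratio across everything would point to a global
# cause rather than something peak-specific.
```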
Ought I to reconsider the program changes, based solely on that response change? The change moved one of the leading non-target peaks into the solvent delay period, the column bleed at the end was largely eliminated, and it significantly shortened the run time. But a short(er) run time isn't beneficial if my sensitivity is insufficient to actually do what I need it to do. So, is a higher response categorically better? And why did making those changes reduce the response to the degree that it did?
Also, what else should I be keeping in mind to make this method work as well as it can? Since we're not up and running with "for realsies" samples right now, I want to get it to as good a place as possible, which is why I'm focused on optimizing things.
This is my first time going through method development. So, am I asking the right questions? Is my head in the right place, more or less?