My problem is that despite autotuning just fine and achieving the expected relative (isotope) and absolute abundances (m/z 69:219:502 at roughly 400k:400k:40k counts, isotope ratios normal), when a sample is run the high-mass sensitivity is at least an order of magnitude lower than on a typical 5973. The tech who did the PM says it should be an order of magnitude higher.
My first impression of the poor sensitivity was that the chromatography looked like MS/MS data. I knew something was wrong, but I wasn't on site to investigate. Once on site, the only cause I could think of was the column sticking too far into the MS, but that's apparently not the case. The usual head-slap cause would be an air leak, which may have been an issue previously, but isn't at present. The filaments are new. The instrument was cleaned and reassembled recently (I watched, and everything looked fine, except for an extra washer on the repeller, which was new to me), and the current behavior is consistent with what it has been for some time. The symptoms (poor sensitivity, especially at high masses) have appeared in CI, EI, split/splitless, and PTV modes. And we run this thing primarily in SIM.
I've played with the PTV inlet program and gained a little sensitivity, but it's still, across the board (all masses), an order of magnitude lower than on the 5973. It seems to be an MS issue, although some symptoms (mainly the good tune) point toward the GC.
Has anyone seen anything like this? Is there some adjustment on the triple-axis detector that needs to be addressed? I suspect the repeller, or maybe poor lens alignment, but I can't find anything that points to... anything. Help?
