I've come across the forum many times in searches for help, and finally joined up to ask about a confounding problem.
My lab has had a new Agilent 5977E MSD with a 7820 GC for about 6 months.
We mostly use it for PAH and OPAH analysis at low-ppb levels (lowest standard = 2 ng/mL); peak areas for that standard are ~2000 with the RTE integrator.
The sensitivity of the system has always been good, but lately I have noticed a strange phenomenon: peak areas drop over time for the same standard/sample.
This has probably been the case since the install, but only recently have we really been testing the %RSD over time.
The brief specs of our method are:
60 m HP-5ms UI column
inlet 300 °C
injection pulsed splitless, 25 psi
2 µL injection (verified OK by liner volume calc; see the sketch after this list)
solvent DCM
source 300 °C
transfer line 280 °C
oven 60-300 °C over 65 minutes
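
For reference, the liner volume check is just an ideal-gas expansion calculation. Here's a rough Python sketch; the ~900 µL liner volume is an assumption for a typical 4 mm single-taper splitless liner (not a measured value), and it treats the 25 psi pulse as gauge pressure:

# Back-of-envelope check that the 2 uL DCM injection fits the liner.
# Assumptions: ~900 uL liner (typical 4 mm single-taper splitless liner),
# ideal-gas behavior, 25 psi pulse pressure is gauge.
R = 8.314           # J/(mol*K)
PSI_TO_PA = 6894.76

def vapor_volume_ul(inj_ul, density_g_ml, mw_g_mol, inlet_c, pulse_psig):
    """Ideal-gas vapor volume (uL) of the injected solvent at inlet conditions."""
    moles = inj_ul * 1e-3 * density_g_ml / mw_g_mol      # uL -> mL -> g -> mol
    p_pa = (pulse_psig + 14.696) * PSI_TO_PA             # gauge -> absolute Pa
    return moles * R * (inlet_c + 273.15) / p_pa * 1e9   # m^3 -> uL

# DCM: MW 84.93 g/mol, density ~1.33 g/mL; inlet 300 C, 25 psi pulse
print(f"{vapor_volume_ul(2.0, 1.33, 84.93, 300.0, 25.0):.0f} uL vs ~900 uL liner")
# -> roughly 545 uL, so the vapor cloud should stay inside the liner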
We just had a tech out to check the system: clean the source (I had just done this as well), verify the autotune, run checkout standards, etc.
The system can get good %RSD (1-3%) over 24 hrs, with no drift, using stable test compounds like azobenzene and octadecane.
But when we run a labile compound like malathion, or our 16-PAH mix, we get a larger %RSD of 5-10%, with a downward drift of about -5% on a rolling average.
I haven't run enough injections to see how far the drift will go (down to zero?), but so far a drop of about 20-30% over 24 hours is typical.
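
In case it helps to see exactly how we're computing these numbers, here's a minimal Python sketch of the %RSD and rolling-average drift calculation; the "areas" list is made-up placeholder data, not our actual results:

# %RSD and rolling-average drift across a replicate sequence.
# "areas" is hypothetical placeholder data for repeated 2 ng/mL injections.
import statistics

areas = [2050, 2010, 1980, 1990, 1940, 1930, 1900, 1880, 1850, 1820]

rsd = statistics.stdev(areas) / statistics.mean(areas) * 100
print(f"%RSD over sequence: {rsd:.1f}%")

# 3-injection rolling average, then total drift from first to last window
window = 3
rolling = [sum(areas[i:i + window]) / window
           for i in range(len(areas) - window + 1)]
drift = (rolling[-1] - rolling[0]) / rolling[0] * 100
print(f"rolling-average drift: {drift:.1f}%")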
What are the likely reasons for this type of peak area decrease?
The Agilent engineers are out of ideas for now...
Thanks in advance
