Problems with LCMS Stability in MRM Mode

I have been having a lot of trouble with the stability of my mass spectrometer (triple quad, analyzing in MRM mode).


The instrument originally ran very well in both external and internal standard modes, but recently the results have been totally unacceptable in both.


It seems to take around 5 hours for the response to stabilize. I ran Xanax/Xanax-d5 as a test. After 5 hours, the external and internal standard deviation values equalled each other; during the first 5 hours, the instrument gave much poorer external standard data.
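For anyone who wants to run the same comparison, here is a minimal Python sketch of the calculation (the peak areas below are made-up placeholders, not my data):

[code]
# Minimal %RSD comparison: external standard (raw analyte areas) vs.
# internal standard (analyte/ISTD area ratios). Peak areas are placeholders.
import statistics

analyte_areas = [10250, 9980, 10410, 10120, 10300, 10050]  # analyte peak areas
istd_areas    = [5120, 4995, 5200, 5065, 5150, 5030]       # labelled ISTD areas

def pct_rsd(values):
    """Percent relative standard deviation of a series."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

ratios = [a / i for a, i in zip(analyte_areas, istd_areas)]

print(f"External standard %RSD (raw areas):  {pct_rsd(analyte_areas):.2f}")
print(f"Internal standard %RSD (area ratio): {pct_rsd(ratios):.2f}")
[/code]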


I think it has something to do with temperature, since the instrument performs much better with all of the panels off. The cooling fan for the RF/DC electronics was inoperative for about a month; might that have something to do with the problem? Before the fan failed, we seemed to be able to do external calibration analyses.


I was wondering if anyone has a performance test using unlabelled internal standards, along with the associated statistical data; I would like to compare the performance of my instrument to someone else's. The manufacturer tends to think my instrument is working properly.


I find it very unusual that the instrument response takes 5 hours to equilibrate. For a well-behaved analysis, I would expect the response to be stable after maybe 4-5 injections.


Thanks for your help.
Sailor

What ionization mode are you using? I've seen a dirty APCI vaporizer tube cause this exact problem (long time to stabilize) on an API-3000.

You didn't say what brand of triple-quad you have. For the API-3000, I usually consider 2% to 8% RSD (external standard) to be okay. Around or above 8% usually means something in the ion source is dirty, or something is wrong with the method or autosampler. The API-4000 is normally a little better, perhaps 1% to 6% when things are okay.

I had already cleaned everything (electrospray ionization). After about 5 hours, you get a standard deviation of about 1% by either external or internal standard. However, during the first 5 hours, sensitivity usually changes 10-15%.

With the labelled standard, the drift tends to cancel out, but not with external standardization. I have also tried 5 different unlabelled internal standards; some cancel the drift out better than others, but I still see 5-6% error.
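To make the cancellation point concrete, here is a toy simulation (invented numbers, assuming the analyte and the ISTD see exactly the same sensitivity drift):

[code]
# Toy model of drift cancellation: the analyte and the labelled ISTD both
# scale with the same sensitivity factor, so their ratio stays flat even
# while the raw areas drift ~15%. All numbers are invented for illustration.
import statistics

true_analyte, true_istd = 10000.0, 5000.0
n = 20
drift = [0.85 + 0.15 * k / (n - 1) for k in range(n)]  # 85% -> 100% sensitivity

analyte = [true_analyte * d for d in drift]
istd    = [true_istd * d for d in drift]
ratios  = [a / i for a, i in zip(analyte, istd)]

def pct_rsd(values):
    return 100 * statistics.stdev(values) / statistics.mean(values)

print(f"Raw analyte %RSD: {pct_rsd(analyte):.1f}")  # large: tracks the drift
print(f"Area-ratio %RSD:  {pct_rsd(ratios):.1f}")   # ~0: the drift cancels
[/code]

An unlabelled internal standard only cancels the drift to the extent that it drifts the same way as the analyte, which would explain why some of the 5 I tried worked better than others.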

With the covers open and a fan blowing on the electronics, the response levels out in about 20-30 minutes and then holds at about 1% standard deviation.

I was hoping someone had a test with statistics over an 8-16 hour period that I could compare against.
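In case it helps, here is roughly the kind of test I have in mind: inject a QC standard at a fixed interval overnight, then compute %RSD in a sliding window to see when the response settles (the areas below are simulated, not real data):

[code]
# Sliding-window %RSD over a long injection sequence, to estimate when the
# response equilibrates. Areas are simulated: a linear warm-up, then stable.
import statistics

times_h = [0.5 * k for k in range(25)]  # one injection every 30 min for 12 h
areas = [10000 * min(1.0, 0.85 + 0.03 * t) for t in times_h]

window = 5  # injections per window
for start in range(len(areas) - window + 1):
    chunk = areas[start:start + window]
    rsd = 100 * statistics.stdev(chunk) / statistics.mean(chunk)
    print(f"t = {times_h[start]:4.1f}-{times_h[start + window - 1]:4.1f} h: "
          f"%RSD = {rsd:.2f}")
[/code]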

Thanks for your thoughts.
Sailor