by Ron » Fri Apr 23, 2010 1:11 pm
There is no simple answer to your question. The first thing to be aware of is that sensitivity in GC-MS is not just peak height, it is signal-to-noise ratio. If you have one system with a signal of 1000 counts and an average noise level of 10 counts, and a second system with a signal of 10,000 counts for the same sample but an average noise level of 500 counts, the first instrument is more sensitive (a signal-to-noise ratio of 100 versus 20) even though its signal is a factor of 10 lower.
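A minimal sketch of that arithmetic, using only the hypothetical counts from the example above (they are illustrative values, not measurements from any real instrument):

[code]
def snr(signal_counts, noise_counts):
    """Signal-to-noise ratio: mean signal divided by mean noise."""
    return signal_counts / noise_counts

system_a = snr(1_000, 10)    # 100 -- lower raw signal, quiet baseline
system_b = snr(10_000, 500)  # 20  -- 10x the signal, but much noisier

print(f"System A S/N: {system_a:.0f}")
print(f"System B S/N: {system_b:.0f}")
# System A is the more sensitive instrument despite the smaller signal.
[/code]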
The major factor in the increased sensitivity of SIM mode on modern MS systems is decreased noise, not increased signal. Modern electronics are much better than those of 10 or 15 years ago; there is not much increase in signal, and the main gain comes from the reduction in the noise level. The most common detector is an electron multiplier, which supplies an instantaneous measurement, so any time averaging or accumulation must be done in the firmware. A multiplier may be applied to the signal to change the scaling, but as a rule the signal will not be larger in SIM mode than in scan mode; the background noise will be lower, leading to an increased signal-to-noise ratio.
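A hedged illustration of that point, with made-up noise figures: if the signal is essentially unchanged between scan and SIM, the apparent sensitivity gain is simply the ratio of the two noise levels.

[code]
# Illustration only: the same signal in scan and SIM, with the gain
# coming entirely from a lower baseline noise. All values are assumptions.
signal = 5_000        # counts, unchanged between scan and SIM
noise_scan = 250      # counts, hypothetical scan-mode baseline noise
noise_sim = 25        # counts, hypothetical SIM-mode baseline noise

snr_scan = signal / noise_scan   # 20
snr_sim = signal / noise_sim     # 200

print(f"Scan S/N: {snr_scan:.0f}, SIM S/N: {snr_sim:.0f}")
print(f"Apparent sensitivity gain: {snr_sim / snr_scan:.0f}x")
# With the signal held constant, a 10x drop in noise gives a 10x S/N gain.
[/code]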
Older instruments with less advanced, higher-noise electronics may show a higher signal level in SIM, but the noise is also significantly reduced, and that reduction in noise is still a major factor in the increased signal-to-noise ratio.
The sensitivity increase from running SIM mode on a newer mass spectrometer will probably be less dramatic than on an older instrument. Instead of a 100-fold increase in sensitivity, you are more likely to see a 5- to 10-fold increase.