by lmh » Tue Aug 29, 2017 11:33 am
The debate really comes down to whether the result of a SIM event is reported in "total ions counted" or "ions per second".
If you make a series of ten 1 msec measurements and average them, you're reporting "ions per msec", but in total you counted exactly the same number of ions as you would have had you simply carried out a single 10 msec event. The effect of SIM dwell time on precision is exactly the same in both cases.
Under the hood, Agilent might make a series of measurements and average them, or count all the ions over x msec dwell-time and divide the result by x; it would be impossible to tell which approach they'd used, because they're mathematically identical. Only an Agilent engineer knows the truth!
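The equivalence is easy to convince yourself of with a toy simulation. The sketch below (my own illustration, not anything Agilent-specific; the arrival rate is an arbitrary assumed value) draws Poisson-distributed ion counts for ten 1 msec sub-measurements, then computes "ions per msec" both ways: as the average of the ten sub-measurements, and as one 10 msec total divided by the dwell time. The two results are identical by construction, which is the point: you cannot tell from the output which approach the instrument used.

```python
import math
import random

random.seed(42)

DWELL_MS = 10   # total dwell time of the SIM event, in msec
RATE = 50.0     # assumed mean ion arrival rate, ions per msec (hypothetical)

def poisson(lam):
    """Sample a Poisson-distributed count via Knuth's method (fine for modest lam)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

# Approach A: ten 1-msec measurements, averaged -> "ions per msec"
sub_counts = [poisson(RATE) for _ in range(DWELL_MS)]
ions_per_msec_a = sum(sub_counts) / DWELL_MS

# Approach B: the same ions counted as a single 10-msec event,
# then divided by the dwell time -> also "ions per msec"
total_ions = sum(sub_counts)  # identical ions, just not partitioned
ions_per_msec_b = total_ions / DWELL_MS

# Mathematically identical: mean of n sub-counts == (sum of n sub-counts) / n
assert ions_per_msec_a == ions_per_msec_b
print(total_ions, ions_per_msec_a)
```

The precision argument falls out of the same picture: for counting statistics the relative uncertainty is roughly 1/sqrt(total ions counted), and the total is fixed by the overall dwell time regardless of how it is sliced up or how the result is reported.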
It doesn't really matter whether a system reports total ions or ions per msec, but I have a slight preference for the latter, because it makes comparison of different but related species slightly more intuitive, and the absolute calibration curves are a little more independent of dwell time. On the other hand, someone else will quite reasonably prefer to see total ions per event, because it reveals better whether the signal is adequate, or whether the dwell times need a bit of shuffling to get better-quality measurements.