We've been control charting the signal-to-noise ratio (S/N) of a few of our LC-MS/MS analytes in clean calibration standards over the last 2 years (same acquisition method [MRM], electrospray ionization, same concentration level tracked). We use the MassLynx algorithm to measure S/N. The issue is that our charts show noticeably more "out of control" points when we plot the raw S/N values, but almost none when we plot the common log (log10) of the S/N. In both cases the control limits are set at 3 standard deviations about the mean.

Is there a chemical or physical explanation (something related to the LC-MS/MS) for why the log of the S/N would be less variable than the S/N itself? Or is this an expected mathematical consequence of changing to log?
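To illustrate the pattern we're seeing, here is a minimal simulation sketch. It assumes (purely for illustration; the distribution and its parameters are made up, not measured from our data) that S/N follows a right-skewed lognormal distribution, and counts points falling outside mean ± 3 SD Shewhart limits on the raw scale versus the log10 scale:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical S/N series: lognormal, i.e. right-skewed and strictly
# positive. The parameters below are arbitrary illustration values.
sn = rng.lognormal(mean=3.0, sigma=0.5, size=5000)

def out_of_control(x):
    """Count points outside mean +/- 3*SD Shewhart control limits."""
    m, s = x.mean(), x.std(ddof=1)
    return int(((x < m - 3 * s) | (x > m + 3 * s)).sum())

raw_flags = out_of_control(sn)            # limits computed on raw S/N
log_flags = out_of_control(np.log10(sn))  # limits computed on log10(S/N)

print(raw_flags, log_flags)
```

With skewed data like this, the raw-scale chart flags many points in the long upper tail, while the log-transformed data is close to normal and flags only the ~0.3% expected by chance, which matches what we observe on our charts.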

Thank you in advance for any help.