So we've got a normal-phase method that we run with a gradient (MP1: hexane; MP2: hexane/IPA/EtOH), which we use to detect free fatty acid content in various oils, and I was wondering whether there are any issues with a non-proportional decrease in sensitivity when the LED intensity is reduced.
We've basically bottomed out with the PMT gain set at 1 and an injection volume of 1 µL, so the next step, I assumed, would be the light intensity, which is currently at 100%.
So yes, I want to know whether reducing it (anywhere across the whole range, 1%-99%) would result in a proportional decrease in signal, or a disproportionate one. Mind you, all the FFAs come out in one peak, if that matters.
I don't think it would be disproportionate, based on the usual spectrophotometry relationships (I/I0, absorbance and transmittance), but a colleague has said that it would decrease unpredictably.
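Just to show what I mean by the I/I0 reasoning, here's a quick back-of-the-envelope sketch in Python. It assumes ideal Beer-Lambert behaviour and a detector response proportional to the light reaching the PMT; the absorbance value and intensity percentages are made up purely for illustration, not measured from our method.

[code]
# Rough sketch of the I/I0 reasoning, assuming ideal Beer-Lambert
# behaviour and a detector response proportional to the light that
# reaches the PMT. Numbers are hypothetical.
import math

absorbance = 0.05                 # hypothetical peak absorbance of the FFA peak
for pct in (100, 50, 10, 1):      # LED intensity as % of full output
    i0 = pct / 100.0              # incident intensity, normalised to 1 at 100%
    i = i0 * 10 ** (-absorbance)  # transmitted intensity (Beer-Lambert)
    absorbed = i0 - i             # light removed by the sample
    a = math.log10(i0 / i)        # recovered absorbance, independent of I0
    print(f"LED {pct:>3}%:  I0={i0:.3f}  I={i:.4f}  "
          f"absorbed={absorbed:.5f}  A={a:.3f}")
[/code]

In this simple picture the absorbed light (and hence the raw signal) scales in direct proportion to the source intensity while the calculated absorbance stays constant, which is why I'd expect a proportional decrease rather than an unpredictable one. Of course this ignores noise behaviour at the PMT, which is the part I'm unsure about.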
Thanks a lot.
