by
lmh » Tue Feb 24, 2015 1:55 pm
Electron multipliers become less sensitive with use. Their gain depends on the voltage applied to them, so to compensate for the lost sensitivity, the instrument raises the voltage (typically at each tune) to maintain the same gain. The optimum voltage isn't a constant: it is measured during tuning, and it gradually increases as the electron multiplier ages.
The reason there is an optimum is that raising the voltage further increases electronic noise and increases the risk of the electron multiplier failing completely (at some voltage, it will fail).
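The relationship above can be sketched numerically. Electron-multiplier gain is often approximated as a steep power law in applied voltage, G ≈ k·V^n; as the multiplier ages, k falls, so the voltage needed to hit the instrument's fixed target gain creeps up until it hits the allowed maximum. All constants below (target gain, exponent, decay rate, maximum voltage) are made-up illustrations, not values from any particular instrument:

```python
# Illustrative model only: gain = k * V**n, with k decaying as the
# multiplier ages. Every constant here is invented for illustration.

def required_voltage(target_gain, k, n=8.0):
    """Voltage needed to reach target_gain when gain = k * V**n."""
    return (target_gain / k) ** (1.0 / n)

TARGET_GAIN = 1e5   # instrument's fixed target gain (illustrative)
V_MAX = 3000.0      # maximum allowed multiplier voltage (illustrative)
K_NEW = 4e-21       # gain coefficient of a fresh multiplier (illustrative)

# Pretend k loses 5% per month of use (made-up ageing rate).
for month in range(0, 121, 24):
    k = K_NEW * 0.95 ** month
    v = required_voltage(TARGET_GAIN, k)
    status = "REPLACE" if v >= V_MAX else "ok"
    print(f"month {month:3d}: tune voltage ~{v:4.0f} V  [{status}]")
```

The point of the sketch is only the shape of the curve: because the gain depends so steeply on voltage, a large loss of sensitivity is absorbed by a fairly modest voltage increase, until the ceiling is reached.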
Different instruments and different manufacturers will have different target gains, so don't be surprised if apparently the same electron multiplier works at different voltages in different instruments. This also affects its average lifespan in the instrument concerned.
So your question: when to change them? It depends on how tolerant you are of failures. You should certainly change them when they reach the voltage the instrument says is the maximum. You may find that methods get unacceptably noisy as the voltage climbs; if so, note the voltage at which this happens, and change them when (or just before) they reach it. If you find that they're failing outright, consider a regime where you change them 100 V before the voltage they had reached when a previous one failed suddenly.
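Those three replacement criteria can be written down as a simple decision helper. This is just a sketch of the heuristics in the paragraph above; the function name and the example voltages are hypothetical, and the thresholds would come from your own records, not from any vendor:

```python
# Hypothetical helper encoding the replacement heuristics discussed
# above. All voltages in the example call are invented.

def should_replace(tune_voltage, v_max, v_noisy=None, v_failed_before=None):
    """Return a reason string if the multiplier is due for replacement,
    or None if it can stay in service.

    tune_voltage     -- voltage found by the most recent tune (V)
    v_max            -- instrument's maximum allowed voltage (V)
    v_noisy          -- voltage at which methods previously got too noisy
    v_failed_before  -- voltage a previous multiplier reached before
                        sudden failure (replace 100 V before this)
    """
    if tune_voltage >= v_max:
        return "at instrument maximum"
    if v_noisy is not None and tune_voltage >= v_noisy:
        return "reached the voltage where methods got unacceptably noisy"
    if v_failed_before is not None and tune_voltage >= v_failed_before - 100:
        return "within 100 V of a previous sudden-failure voltage"
    return None

# Example: a tune at 2400 V, with a past multiplier that died at 2450 V.
print(should_replace(2400, v_max=3000, v_failed_before=2450))
```

Logging the tune voltage after each autotune is what makes the last two criteria usable: without a history, you have nothing to set the noisy-voltage or failure-voltage thresholds from.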
Unless your electron multiplier is very old, and probably maxed out on voltage, changing it is an expensive way to chase sensitivity; for an average electron multiplier half-way through its life, replacement might improve sensitivity slightly, or not at all.