
baseline changes due to attenuation change

Posted: Wed Jan 04, 2012 9:26 am
by soak
Hi peeps,
Hopefully someone will be able to shed a bit of light on a problem that has recently started on one of our machines. It is a Perkin Elmer Autosystem with a packed column, split to an ECD and an FID with methaniser. Occasionally (and completely randomly) we are getting a drop in the baseline when the attenuation changes during the run, whereas normally we get a slight increase in baseline. The problem this causes is that the peak of interest (after the attenuation change) disappears, as it is now below the baseline (the baseline drops to 0 mV).
I've never seen this problem before and it has only just started happening, but nothing else has changed in the setup of the instrument (unless someone has messed with it behind my back!).
Thanks in advance
S

Re: baseline changes due to attenuation change

Posted: Wed Jan 04, 2012 1:29 pm
by GasMan
I would say that this is normal for any system where the baseline signal is not at zero: a change in attenuation will give a change in the displayed baseline. I do not know the Perkin Elmer system, but is it possible that there is a command that allows you to autozero at the start of the run, and somebody has turned it off?
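To illustrate the point numerically (this is a generic sketch, not Perkin Elmer behaviour; the 8 mV standing offset and power-of-two attenuation steps are assumptions), a non-zero detector offset is divided down along with everything else when attenuation changes, so the chart baseline steps. An autozero that subtracts the offset before scaling keeps the baseline at zero at any attenuation:

```python
def displayed(signal_mv, attenuation):
    """Displayed trace = raw detector signal divided by 2**attenuation
    (hypothetical attenuation scheme for illustration)."""
    return signal_mv / (2 ** attenuation)

baseline_offset = 8.0  # mV of standing detector signal (assumed value)

# Without autozero: the offset itself is attenuated, so the
# baseline visibly steps when attenuation changes mid-run.
before = displayed(baseline_offset, 1)  # 4.0 mV on the chart
after = displayed(baseline_offset, 3)   # 1.0 mV -> baseline drops

# With an autozero at the start of the run, the offset is subtracted
# before scaling, so the baseline sits at 0 mV at any attenuation.
zeroed = displayed(baseline_offset - baseline_offset, 3)  # 0.0 mV

print(before, after, zeroed)
```

Whether the drop takes the trace above or below a peak of interest then just depends on the sign and size of the standing offset at the moment the attenuation steps.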

Gasman