As far as I can see, these are constant-flow methods with the gas saver on. During the temperature gradient the inlet pressure rises to keep the flow constant, and there comes a point where it can't rise any further; the flow falls, the target and actual pressures diverge, and the error occurs.
Our system is probably short on gas-pressure generally (see below!).
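For anyone unfamiliar with why the pressure has to climb, here is a very rough back-of-the-envelope sketch (not our actual method: the column dimensions, helium carrier, target flow and the viscosity fit are all just illustrative numbers) using the compressible Hagen-Poiseuille relation for flow through an open tube. As the oven heats up, the carrier gas viscosity rises, so the inlet pressure needed to hold a fixed flow rises with it, and eventually it can bump into whatever the supply can deliver.

```python
import math

# Illustrative column and carrier parameters only (not our actual method):
L = 30.0                   # column length, m
d = 0.25e-3                # column internal diameter, m
F_target = 1.5e-6 / 60.0   # target outlet flow, 1.5 mL/min expressed in m^3/s
p_out = 101325.0           # outlet (atmospheric) pressure, Pa

def helium_viscosity(temp_c):
    """Rough power-law fit for helium viscosity vs temperature (Pa*s)."""
    return 1.99e-5 * ((temp_c + 273.15) / 298.15) ** 0.7

def inlet_pressure(temp_c):
    """Absolute inlet pressure needed to hold the target flow, from the
    compressible Hagen-Poiseuille relation for an open tube."""
    eta = helium_viscosity(temp_c)
    rhs = 256.0 * eta * L * F_target * p_out / (math.pi * d ** 4)
    return math.sqrt(p_out ** 2 + rhs)

for t in (40, 100, 200, 300):
    p_gauge_bar = (inlet_pressure(t) - p_out) / 1e5
    print(f"{t:3d} C  ->  inlet ~{p_gauge_bar:.2f} bar gauge")
```

The exact numbers don't matter; the point is that the required head pressure only ever goes up during the ramp, so if the supply (or whatever the gas saver leaves available) tops out, the flow controller loses the set point.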
I've got a couple of questions:
(1) If we switch off the gas saver, the problem doesn't happen, and the pressure rises much higher than it does with the gas saver on. I don't understand why reducing the gas flow (by using the gas saver) should make it harder for the instrument to maintain the necessary pressure at the start of the column. I'd be very grateful if anyone can point me to an explanation.
(2) Our system is on a gas line with a 4 bar maximum pressure; Agilent have said we should have 5-8 bar. Clearly the system has been working for years at the substandard pressure. What supply pressures do people generally use (or get away with) in their labs?
Obviously something has changed in our system so that the methods no longer work. Perhaps the filter on the gas line is getting old? I'm wondering whether these methods were living on a knife-edge and have finally fallen over. Any advice gratefully received.
