
Posted: Wed Aug 08, 2007 1:10 pm
by Uwe Neue
Until now, I have not participated in this discussion, because this is not my area of expertise. But now I am starting to wonder...

The noise of a detector is affected by multiple factors. One is the path length. If you increase the path length by a factor of 2.5, wouldn't you expect that, in low-light conditions, the noise also increases by roughly the same factor?

When we compare the noise of different detectors, the sampling rate plays a giant role. The standard sampling rate on an HPLC detector may be around 1 data point per second; on the UPLC, you can go up to 80 data points per second. It appears that you are measuring on the UPLC at a rate of 20 data points per second, versus a rather standard HPLC rate of 1 data point per second. This would result in a roughly 5-fold difference in noise.

If both of these things are true, I get an order of magnitude difference in noise...
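For what it's worth, the two effects above can be combined in a quick back-of-envelope sketch (purely illustrative; the two scaling laws are assumptions on my part, not measurements):

```python
import math

# Assumptions (not measured): noise scales roughly linearly with path
# length, and roughly with the square root of the sampling rate.
path_ratio = 25 / 10    # 25 mm UPLC cell vs 10 mm HPLC cell
rate_ratio = 20 / 1     # 20 Hz UPLC sampling vs 1 Hz HPLC sampling

noise_factor = path_ratio * math.sqrt(rate_ratio)
print(f"expected noise ratio: {noise_factor:.1f}x")   # roughly 11x
```

If both assumptions hold, that lands close to the order-of-magnitude difference in question.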

Posted: Wed Aug 08, 2007 1:36 pm
by Victor
Mattias- I do not know the data gathering rates of the Agilent PDA detector that you are using. Can you find this out? Here are the data gathering rates of the Agilent variable wavelength detector. These are set through the peak width control on the data station:

Peak Width (min)   Response Time (sec)   Signal Data Rate (Hz)
<0.005             <0.12                 13.74
>0.005              0.12                 13.74
>0.010              0.25                 13.74
>0.025              0.5                  13.74
>0.05               1.0                   6.87
>0.10               2.0                   3.43
>0.20               4.0                   1.72
>0.40               8.0                   0.86


You can see that, at least on the VWD, the fastest data gathering rates are not too far off 20 Hz, but I am not sure whether the 1100 PDA matches these figures.

If you increase the path length of a cell, you will increase the signal by some factor as well as the noise. Therefore I am not sure that the signal/noise ratio, which is the crucial issue, will vary that much with the path length of the cell. But I am only guessing here.

Posted: Wed Aug 08, 2007 1:36 pm
by danko
But Mattias experiences 10 times more noise on the Acquity compared to the Agilent 1100 (i.e. 2 mAU and 0.2 mAU respectively).
Also, who says that the Agilent was set to collect 1 point per second? It can go up to 40 Hz, so we need this information in order to evaluate the effect of the sampling rate.

Best Regards

Posted: Wed Aug 08, 2007 1:45 pm
by Victor
Danko- you are right.

The figures I gave were for the 1100 variable wavelength detector, which as you can see goes up to about 14 Hz.

The newer Agilent detectors (1200 series), I believe, work at up to 80 Hz. We need the setting that Mattias has used on what I have assumed is an Agilent 1100 PDA.

Posted: Wed Aug 08, 2007 1:47 pm
by danko
Sorry, Victor. I asked the same question as you.
Your post must have come in while I was writing mine.

Best Regards

Posted: Wed Aug 08, 2007 1:53 pm
by Mattias
I went down in the lab to check the Agilent methods, and the sampling rate is set to "auto" in all methods (we use Chromeleon software for all instruments).

I guess that means the sampling rate depends on whether a peak is eluting or not? I must admit that I was not aware that the sampling rate could influence the noise level.

Posted: Wed Aug 08, 2007 7:45 pm
by Bruce Hamilton
I can't really contribute to the noise discussion, but my basic philosophy is that manufacturers have a reasonable number of clues and design their systems for typical usage. If this is a standard technique, Waters probably would/should have covered it.

If the noise is tested at low and high solution absorbance at different wavelengths, then Waters should be able to identify whether the detector is out of specification. I would have thought their service person would have performed that test early on. Maybe similar data could be obtained for the Agilent detector as well.

The 1100 DAD specification ( using ASTM protocols for short-term noise ) is ±0.01 mAU at 254 nm and 700 nm ( 10 mm cell, 2 sec response time, 4 nm slit width, 1 mL/min MeOH ). The 1100 VWD is ±0.0075 mAU under the same conditions. 2 seconds is rather long, but different settings could be used to evaluate the effect of the time slice.

I'm confused by the higher noise at 270 nm, as a D2 lamp should have much higher energy in that region. I'm starting to wonder if the problem has more to do with the software or data handling, but testing the detector should be the first choice.

It would have been quite nice to know what the mobile phase UV absorption was at 220 nm, but if it's not relevant, please don't waste time on it.

Please keep having fun,

Bruce Hamilton

Posted: Wed Aug 08, 2007 9:11 pm
by unmgvar
Mattias,

In the first posting you said that you had a Waters tech "sleeping over", so you must have done the basic drift and noise tests on the instruments. What did you get?
Did the tech need to run the test several times in order to pass it? Did he do something in order to get the test to pass?
For example, in our SOP for these tests we use water, and HPLC-grade water from different sources gives very different results.

The test is of course done at a specific data rate of the detector. What are the test specs for the detector? They can give you a rough idea of where you stand with your application compared to the supplier's test.

Posted: Wed Aug 08, 2007 9:25 pm
by danko
I went down in the lab to check the Agilent methods, and the sampling rate is set to "auto" in all methods (we use Chromeleon software for all instruments).
I'm not familiar with Chromeleon, but I guess "auto" …

Posted: Thu Aug 09, 2007 5:51 am
by Mattias
The detector has always fulfilled the Waters specification. When we kept complaining, they exchanged the entire detector, but the behaviour was the same. I have not dug into the test that they are performing.

The Acquity is the only instrument that is not controlled by Chromeleon, but by MassLynx. I can see that Chromeleon has a detector parameter called "Average" which is always set to "On". That seems to be a noise-reducing function.

I think I need to work more systematically with this problem to really sort it out. I have got a lot of good suggestions from all of you; thank you very much! I'll let you know what comes out of this.

Posted: Thu Aug 09, 2007 5:10 pm
by Mark Tracy
On a Dionex instrument, a data collection rate of auto means that the sample rate is adapted to the rate of change of the signal value. I don't know about anyone else's hardware. Personally, I try to avoid that setting. Newer versions of Chromeleon have a wizard to optimize the detector's data collection rate and filter constant.

The filter constant (rise time or time constant) can have a large effect on noise, especially if the noise is short-term.
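To illustrate that point, here is a toy sketch (a crude boxcar average standing in for a real rise-time filter, which every vendor implements differently; the numbers are synthetic, not from any instrument):

```python
import random
import statistics

# Toy illustration: averaging n raw samples per stored point reduces
# the standard deviation of white noise by roughly sqrt(n).
random.seed(0)
raw = [random.gauss(0, 1) for _ in range(80_000)]   # synthetic white noise

def boxcar(signal, n):
    """Average consecutive blocks of n samples (a crude filter constant)."""
    return [sum(signal[i:i + n]) / n for i in range(0, len(signal), n)]

for n in (1, 4, 16):
    sd = statistics.pstdev(boxcar(raw, n))
    print(f"averaging window {n:2d}: noise sd ~ {sd:.3f}")
```

With purely short-term (white) noise, each 4x increase in the averaging window halves the baseline noise; slow drift is not helped this way.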

Posted: Thu Aug 09, 2007 5:37 pm
by Bruce Hamilton
My limited understanding of the Agilent 1100 DAD is that sampling rate is constant, and determined by initial DAD settings ( with a guide that suggests the sampling frequency for expected peak width ). That suggests the signal intensity does not affect the sampling rate.

I suppose the higher frequency systems ( eg 1200 ) might change it during the run to reduce data size, but that could be dangerous for gradient systems.

The peak width setting ( which is part of data processing ) does not affect the sampling rate on the 1100 DAD.

Please keep having fun,

Bruce Hamilton

Posted: Thu Aug 09, 2007 6:53 pm
by danko
Hi Bruce,

What is the sampling rate in "auto"?

Posted: Thu Aug 09, 2007 9:29 pm
by Bruce Hamilton
My understanding is that the setting is determined by the type of chromatography, and Agilent says it also depends on what signal-to-noise you want to tolerate. I think 5 Hz may be the default, with the following choices available for the 1100 DAD.

Peak Width (min)   Response Time (sec)   Data Rate (Hz)
0.01                0.2                  20
0.02                0.5                  10
0.05                1.0                   5
0.10                2.0                   2.5
0.20                4.0                   1.25
0.40                8.0                   0.6
0.80               16.0                   0.3

They recommend setting the detector to the peak width of a narrow peak of interest, which should give a response time of about 1/3 of the peak width, producing a <5% reduction in peak height and <5% extra peak dispersion.

Decreasing the peak width setting from the recommended value will provide a <5% increase in peak height, but baseline noise increases by a factor of 1.4 for every factor-of-two reduction in response time.

Increasing the response time by a factor of 2 from the recommended setting will reduce peak height by about 20% and reduce baseline noise by a factor of 1.4. The peak resolution may decrease, but the signal-to-noise ratio would be optimal at that setting.
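Putting those rules of thumb into numbers (the 1.4x, 20% and 5% figures are taken from the guidance above; the rest is just arithmetic):

```python
# Relative peak height and noise versus the recommended response time (RT).
settings = {
    "recommended": {"height": 1.00, "noise": 1.00},
    "RT x2":       {"height": 0.80, "noise": 1.00 / 1.4},  # -20% height, noise / 1.4
    "RT / 2":      {"height": 1.05, "noise": 1.40},        # <5% height gain, noise x 1.4
}

for name, s in settings.items():
    snr = s["height"] / s["noise"]
    print(f"{name:12s} S/N relative to recommended: {snr:.2f}")
```

So doubling the response time buys about a 12% S/N improvement (at the cost of resolution), while halving it costs roughly a quarter of the S/N.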

Note that it's confusing that they refer to peak width both in setting up the detector and when integrating data, but my understanding is that the initial detector setup defines the actual sampling rate, which presumably is fixed for the duration of the run.

Please keep having fun,

Bruce Hamilton

Posted: Fri Aug 10, 2007 10:31 am
by Mattias
Hi again,

I have now performed some new tests:

1. I stopped the flow and measured the noise, and it turned out to be identical to when the flow was on (about 2 mAU at 280 nm).

2. I played with the two parameters that were possible to change: sampling rate and time constant. They were varied from 5 to 20 Hz and from 0.2 to 2 sec, respectively. They had a great impact on the baseline, but I still need to see the influence on the S/N. (Wavy baseline at a high time constant.)

3. I changed the flow cell to a 10 mm cell, and voilà, the baseline noise was reduced to about 0.5 mAU (with or without flow). It is still about double the noise I get on an Agilent or Alliance.

I wonder if the 25 mm flow cell can be dirty? Can a dirty flow cell cause high noise when the flow is off? I guess it would have to be extremely dirty to block all the light.
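A quick sanity check on these numbers (just the arithmetic on the measurements reported above; the linear path-length scaling of noise is only an assumption):

```python
# Noise figures reported above for the two Acquity flow cells.
noise_25mm = 2.0   # mAU with the 25 mm cell
noise_10mm = 0.5   # mAU after swapping in the 10 mm cell

observed = noise_25mm / noise_10mm   # observed noise reduction
path_ratio = 25 / 10                 # what linear path-length scaling predicts

print(f"observed noise ratio:   {observed:.1f}x")    # 4.0x
print(f"path-length ratio only: {path_ratio:.1f}x")  # 2.5x
# The observed drop (4x) exceeds the path-length ratio (2.5x), so path
# length alone would not explain the whole difference, which is consistent
# with suspecting something else (cell condition, optics) as well.
```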