
Baseline Drift & Noise

Posted: Thu Feb 24, 2005 12:54 am
by Sallybeetle
I have searched many references for procedures and specifications for detector noise and drift.

The general consensus seems to be to equilibrate the system in 100% degassed water (deionized, WFI, etc.) and record the trace for 5 minutes to 1 hour. Noise = the difference between the maximum and minimum within each 30-second segment. Drift = the difference between the absorbance maximum and minimum over the full recording period, expressed as AU/minute or AU/hour.
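
For reference, here is a minimal sketch of that calculation in Python/NumPy, written against my reading of the procedure above; the sampling rate, segment length, and the simulated trace at the end are assumptions for illustration, not part of any standard.

[code]
import numpy as np

def noise_and_drift(absorbance, sample_rate_hz, segment_s=30.0):
    """Peak-to-peak noise per segment and overall drift from a baseline trace.

    absorbance     -- 1-D array of detector readings (AU), e.g. a degassed-water baseline
    sample_rate_hz -- detector data rate in points per second (assumed known)
    segment_s      -- segment length for the noise calculation (30 s per the procedure above)
    """
    pts_per_segment = int(segment_s * sample_rate_hz)
    n_segments = len(absorbance) // pts_per_segment

    # Noise: max minus min within each 30-second segment, averaged across segments
    segments = absorbance[:n_segments * pts_per_segment].reshape(n_segments, pts_per_segment)
    noise_au = (segments.max(axis=1) - segments.min(axis=1)).mean()

    # Drift: overall max minus min, normalized to AU per hour
    duration_hr = len(absorbance) / sample_rate_hz / 3600.0
    drift_au_per_hr = (absorbance.max() - absorbance.min()) / duration_hr

    return noise_au, drift_au_per_hr

# Example with a simulated 30-minute baseline at 1 Hz (hypothetical data)
trace = 1e-5 * np.random.randn(1800) + np.linspace(0.0, 5e-4, 1800)
print(noise_and_drift(trace, sample_rate_hz=1.0))
[/code]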

Are there any better procedures around?

Also, I cannot seem to find any industry-standard specifications for these tests. Since we have several different types of HPLC systems, manufacturers, and detectors, I do not think it wise (or necessary) to hold each system to its manufacturer's tight specifications. I would like to assign a generic, reasonable specification for all the systems to be calibrated against.

Posted: Thu Feb 24, 2005 7:44 pm
by Mark Tracy
First, you may want to check ASTM. (I would, but www.astm.org is inaccessible right now.)

Most chromatography data systems have a built-in algorithm for calculating noise and drift. If your facility has standardized on one software, just use (and document) that as your algorithm. If you have a few odd instruments, you can export/import the data to your usual software for the calculation.

For instance, the Dionex software computes a regression line through a section of baseline. The slope is the drift, and the noise is the sum of the maximum and minimum excursions from that line. Other software uses the RMS instead.
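
To make that concrete, a rough sketch of the regression-line approach (my own interpretation, not the actual Dionex algorithm): fit a least-squares line to the baseline section, report its slope as drift, and report either the peak-to-peak excursion from the line or the RMS of the residuals as noise.

[code]
import numpy as np

def regression_noise_drift(time_min, absorbance):
    """Drift as the slope of a fitted baseline; noise from the residuals.

    time_min   -- time axis in minutes
    absorbance -- baseline readings in AU
    """
    # Least-squares line through the baseline section
    slope, intercept = np.polyfit(time_min, absorbance, 1)
    residuals = absorbance - (slope * time_min + intercept)

    drift_au_per_min = slope
    # Peak-to-peak: largest excursion above the line plus largest excursion below it
    noise_p2p = residuals.max() - residuals.min()
    # Alternative used by some software: RMS of the residuals
    noise_rms = np.sqrt(np.mean(residuals ** 2))

    return drift_au_per_min, noise_p2p, noise_rms
[/code]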