
ASTM Noise - Agilent 1100 RI

Discussions about HPLC, CE, TLC, SFC, and other "liquid phase" separation techniques.

13 posts Page 1 of 1
So we are running through an OQ procedure and part of that is a noise test using 15 minutes of baseline and a drift test using 60 minutes.

When I use Chemstation to calculate the baseline noise I am getting at best about 3.0 nRIU. The specification in the manual is +/- 2.5 nRIU, which I think is a pretty standard number for RI detectors.

However, does +/- 2.5 nRIU mean a total of 5 nRIU? Is Chemstation reporting the total noise or the +/- value? I've done the ASTM method by hand before, and I always thought that the noise value is the total noise, meaning the distance between the top and bottom segments with which you enclose the baseline.

The conditions I'm using are from the manual, pumping water at 1 mL/min through a restriction capillary.

Worst case, if the noise value I'm getting is +/- 3.0 nRIU and it does not pass the spec, what's the easiest way to justify to quality? An instrument doesn't always have to meet factory specs, am I correct?

If it's an Agilent detector, I would hope that Agilent's spec for noise is based on however Chemstation does it. That would have to be confirmed by someone at Agilent :wink:

If the detector is new, and it is out of spec, then it's Agilent's problem. If it's been in service for a while, then you need to look at how it performed the last time it was qualified.
-- Tom Jupille
LC Resources / Separation Science Associates
tjupille@lcresources.com
+ 1 (925) 297-5374

If I understand you correctly, your total noise is 3 nRIU. The spec of the detector is +/- 2.5 nRIU, which is, as you concluded correctly, 5 nRIU. Since 3 is less than 5, your detector is functioning perfectly...
If you are following the Agilent IQ/OQ procedure, it likely sets Chemstation to the correct calculation for noise.

Unless you have a documented qualification procedure with a more generous limit than Agilent's, your instrument is no longer qualified, and all data since the last qualification may be treated as suspect - depending on the quality environment you work in.

If you're in the pharmaceutical industry or ISO 17025 environments, you'll be aware that system suitability assumes that your instrument is qualified, so you can't use any method system suitability tests to assert the instrument is OK.

At the very least, your Quality group should be requesting an investigation, and probably will want demonstrated OQ compliance before releasing the instrument from the quarantine of a failed OQ.
The hard part is showing that previous results ( after the last good qualification ) are OK.

It's probably best to talk to Agilent service people who are experienced with your detector to identify the cause of excessive noise ( and confirm the calculation ), as it could just be low lamp energy or a dirty refractometer.

Bruce Hamilton

Thank you for the thorough replies, gentlemen.

We have had this instrument for 5 years and have used it for research only. We have just started putting in a general OQ/PQ procedure for our HPLC instruments as we are entering Phase I clinicals, so there is no prior OQ, nor even a prior noise test, with which to compare. At best I could go back to some old chromatograms from various experiments.

The OQ procedure that we wrote is to compare the measured noise by ASTM to what the detector performance specification is in the manual. As of yet, we have no SOP methods using the RI detector that place a specific demand on the signal to noise ratio, so the manual spec is the only thing we had to go by.

Bruce, this is not an OQ procedure supplied by Agilent...we simply run under the conditions listed in the manual and measure the noise with Chemstation. Is setting the spec to what is in the manual standard practice, or is that only the ideal case when the instrument is brand new?
I've run this procedure on other, older instruments including a Waters 2410 and Shimadzu RID-6A and was able to meet what was in the manual (I know...that probably answers my question).

Uwe, to your point, I don't understand why they report the value as +/- 2.5 nRIU. I've done the ASTM method graphically before. There is nothing that says to divide it by 2 and report as a +/- value, so I will have to clarify with Agilent why they report that way. 2.5 nRIU total noise seems a common spec for RI detectors.

OK. The following are only my opinions, and others with more experience may offer different perceptions. I hope it's relevant to where you're headed....

1. It's important that all your qualifications are performed under the agreed overall quality system, eg protocols signed off in advance, and testing data signed off and released by Quality. One of the major expletives in quality systems is "retrospective", so try to avoid it...

Don't qualify equipment, other than for your peace of mind, without prior quality assurance signoff of the protocols, as the document won't be acceptable to auditors.

2. An instrument specification is a specification, so... unless the specification has an age/use-related component ( not very common ), the original manufacturer's performance specification is used throughout the instrument's working life - unless the equipment has been modified/upgraded, which also has to be well documented, and there must still be specifications, as Quality expects pass/fail limits.

Manufacturers' service and support agreements commit to keeping the instrument within the initial specification for the agreed period, provided you don't misuse the instrument, and keep to the maintenance programme etc. If an instrument can't meet the specification, it's supposed to be quarantined until repaired/replaced. Unlike for humans, euthanasia is often the best solution for non-performing analytical toys in regulatory environments.

3. It's possible to have bespoke instrument specifications. Auditors get seriously nervous twitches ( to accompany the drooling when they find a non-compliance ), whenever a specification is different to the manufacturer's.

The minimum energy path is to use the manufacturer's specifications, and omitting any of the manufacturer's recommended qualification tests can also trigger auditor spasms....

The documentation for any different specification has to be much more rigorous - explaining why the more relaxed limit is OK, and showing that the instrument isn't going to be used for other assays when the manufacturer's specification would be applicable.

4. As your instrument has enjoyed a varied and interesting life, don't bring that to the attention of auditors. Any evidence of quality infidelity would be seized upon, so don't mention it. Just start all the logs with a note that instrument was transferred from research status on xx date.

Have your quality people set a programme start date, and ensure all of the instruments are fully qualified, with all aspects of quality systems ( trained staff, instrument maintenance, documentation change control, controlled access, completed instrument qualifications etc. etc ) in place by that date.

It would be highly advisable to at least review, and document, the Agilent OQ/PV procedures, and also make sure that the current installation complies with critical requirements of the Agilent IQ. There are WWW firms that offer " DIY " qualification documents/systems for most popular HPLCs.

5. You may find it cheaper to avail yourself of an external instrument service/qualification contractor - partly because any maintenance etc. is supposed to be performed by trained/skilled people using calibrated equipment. One of the first things auditors look at is the training records of people using or repairing equipment.

6. Quality systems for regulatory compliance are resource intensive and expensive, but the cost of non-compliance is much, much greater...

7. Ensure your managers know they will be held accountable by auditors if appropriate resources were not provided when requested.
Don't try to save money by performing tests/maintenance without all the correctly-calibrated equipment, trained staff, and quality systems, as you will be held accountable.

If you want to train your managers, go to one of the sites like www.labcompliance.com and review some of the FDA-483 extracts applied to analytical facilities. A year or so ago, an FDA inspector noted that laboratory testing was the richest mine for their auditors.

Please keep having fun,

Bruce Hamilton

Noser,

Bruce gave some good recommendations/ideas.

I would add one more: it is important to know how the noise is being determined. Ask Agilent how they determine the noise value. You are correct that using "+/-" is unusual. It may be that they measure noise as peak noise, which can be both positive and negative. And that means that +/- 2.5 nRIU peak noise is 5 nRIU peak-to-peak.
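The arithmetic at stake in this thread can be spelled out in a couple of lines. The numbers are the ones quoted above (spec of +/- 2.5 nRIU, measured 3.0 nRIU); whether ChemStation's reported value really is peak-to-peak is the open question being discussed.

```python
# Spec quoted as +/- 2.5 nRIU, i.e. symmetric about the baseline centre.
spec_plus_minus = 2.5                    # nRIU
spec_peak_to_peak = 2 * spec_plus_minus  # 5.0 nRIU total

# Value reported by ChemStation, assumed here to be peak-to-peak.
measured_peak_to_peak = 3.0              # nRIU

passes = measured_peak_to_peak <= spec_peak_to_peak
print(passes)  # True: 3.0 nRIU is within a 5.0 nRIU peak-to-peak limit
```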

To learn more about noise measurements, there are several other discussion threads in this forum. Here is one:
http://www.sepsci.com/chromforum/viewtopic.php?t=7290

Regards,
Dan

Thanks again for the replies. That was some very useful info, Bruce. Just to reply about our protocol, it has been signed off by Quality; this is our first official run.

Also I called Agilent and they said that the noise spec as listed is for above and below the center of the baseline, hence +/- 2.5 nRIU is 5.0 nRIU total. I just find that a funny way to report it for the ASTM method.

Just to be safe, I'd still suggest having a look at the Agilent qualification documents, just to ensure that +/-2.5 isn't an excursion limit, eg excursions of -0.4 and +2.6 ( depending on the nature of the noise ) could be outside such a limit.

The qualification protocols for their detectors should describe the calculations ( not specific to Chemstation results ) and the duration and number of any calculations, such as breaking the data into time segments. They do show that data for the UV detectors I have.

Please keep having fun,

Bruce Hamilton

Bruce, I don't get it... Noise goes up and down to the same degree, i.e. it is by definition symmetrical. How can you have a noise that has a -0.4 and +2.6 feature? Isn't something like this a very unusual construct for standard noise?

The detector module output is the sum of all the noise sources and data processing, and I agree it's very unusual for unsymmetrical noise output.

I don't have immediate access to the ASTM method for RI detectors ( E1303 ), or to the Agilent qualification protocols for the RI detector, so can't even comment whether it's feasible in this case. If the ASTM method offers calculations/definitions for both +- or P-P, the whole issue disappears.

Auditors are likely to expect a documented confirmation of compliance, without assumptions, and a phone conversation with Agilent wouldn't cut the mustard with most of the auditors that I've encountered.

If Agilent have specified +-, rather than peak to peak, and it's not in the ASTM procedure, it could be their in-house testing procedure, or just a marketing strategy to have a lower numerical value in the specification - in which case applying a factor would be valid, and should be in their qualification documentation.

In my experience, auditors would expect to see the Agilent qualification protocols, which usually explain the noise calculations and define instrument settings for the test, and they could be concerned if an unattributed factor was applied that moved a numerical value into compliance.

If I was in this situation, for peace of mind about future audits, I would obtain the Agilent qualification protocols, so that I understood how the Agilent limit was defined, and would also have clear documentation explaining why applying the factor was valid.

Please keep having fun,

Bruce Hamilton

All this quibbling about the sensitivity of the detector seems unimportant to me. Surely the main thing is the signal-to-noise ratio of your peaks of interest.
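For what it's worth, a minimal sketch of that practical S/N check, assuming the common pharmacopoeial convention S/N = 2H/h (H = peak height from the baseline, h = peak-to-peak noise of a blank baseline region); the thread doesn't fix a convention, so treat the formula as an assumption.

```python
def signal_to_noise(peak_height, baseline_noise_pp):
    """S/N per the common pharmacopoeial convention S/N = 2H/h,
    where H is the peak height measured from the baseline and h is
    the peak-to-peak noise of a blank region of the chromatogram."""
    return 2.0 * peak_height / baseline_noise_pp
```

For example, a 50 nRIU peak over 5 nRIU peak-to-peak noise gives S/N = 20, which is what matters for the assay regardless of how the detector spec is quoted.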

I realize that in some industries qualification is now a requirement - to me it's mostly a waste of time. Instruments that are perfectly satisfactory can spend months waiting to be used while all the qualification documents are generated and signed off by numerous people.

What is important to me is how the instrument operates on the day it is used.

Even if an instrument has passed all the tests, that can give a false sense of security.
No Tswett

I understand that for most applications some parameters are not crucial. However, quality relies on a series of topics.

When you purchase an item, you must be sure it meets the manufacturer's specifications and that the manufacturer's specifications meet your own requirements. If you have documented your purchase criteria based on specs, you have a good start.

Then, before you install the equipment, you first need to assure your site meets the equipment requirements - otherwise any results may be false.

Now that you've done all that, it's time for testing the equipment. It must meet your requirements (if you haven't defined them before purchasing, you should assume the manufacturer specs). The protocol should be defined and approved before testing. If it passes, you are ready to use it.

If you haven't done any of the above and have used the equipment for some time, it could be worthwhile to perform a thorough maintenance on the system before qualifying it. Remember that noise might come from several sources, including the pump, column oven and the detector itself, which might have become dirty at some point. I also assume you have warmed up the detector - RIDs tend to be noisy and drifty when freshly turned on.

AdrianF is correct, the most important thing is how the instrument performs on the day it is used. That's why you must check it often and run a system suitability test, performance qualification - you name it. But again, you can't be 100% sure it is working properly if you don't test it against standardized protocols.

Cheers!

