
Qualifying software integration and regression analysis

Discussions about chromatography data systems, LIMS, controllers, computer issues and related topics.

Does anyone do this by printing a paper copy of the chromatograms, cutting out the peaks, weighing them on a balance, and entering the weights into Excel for regression analysis? Given that a mass spec might be linear over 5 or more orders of magnitude, and paper weights are probably good for at best one order of magnitude, I am trying to figure out whether this is a legitimate qualification of an instrument's functional capability. I won't say in which EQ Plan I saw this... :?
All standard disclaimers apply. My posts are my opinions only and do not necessarily reflect the policies of my employer.
No!

PS - UV detection is frequently good to 7 orders of magnitude...
Thanks,
DR
The last time I heard of this type of analysis being suggested was when HPLC chromatogram generation was transitioning from chart recorders to computing integrators. That was back in the '80s.

Nowadays many suppliers provide test data: you process the supplied data set and compare the results generated by your system with those provided by the manufacturer.
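
If it helps to see what that comparison amounts to in practice, here is a minimal Python sketch, assuming the supplier provides the expected results as a simple CSV of peak names and areas. The file names, column names and the 0.1% acceptance tolerance are all hypothetical; the real format and criteria come from the vendor's documentation and your own plan.

[code]
# Minimal sketch: compare results processed on your system against a
# vendor-supplied reference set. File names, column names and the 0.1 %
# tolerance are hypothetical; use whatever the supplier actually provides.
import csv

def load_results(path):
    """Read peak name -> area from a simple two-column CSV (peak, area)."""
    with open(path, newline="") as f:
        return {row["peak"]: float(row["area"]) for row in csv.DictReader(f)}

reference = load_results("vendor_reference_results.csv")
observed = load_results("my_system_results.csv")

for peak, ref_area in reference.items():
    obs_area = observed.get(peak)
    if obs_area is None:
        print(f"{peak}: missing from our results")
        continue
    rel_diff = abs(obs_area - ref_area) / ref_area
    status = "PASS" if rel_diff <= 0.001 else "FAIL"
    print(f"{peak}: ref={ref_area:.4f} obs={obs_area:.4f} "
          f"rel. diff={rel_diff:.2e} {status}")
[/code]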
Good judgment comes from bad experience, and a lot of that comes from bad judgment.
There are a few approaches to verifying integration methods within chromatography data systems:

1) Computationally generated test data: get some test data files in a generally readable format (like ANDI) that have known features (area, height, etc.). Import them into your system and compare (a small Python sketch of this idea follows the list).

2) Electronically generated test data: there was at least one manufacturer of a chromatography test signal generator available ~10 years ago that would produce a test signal with known parameters.

3) Acquire/swap the same data in your system and some other system you trust, and compare the results.

4) Talk to the vendor, convince yourself they understand the issue, and then assume they got it right enough for your purposes.
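
As a rough illustration of option 1, here is a minimal Python sketch that builds a chromatogram containing one Gaussian peak of exactly known area plus a little noise. The numbers are arbitrary and the actual ANDI/netCDF export is left out; the point is just that the integrator under test should reproduce the known area within your acceptance tolerance.

[code]
# Sketch of option (1): a synthetic chromatogram with a peak of exactly known
# area, for checking what an integrator reports. ANDI/netCDF export is omitted.
import numpy as np

SAMPLE_RATE_HZ = 10.0   # data points per second
TRUE_AREA = 5.0         # exact peak area (signal units * seconds)
RETENTION_S = 120.0     # retention time, seconds
SIGMA_S = 3.0           # Gaussian peak width (standard deviation), seconds

t = np.arange(0.0, 300.0, 1.0 / SAMPLE_RATE_HZ)
peak = (TRUE_AREA / (SIGMA_S * np.sqrt(2.0 * np.pi))
        * np.exp(-0.5 * ((t - RETENTION_S) / SIGMA_S) ** 2))
noise = np.random.default_rng(0).normal(0.0, 1e-4, t.size)
signal = peak + noise

# Trapezoidal integration of the whole trace gives a reference value that the
# data system under test should reproduce (to within the noise).
dt = 1.0 / SAMPLE_RATE_HZ
trapezoid_area = float(np.sum((signal[:-1] + signal[1:]) * 0.5 * dt))
print(f"true area = {TRUE_AREA}, trapezoid estimate = {trapezoid_area:.4f}")
[/code]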

For more info you might like to read:

Dyson, N. 1990, Chromatographic Integration Methods, Royal Society of Chemistry.
Papas, A. N. & Tougas, T. P. 1990, "Accuracy of Peak Deconvolution Algorithms within Chromatographic Integrators", Anal. Chem., vol. 62, no. 3, pp. 234-239.
Dyson, N. 2006, "Peak distortion, data sampling errors and the integrator in the measurement of very narrow chromatographic peaks", Journal of Chromatography A, vol. 842, pp. 321-340.
Di Marco, V. B. & Bombi, G. G. 2001, "Mathematical functions for the representation of chromatographic peaks", Journal of Chromatography A, vol. 931, no. 1-2, pp. 1-30.
Le Vent, S. 1995, "Simulation of chromatographic peaks by simple functions", Analytica Chimica Acta, vol. 312, no. 3, pp. 263-270.
Sekulic, S. & Haddad, P. R. 1988, "Effects of peak tailing on computer optimisation procedures for high-performance liquid chromatography: I. Characteristics of tailed peaks under optimisation conditions", Journal of Chromatography A, vol. 459, pp. 65-77.
...and pretty much anything written by John Dolan....
Paul Hurley (http://www.paulhurley.co.uk)
This might be an aside, but does anyone know where to find such test data? I'm working on a couple of different integrators, and while I could generate some ANDI files with Gaussian peaks, simulated noise, etc., I'd prefer to check against a standard that's been run against other people's integrators too.
A standard data file would be nice--not sure if the vendor has such a thing for this particular instrument (an ICP-MS). Certainly the software looks like it can only import vendor-specific data files. Guess I need to make a phone call.
All standard disclaimers apply. My posts are my opinions only and do not necessarily reflect the policies of my employer.
Ooh, I'd be really scared of a standard data set with an expected outcome. I'd worry that software developers might end up optimising their integrators so they handle the standard well, without reckoning with how well the integrator handles everything else. At the very least you'd need a recognised suite of chromatograms covering the vast range of defects that can happen in real data.

Personally I rely on the vendor to have chosen a valid integration algorithm, and implemented it correctly. I concentrate on making sure that my methods aren't the weakest link in the chain. But I'm not in a regulatory environment so I don't actually have to comply with anything.

The assumption that vendors write their software correctly is possibly slightly dubious; about 10-15 years ago, I was using a UV/Vis spectrophotometer that could draw a regression line through a set of points to calculate rate of change of OD. It also calculated the error on the slope. We (a co-worker and I) noticed one day that if we set up the cuvette changer and used a very short piece of slope, such that there were only two measurements of OD, there was still a finite error on the slope, even though there's only one possible value for what the slope can be! This was clearly a mistake.

I'm afraid we did something back then that would today probably be considered illegal: worked out the file format of the raw data file. It turned out to store about 5 evenly-spaced "measurements" per actual physical measurement that the instrument made, the rest being interpolated (the cuvette wasn't physically in the light-beam for most of the time, so the data must have been interpolated rather than measured). You could see this on screen, actually, because if you quickly diluted the cuvette while it was out of the light-beam, instead of jumping to the new OD, the trace on screen would oscillate a bit before and after, presumably as the interpolation algorithm smoothed its way through the jump. I believe what happened was that one part of the software team wrote the data-collection bit and put in the interpolated points, while another team wrote the statistics of slope-determination and didn't realise that the first team had stored non-measured data. I don't know how common this sort of thing is with modern manufacturers.
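
For what it's worth, the two-point case really cannot have a meaningful error on the slope: the usual formula divides the residual sum of squares by n - 2 degrees of freedom, which is zero when n = 2. A minimal sketch (plain least-squares in Python, made-up numbers, and certainly not the instrument's own algorithm):

[code]
# Why a two-point fit cannot have a finite slope error: the residual variance
# uses n - 2 degrees of freedom, which is zero when n = 2. Data are made up.
import math

def slope_and_its_error(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    residual_ss = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    if n <= 2:
        return slope, float("nan")   # zero degrees of freedom: no error estimate
    return slope, math.sqrt(residual_ss / (n - 2) / sxx)

print(slope_and_its_error([0.0, 1.0], [0.10, 0.35]))   # (0.25, nan)
print(slope_and_its_error([0, 1, 2, 3, 4],
                          [0.10, 0.36, 0.58, 0.86, 1.09]))  # finite error
[/code]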

Cutting out paper chromatograms: yup, seen it done, but completely unnecessary in the post chart-recorder era. The daftest one I ever saw was a scientist who was measuring leaf area by photocopying leaves and cutting out the shapes from paper. To increase accuracy, he suggested to his students that they should set the photocopier to enlarge from A4 to A3. Amazed, the students found this exactly doubled the weights (as it should: A3 has exactly twice the area of A4, so any enlarged shape doubles in weight regardless of its outline). But since the scientist was sceptical (this is the daft bit), he made them spend most of a day doing this on a range of geometrical shapes (squares, circles, stars, triangles, L-shapes, the lot) to convince himself the effect was independent of shape. Experimental geometry in action...

Integration has been around for so long, and is so central to any chromatography system, that I'd be very surprised if major manufacturers are getting it seriously wrong.
No, you don't need to weigh paper. Your validation should normally follow a plan of DQ/IQ/OQ/PQ, and the extent of these depends on the degree of customisation of the software (see GAMP 5 for this sort of thing). Typically chromatography software from the big boys has the DQ bit already done, and you can do a combination of supplier audit (an easy desk-top job) and certificates from them to prove this. They often also have automatic IQ and maybe simple OQ tools. This leaves you to put your validation effort into the bits you've customised yourself, i.e. system policies, custom fields, reports etc. So no, integration and the in-built quantification stuff doesn't need ridiculously intense validation. Equally, you don't need to validate everything; use a risk-based approach to cut the million functions down to something more sensible.

It is useful to keep a data set from the validation or other work to use as an OQ trial data set after you apply patches etc., but then you are only comparing your results to previous ones. Waters, for example, have an automatic IQ/OQ tool, and there you would of course expect the same results on any installation; that's a bit different.

Be wary of using Excel for proving something else, as you might need to validate Excel! You can do regression analysis on paper if you really want; it's not too onerous, and you'd only need 3 points to show it works.
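
For anyone who does want to do the regression by hand, the closed-form least-squares expressions are short enough to work through with three points. A quick sketch with made-up numbers; the same sums are what you would check against Excel's SLOPE, INTERCEPT and RSQ functions if you go the spreadsheet route.

[code]
# Hand-calculable least-squares check with three points (made-up data):
# slope = S_xy / S_xx, intercept = ybar - slope * xbar, r^2 = S_xy^2 / (S_xx * S_yy)
x = [1.0, 2.0, 3.0]       # e.g. standard concentrations
y = [10.1, 19.8, 30.2]    # e.g. peak areas

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
syy = sum((yi - ybar) ** 2 for yi in y)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

slope = sxy / sxx
intercept = ybar - slope * xbar
r_squared = sxy ** 2 / (sxx * syy)
print(f"slope={slope:.4f} intercept={intercept:.4f} r^2={r_squared:.5f}")
[/code]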

Hope that all makes sense?
Where can I buy the kit they use in CSI?