Agilent ChemStation peak area (response) calculation

Discussions about GC and other "gas phase" separation techniques.

3 posts Page 1 of 1
Hi,

I'm trying to replicate the calculation that Chemstation uses to calculate response.

To simplify things I have integrated part of a peak from one data point to the next. I calculated the area in units of time (seconds) x abundance, but it comes out higher than the measured response (my result is 197622 vs. a measured response of 176993). I have taken into account that the integration baseline does not start and end at zero abundance.
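For reference, the calculation described above (trapezoidal area between consecutive data points, minus the area under a sloping baseline) can be sketched as follows. This is only an illustration of the poster's approach, not ChemStation's internal code; all names are hypothetical.

```python
# Hypothetical sketch: trapezoidal integration of a peak segment above a
# linear (non-zero, sloping) baseline, in units of abundance * seconds.
def peak_area(times_s, signal, base_start, base_end):
    """times_s    -- time of each data point, in seconds
    signal     -- detector abundance at each point
    base_start -- baseline abundance at the first point
    base_end   -- baseline abundance at the last point
    """
    span = times_s[-1] - times_s[0]
    area = 0.0
    for i in range(len(times_s) - 1):
        dt = times_s[i + 1] - times_s[i]
        # baseline interpolated linearly across the integration window
        b0 = base_start + (base_end - base_start) * (times_s[i] - times_s[0]) / span
        b1 = base_start + (base_end - base_start) * (times_s[i + 1] - times_s[0]) / span
        # trapezoid of (signal - baseline) over this interval
        area += 0.5 * ((signal[i] - b0) + (signal[i + 1] - b1)) * dt
    return area
```

With a flat baseline at 10 counts and a signal rising from 10 to 20 counts over one second, this gives 5.0 count-seconds.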

Any other method I try gives a similar result.

Can anyone shed any light on how ChemStation calculates the area?

The manual says that the units are counts x msec, but if I use milliseconds my result becomes 1000x too high.
----suffers separation anxiety----
tangaloomaflyer wrote:
I'm trying to replicate the calculation that Chemstation uses to calculate response. [...] Can anyone shed any light on how chemstation calculates the area?


Do you have a GC/MSD?
I wish instrument manufacturers made it easy to figure out what math their software is using!

It depends on whether you use the Chemstation integrator or the RTE integrator. I did a dive into the macros (shudder) a while ago because I wanted to switch to the latter, but wanted to understand the switch before I made it.

I think the RTE integrator measures area in units of counts * seconds, and that it really does use that simple equation.

The Chemstation one seems to use counts * milliseconds / 100. I remember a note in the macro suggesting that the 100 was just a fudge factor to keep the numbers manageable. There is a little more to it, though: I can't remember whether I was ever able to replicate its calculations exactly. The small difference that remains may come from rounding when converting from scans or minutes to seconds. I do know the macro (enh_grph.mac) switches units at some point (it divides by 60,000, which takes you between milliseconds and minutes).
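The unit scalings described above can be sketched as below. This is a hypothetical reconstruction from the poster's recollection, not documented Agilent behaviour; in particular the "/ 100" fudge factor is unconfirmed, and all function names are illustrative.

```python
# Illustrative unit conversions for the two integrators as recalled above.
# Starting point: a raw area in counts * minutes.

def minutes_to_ms(t_min):
    # 1 minute = 60,000 ms (the reverse of the divide-by-60,000 step
    # reportedly seen in enh_grph.mac)
    return t_min * 60_000.0

def rte_area(area_counts_min):
    # RTE integrator: reportedly counts * seconds
    return area_counts_min * 60.0

def chemstation_area(area_counts_min):
    # Classic ChemStation integrator: reportedly counts * ms / 100,
    # i.e. 600x the counts * minutes figure
    return minutes_to_ms(area_counts_min) / 100.0
```

On this reading, the two integrators' numbers for the same peak would differ by a constant factor of 10 (60,000 / 100 vs. 60), which is consistent with the OP seeing a fixed-ratio discrepancy rather than a shape-dependent one.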

I'm not sure if the OP is still interested, but if so I'll try to post anything else I remember.

