By Stephanie on Friday, August 13, 2004 - 02:39 pm:

I am relatively new to the world of GC and am currently trying to determine the linearity of our method using an ECD. After running a series of standards, I find that the instrument does not calculate an amount for standards that are higher than the highest one used in the calibration curve. I am using an Agilent 6890 with a quadratic fit. Is it typical under these circumstances that a calculated amount is not given if the area count is higher than that of the highest calibration standard? I ran a similar set of experiments with the FID and did not encounter this problem, even at 5x the concentration of the highest calibration standard.

-------------------------------------------------------------------------------------------------------
By carla on Sunday, August 15, 2004 - 02:28 pm:

Extrapolating your cal curves isn't allowed. Maybe you could get away with it on a linear curve fit, but with a quadratic fit? How would that even work?
I don't think you actually got good results with the FID at 5x either, if that curve was also quadratic.
Try running a two-point calibration and telling the software "quadratic curve fit": it's the same problem, it doesn't work, because three coefficients can't be determined from two points.
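A rough numpy sketch of why (the standards and area counts below are invented, just to illustrate): past the top standard the fitted parabola can bend over, so a large area may not map back to any concentration at all, which is presumably why the data system reports nothing.

    import numpy as np

    # Invented calibration data (concentration in ppm, detector area counts);
    # the response flattens toward the top, as an ECD's often does.
    conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
    area = np.array([120.0, 235.0, 560.0, 1040.0, 1810.0])

    # Quadratic calibration: area = a*conc**2 + b*conc + c
    a, b, c = np.polyfit(conc, area, 2)

    def area_to_conc(measured_area):
        # Invert the quadratic; keep the smallest positive real root.
        roots = np.roots([a, b, c - measured_area])
        real = roots[np.isreal(roots)].real
        positive = real[real > 0]
        return positive.min() if positive.size else float("nan")

    print(area_to_conc(560.0))       # inside the range: ~5 ppm, as expected
    print(area_to_conc(5 * 1810.0))  # above the top standard: nan, because
                                     # the parabola never reaches that area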

-------------------------------------------------------------------------------------------------------
By Manuel on Monday, August 16, 2004 - 04:42 am:

In the case of the ECD, it is well known that this detector isn't linear over a wide concentration range! With the ECD you must calibrate over a narrower, lower concentration range and strictly not extrapolate your cal curve! Then you can use a linear fit in the software!
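A sketch of that workflow in Python (numpy; the standards are invented, and the real linear range depends on your compound and detector):

    import numpy as np

    # Invented low-range ECD calibration: five standards kept inside the
    # region where the response is still linear.
    conc = np.array([0.05, 0.1, 0.2, 0.5, 1.0])   # ppm
    area = np.array([410.0, 830.0, 1650.0, 4100.0, 8300.0])

    slope, intercept = np.polyfit(conc, area, 1)

    def quantify(measured_area):
        # Refuse to extrapolate: anything above the top standard gets
        # flagged for dilution and a re-run instead of a number.
        if measured_area > area.max():
            return None
        return (measured_area - intercept) / slope

    print(quantify(4000.0))    # inside the range: ~0.48 ppm
    print(quantify(20000.0))   # above the top standard: None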

-------------------------------------------------------------------------------------------------------
By Stephanie on Monday, August 16, 2004 - 11:28 am:

Well, that is what I figured...just needed confirmation. Looks like the analysis just got that much more difficult.