Problem with loss of correlation
Posted: Mon Feb 03, 2014 1:56 pm
by apuetz
Hello,
I am running an Agilent 7890B GC. This is being used to measure the percentage of Ethanol in a solution. When we make our calibration curve the R^2 value is 0.99 up to the 60% standard. However, once we run the 70% standard and above we begin to lose correlation and the values do not fit as nice. Is there a way to fix this or is it just Beer's Law at work?
Re: Problem with loss of correlation
Posted: Mon Feb 03, 2014 3:09 pm
by Peter Apps
Why (and how) are you using Beer's law in a chromatographic analysis ??
Welcome to the forum.
Peter
Re: Problem with loss of correlation
Posted: Mon Feb 03, 2014 3:28 pm
by apuetz
Thanks for the welcome!
I wouldn't say Beer's law per se, but the same basic concept. I know Beer's law has to do with spectroscopy, but is there a similar concept for GC? What I am doing is creating a calibration curve, then testing a sample with an unknown percentage of ethanol. When I create a calibration curve up to 60%, the correlation is excellent, but at 70% and above I begin to lose that correlation. Is there a reason that when my standards reach above 70% they no longer fall onto the line of best fit?
Re: Problem with loss of correlation
Posted: Mon Feb 03, 2014 4:31 pm
by tom jupille
Is there a reason that when my standards reach above 70% they no longer fall onto the line of best fit?
Because that is above the linear range of the method. Either inject smaller volumes or dilute everything appropriately.
Re: Problem with loss of correlation
Posted: Mon Feb 03, 2014 4:46 pm
by apuetz
Tom,
I'm sorry, I may have misspoken. I created a calibration curve using 1%, 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, 90%, and 100%. The 1%-60% standards all fell perfectly onto my line of best fit and gave me a correlation value of 0.998. However, when I added the 70%-100% standards to the curve, my correlation fell to 0.960 and the points did not fall onto the line of best fit.
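A minimal sketch of the effect being described, using synthetic numbers (not the poster's actual detector responses): if the response is linear up to 60% and then flattens, the straight-line R^2 stays near 1.0 over 1-60% but drops once the high standards are included.

```python
import numpy as np

# Synthetic calibration: response is linear up to 60 %, then flattens
# (illustrative only -- not real detector data).
conc = np.array([1, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100], dtype=float)
resp = np.where(conc <= 60, 100.0 * conc, 6000.0 + 60.0 * (conc - 60))

def r_squared(x, y):
    """R^2 of an ordinary least-squares straight-line fit."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()

print(r_squared(conc[:7], resp[:7]))  # 1-60 % only: essentially 1.0
print(r_squared(conc, resp))          # full range: noticeably lower
```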
Re: Problem with loss of correlation
Posted: Mon Feb 03, 2014 6:31 pm
by DR
As Tom indicated, your standard set of concentrations is not all within the linear range of the detector. Once you hit or exceed the concentration of the 60% standard, you will have to dilute and adjust your sample calculations accordingly.
Beer's law applies (technically) to UV absorbance and thus to liquid chromatography where UV detection is used. The same style of external (or internal) standard calculations is typically useful no matter the type of detection employed (except for radioactivity detectors).
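The dilution bookkeeping described above amounts to simple arithmetic: measure the diluted sample within the linear range, then multiply back by the dilution factor. A sketch with hypothetical numbers:

```python
# Samples expected above ~60 % EtOH are diluted into the linear range,
# and the measured value is multiplied back by the dilution factor.
# (Hypothetical numbers for illustration.)

def back_calculate(measured_pct: float, dilution_factor: float) -> float:
    """Concentration in the original sample from the diluted measurement."""
    return measured_pct * dilution_factor

# e.g. a 1:2 dilution (factor 2) of a ~90 % sample reads ~45 % on the curve
print(back_calculate(45.0, 2.0))  # -> 90.0
```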
Re: Problem with loss of correlation
Posted: Mon Feb 03, 2014 6:38 pm
by apuetz
Ah ok. I understand now. Thank you all for your help!
Re: Problem with loss of correlation
Posted: Mon Feb 03, 2014 10:13 pm
by James_Ball
If the detector is stable but just not completely linear, you could try a quadratic fit for the calibration. But with any calibration, quadratic or linear, when you are covering such a large range, always be sure that it is still calculating accurately at the lowest concentration; that is where you will see the deviation the most.
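A minimal sketch of that check, with numpy and a synthetic, mildly curved response (not real data): fit both a straight line and a quadratic to the same points, then back-calculate the lowest (1%) standard from each fit. The quadratic recovers it; the forced straight line misses badly at the low end.

```python
import numpy as np

# Synthetic, slightly curved detector response (illustrative only).
conc = np.array([1, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100], dtype=float)
resp = 100.0 * conc - 0.2 * conc**2   # mild saturation at high concentration

lin = np.polyfit(conc, resp, 1)    # [slope, intercept]
quad = np.polyfit(conc, resp, 2)   # [a, b, c]

# Back-calculate the 1 % standard from each fit:
y1 = resp[0]
lin_calc = (y1 - lin[1]) / lin[0]

# For the quadratic, take the physically sensible root of a*x^2 + b*x + c = y1:
a, b, c = quad
quad_calc = (-b + np.sqrt(b**2 - 4*a*(c - y1))) / (2*a)

print(f"linear back-calc of 1 % std:    {lin_calc:.2f} %")
print(f"quadratic back-calc of 1 % std: {quad_calc:.2f} %")
```

The straight-line fit here even returns a negative concentration for the 1% standard, which is exactly the kind of low-end deviation worth checking for before adopting a wide-range calibration.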
Re: Problem with loss of correlation
Posted: Tue Feb 04, 2014 4:15 pm
by Peter Apps
Doing headspace of ethanol in water gives a calibration with a kink in it at some point; I don't recall where. It has to do with the transition from a solution of ethanol in water to a solution of water in ethanol, and the interactions between the -OH groups.
Peter
Re: Problem with loss of correlation
Posted: Wed Feb 05, 2014 3:44 am
by DR
Beer's law - not the same as Beer law.
