
Posted: Fri Apr 22, 2005 11:46 am
by HW Mueller
Weighting doesn't account for nonrandomness (curvature in my example) either.
(I had the taste of fudge last time I bothered with weighting, don't remember why...... thus the parentheses).

Posted: Fri Apr 22, 2005 1:00 pm
by DR
"If your experiment needs statistics, then you ought to have done a better experiment".
hehe -
...lies, damned lies & statistics -Mark Twain
When in doubt, r² coupled with a minimum of either 3x or 5x your noise level for LOD - LOQ determinations is a good (practical, if not necessarily theoretical) approach. Basically, you figure out your S/N, multiply by 3 or 5 and specify that as your minimum LOQ for trace analyses and your minimum LOD for more conservative calculations. It is more objective than a visual assessment.
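Here is a minimal sketch of that rule of thumb in Python; the baseline readings, peak height and calibration slope are invented values just to show the arithmetic.

```python
# Rough sketch of the S/N-based rule of thumb above (all numbers are hypothetical).
# The noise is estimated from a blank baseline region, then the 3x and 5x noise
# levels are converted to concentration through the calibration slope.

import numpy as np

baseline = np.array([0.11, 0.09, 0.12, 0.10, 0.08, 0.13])  # blank baseline readings (mAU), hypothetical
peak_height = 2.5                                           # analyte peak height (mAU), hypothetical
slope = 0.50                                                # calibration slope, mAU per ug/mL, hypothetical

noise = baseline.std(ddof=1)        # SD-based noise estimate (peak-to-peak is another common choice)
s_to_n = peak_height / noise

# Signal thresholds at 3x and 5x the noise, expressed back in concentration units
conc_3x = 3 * noise / slope
conc_5x = 5 * noise / slope

print(f"S/N = {s_to_n:.1f}")
print(f"3x noise corresponds to {conc_3x:.3f} ug/mL, 5x noise to {conc_5x:.3f} ug/mL")
```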

Posted: Mon Apr 25, 2005 6:49 pm
by Fernando
Hi

Look at "Preparation of Calibration Curves" in the VAM site!

hope it helps :D

Posted: Tue Apr 26, 2005 12:59 am
by syx
Fernando, I got the statement that correlation and linearity are only loosely related, and that the coefficient r is a measure of correlation, not a measure of linearity, from that article (see page 10).
That is why I asked what parameter is more suitable to interpret linearity than coefficient correlation. :wink:

Posted: Tue Apr 26, 2005 1:17 am
by tom jupille
I'm not sure that there is any way to "interpret" linearity. What you can do is compare a linear fit to some other fitting function. The sum of the squared residuals will then tell you which function is "better" (i.e., accounts for more of the variation).
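A minimal sketch of that comparison in Python, using an invented set of standards: fit two candidate functions and look at the sum of squared residuals for each.

```python
# Compare a linear and a quadratic fit by their sum of squared residuals.
# The standards below are invented for illustration only.

import numpy as np

x = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])   # concentrations, hypothetical
y = np.array([0.9, 2.1, 5.3, 10.8, 22.5, 60.1])   # responses, hypothetical

def ssr(coeffs, x, y):
    """Sum of squared residuals for a numpy polynomial fit."""
    return float(np.sum((y - np.polyval(coeffs, x)) ** 2))

lin = np.polyfit(x, y, 1)      # y = a*x + b
quad = np.polyfit(x, y, 2)     # y = a*x^2 + b*x + c

print("SSR linear   :", ssr(lin, x, y))
print("SSR quadratic:", ssr(quad, x, y))
# The fit with the smaller SSR accounts for more of the variation,
# but an extra parameter always lowers SSR, so judge the difference with care.
```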

Posted: Tue Apr 26, 2005 5:40 am
by bert
A nice method is described in the book 'Use of statistics to develop and evaluate analytical methods', Grant T. Wernimont, AOAC 1985, ISBN 0-935584-31-5. In brief, the procedure for investigating accuracy involves comparing the differences between the known true concentration and the measured concentration. This difference is due to method bias and/or random errors. When significant errors are absent, a straight regression line with a slope of 1 and an intercept of zero should be obtained by least-squares linear regression of the measured versus the known concentrations. Random errors show up as scatter of the values around the regression line and can cause (significant) deviation of the slope from 1 and/or of the intercept from zero. A proportional systematic error can be recognized by a deviation of the slope of the regression line from 1, while a constant systematic error will cause a deviation of the intercept from zero.

Statistical evaluation means testing whether the regression line Y = aX + b, in which Y = measured concentration and X = theoretical concentration, can be described by the equation Y = X (b = 0, a = 1). The data can also be used to calculate LOD and LOQ.
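A minimal Python sketch of that check, with invented concentrations (this is not the worked example from the AOAC book): regress measured on known and see whether 1 and 0 fall inside the 95 % confidence intervals of the slope and intercept.

```python
# Slope = 1 / intercept = 0 check via 95 % confidence intervals. Data are invented.

import numpy as np
from scipy import stats

known    = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])   # true concentrations, hypothetical
measured = np.array([1.1, 1.9, 5.2,  9.7, 20.6, 49.5])   # found concentrations, hypothetical

n = len(known)
x_mean = known.mean()
Sxx = np.sum((known - x_mean) ** 2)

slope = np.sum((known - x_mean) * (measured - measured.mean())) / Sxx
intercept = measured.mean() - slope * x_mean

resid = measured - (intercept + slope * known)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))        # residual standard deviation

se_slope = s / np.sqrt(Sxx)
se_intercept = s * np.sqrt(1.0 / n + x_mean ** 2 / Sxx)

t = stats.t.ppf(0.975, n - 2)                    # two-sided 95 % critical value
print(f"slope     = {slope:.4f} +/- {t * se_slope:.4f}")
print(f"intercept = {intercept:.4f} +/- {t * se_intercept:.4f}")
# The check passes if 1 lies inside the slope interval and 0 inside the intercept interval.
```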

In my experience, unfortunately, this approach is not accepted by authorities for bioanalytical method validation.

Posted: Thu Apr 28, 2005 7:20 am
by HW Mueller
(Back from a few days off)
The example I gave above passed all of Bert's "tests", yet there was that awful systematic error (up to 100%). The really bad part was that most patients' values were in the region where the carryover mattered. Could I have proceeded with this analysis and still have been within regulations???

Posted: Thu Apr 28, 2005 7:44 am
by bert
Hans, welcome back! I would like to see those data and play around with them ......

Posted: Thu Apr 28, 2005 4:24 pm
by Fernando
Hi syx,

I agree that plotting the data and using the eye is very useful while you are running your standards; I myself use a simple Excel spreadsheet to keep track of the data I am gathering.
But for true linearity I prefer to use a program like SPSS for Windows, where you can test first-, second- or third-degree fits between the data and the concentrations.
If the equation is y = mx + b, m is not 1 but another figure; I think this is because I am comparing concentrations in µg/mL with the HPLC response, which is in mAU. b is very near 0, or exactly 0.
You can also use ANOVA to test the linearity of your data.
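One way to read "use ANOVA to test linearity" is a lack-of-fit F-test, which needs replicate injections at each level. A minimal Python sketch with invented duplicate standards:

```python
# Lack-of-fit ANOVA for a straight-line calibration. Data are invented.

import numpy as np
from scipy import stats

conc = np.array([1, 1, 5, 5, 10, 10, 20, 20, 50, 50], dtype=float)   # duplicate standards, hypothetical
resp = np.array([1.0, 1.1, 5.2, 5.0, 10.3, 10.1, 20.8, 20.4, 51.5, 50.9])

# Straight-line fit
b, a = np.polyfit(conc, resp, 1)
fitted = a + b * conc

# Pure-error SS: scatter of the replicates around their own level means
levels = np.unique(conc)
ss_pe = sum(np.sum((resp[conc == c] - resp[conc == c].mean()) ** 2) for c in levels)
df_pe = len(resp) - len(levels)

# Lack-of-fit SS: how far the level means sit from the fitted line
ss_res = np.sum((resp - fitted) ** 2)
ss_lof = ss_res - ss_pe
df_lof = len(levels) - 2

F = (ss_lof / df_lof) / (ss_pe / df_pe)
p = stats.f.sf(F, df_lof, df_pe)
print(f"lack-of-fit F = {F:.2f}, p = {p:.3f}")
# A small p-value says the straight line does not describe the level means,
# i.e. there is evidence of non-linearity beyond the replicate noise.
```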

:D

Posted: Thu Apr 28, 2005 6:08 pm
by JI2002
I just want to jump in for this topic.

There are two statistical terms in regression analysis (which is not always simple linear regression, i.e. linear regression between two variables, a dependent variable y and an independent variable x): the correlation coefficient r and the coefficient of determination R^2. There is no true statistical meaning for r^2 and R, but in simple linear regression R^2 = r^2.

Although r is the quantitative measure of the correlation between two variables, in my opinion it can be used to evaluate the linearity of the curve. Remember that r^2 = R^2, and that for a perfectly linear curve r = 1 or -1.

Ordinary least squares (OLS) is the most commonly used technique for linear regression. OLS rests on some assumptions, two of the important ones being:
1) the residuals (y - ŷ) are normally distributed
2) the standard deviation of the residuals is the same at all levels of x

But in reality, the second assumption does not hold in the chromatography field. The variance of the residuals is smallest at the average of x and gets bigger as x gets bigger or smaller. In this situation, weighting is used to compensate for the violated assumption. This is also why you want the quantitative result as close to the average of x as possible when you do a dilution (not considering the reporting limit).

R^2 is the one you want to use to evaluate your curve, no matter what kind of model you are using; it can be
y = a + bx
or
y = a + bX1 + c X2 + ...
or
y = a + bX + cX^2 + dX^3 +...

In the second and third models, r is not a good term for evaluating the model.

Posted: Fri Apr 29, 2005 2:30 am
by JI2002
I want to correct myself and make more comments:

1) The correlation coefficient r is calculated from the data points collected in the experiments without doing any regression analysis, which means you can have an r without fitting any curve, whether the model is linear, quadratic or cubic.
So, statistically, r is not a measure of linearity but a measure of the strength of the correlation between two variables. But for simple linear regression R^2 = r^2, so if the SOP requires R^2 > 0.990, then mathematically, as long as you have r > 0.995, the curve is a good fit.
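A quick numerical check of that point in Python (standards invented): r comes straight from the data with no fit involved, and for a straight-line fit R^2 equals r^2.

```python
# r without any regression vs R^2 from the straight-line fit. Data are invented.

import numpy as np

x = np.array([1.0, 5.0, 10.0, 20.0, 50.0])
y = np.array([1.2, 5.1, 10.4, 19.8, 50.6])

r = np.corrcoef(x, y)[0, 1]                      # correlation coefficient, no curve fitted

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
r_squared = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

print(f"r = {r:.6f}, r^2 = {r**2:.6f}, R^2 from the fit = {r_squared:.6f}")
# For the straight-line case, r > 0.995 therefore implies R^2 > 0.990.
```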

2) In real-world examples, the residuals at the different levels of x don't have a constant variance; the variance is smallest at some point of x (I don't know if it's the average of x; it should be at the low end of the curve). In this situation, weighted least squares (WLS) should be used as the fitting technique.

Putnam, I think this is why running duplicates of each level is a good technique in calibration. If you do a scatter plot of your data, you can see that the standard deviation (SD) at high concentrations is bigger than the SD at low concentrations; this can help you figure out the weights for use in WLS.
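A minimal Python sketch of that idea, with invented duplicate standards: take the weights as 1/SD^2 of the duplicates (a common shortcut is simply 1/x or 1/x^2) and do the weighted straight-line fit in closed form.

```python
# Weighted least squares with weights estimated from duplicate standards. Data are invented.

import numpy as np

conc = np.array([1, 1, 5, 5, 20, 20, 50, 50], dtype=float)
resp = np.array([1.05, 0.95, 5.3, 4.9, 21.0, 19.2, 52.5, 47.8])

# Estimate a per-level SD from the duplicates and map it back onto every point
levels = np.unique(conc)
sd_by_level = {c: resp[conc == c].std(ddof=1) for c in levels}
w = np.array([1.0 / sd_by_level[c] ** 2 for c in conc])     # weight = 1/variance

# Weighted straight-line fit (closed form)
xw = np.sum(w * conc) / np.sum(w)          # weighted mean of x
yw = np.sum(w * resp) / np.sum(w)          # weighted mean of y
slope = np.sum(w * (conc - xw) * (resp - yw)) / np.sum(w * (conc - xw) ** 2)
intercept = yw - slope * xw

print(f"weighted fit: y = {slope:.4f} x + {intercept:.4f}")
# Compared with an unweighted fit, the low standards now pull their weight
# instead of being swamped by the noisier high standards.
```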

3) The confidence interval for y is smallest at the average of x, so the result will be more accurate if it falls into this region. For example, if a curve has five standards at 1, 5, 10, 20 and 50 µg/L, then a quantitative result of about 17 µg/L at 5x dilution of the sample is more accurate than a result of about 40 µg/L at 2x dilution.
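A numeric illustration of that point in Python: the standards match the 1, 5, 10, 20, 50 µg/L example above, the responses are invented, and the standard error of the fitted line is evaluated at a few concentrations.

```python
# The confidence band of the fitted line is narrowest near the mean of the standards.

import numpy as np

x = np.array([1.0, 5.0, 10.0, 20.0, 50.0])     # standards from the example above
y = np.array([0.9, 5.2, 10.3, 19.5, 50.8])     # responses, invented

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
n = len(x)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))
Sxx = np.sum((x - x.mean()) ** 2)

for x0 in (5.0, 17.0, 40.0):
    se = s * np.sqrt(1.0 / n + (x0 - x.mean()) ** 2 / Sxx)   # SE of the fitted mean response at x0
    print(f"x0 = {x0:4.1f} ug/L  ->  SE of fitted line = {se:.4f}")
# The SE (and hence the confidence interval) is smallest near the mean of x (17.2 here),
# so a result read back near 17 ug/L is better determined than one near 40 ug/L.
```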

4) Some members point out that you can have a great R^2 but, at low concentrations, the quantitative result can be way off the true value. In my opinion, a good curve doesn't mean the best curve. If you have enough data and do a residual plot, you may find that one of the assumptions that ordinary least squares is based on is violated. Are the residuals normally distributed? Are they independent? Do they have a constant SD? If the answer to one of these questions is no, corrective action needs to be taken to address the problem. Maybe the curve is not linear (polynomial model), or the concentration range is too wide to use one model, and so on.
Also, the RSD% of the response factors is often used by chemists in the environmental field; usually what happens is that if a low point of the curve is out of whack, the RSD% will fail even when R^2 is 1.000.
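A small Python sketch of that behaviour with invented data: the low standard is deliberately far off, R^2 still looks excellent, but the RSD% of the response factors fails loudly.

```python
# One bad low standard: R^2 stays high while the response-factor RSD% fails. Data are invented.

import numpy as np

x = np.array([1.0, 5.0, 10.0, 20.0, 50.0])
y = np.array([1.8, 5.3, 10.2, 20.1, 50.3])     # low standard reads far too high (carryover, say)

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
r_squared = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

rf = y / x                                     # response factors
rsd = 100 * rf.std(ddof=1) / rf.mean()

print(f"R^2 = {r_squared:.4f}, RSD% of response factors = {rsd:.1f}%")
# Here R^2 stays above 0.999 although the 1 ug/L standard is badly off,
# while the response-factor RSD (around 30 % here) flags it immediately.
```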

Posted: Fri Apr 29, 2005 5:44 am
by bert
"If the equation is y=mx + b; m is not 1"
To Fernando:

I was talking about the comparison of the differences between the known true concentration and the measured concentration.....

A way to evaluate linearity. :wink:

Regards Bert

Posted: Fri Apr 29, 2005 6:30 am
by HW Mueller
Bert,
the data should be in J Chrom B, 678, 137 (1996). Sorry, no time at present for more.
Now, instead of doing all this statistical fudging, I later just cleaned the Rheodyne (taking it apart at times) or used a homemade injector which has no carryover.

Posted: Fri Apr 29, 2005 6:57 am
by bert
Hans,

thanks for the reference. I will try to get a copy of it. Being pragmatic can often provide good solutions. However, I think applying statistics can be very helpful to support conclusions based on analytical data. This is no fudging, if by that you mean 'vertuschen' (covering up). :wink:

Regards, Bert

Posted: Fri Apr 29, 2005 2:58 pm
by HW Mueller
More in the sense of durchwurschteln (muddling through).