by MikeD » Wed Jul 19, 2006 2:45 pm
For any given correlation coefficient the points might cluster around a straight line or a curve. To demonstrate linearity over the range of interest you must plot residuals against concentration, i.e. the distance from the points to the regression line, measured parallel to the detector-response axis. You need at least 10 points for the plot; some say 8, but that's pushing it.

Having got your plot, there are a couple of possibilities. Either the residuals are normally distributed around the zero line, in which case the calibration function is linear (but not necessarily with a value of r close to 1), or the residuals seem to form a curve, in which case you should repeat the calibration to check whether you get a similar curve. If you do, then you can do one of two things: you can either fit your calibration function to a polynomial, or you can declare that the calibration is linear within certain criteria that you must decide, based on some context (data quality objective, SOP requirement etc.) that only you can know about. I suppose you could use the same criteria for the polynomial fit.

It's important to realise that there is no single-number answer to the question: is this calibration linear?
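If it helps, here's a minimal sketch of that residual check in Python with NumPy. The concentration and response numbers are made up purely for illustration; in practice you'd use your own calibration standards and then plot residuals against concentration and look for a trend:

```python
# A minimal sketch of a residual check for a calibration line,
# assuming external-standard calibration (concentration x, response y).
# The data below are invented for illustration only.
import numpy as np

x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)  # concentration
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8,
              12.1, 14.0, 16.2, 17.9, 20.1])                # detector response

# Ordinary least-squares line y = a*x + b
a, b = np.polyfit(x, y, 1)

# Residuals: vertical distance from each point to the regression line,
# i.e. measured parallel to the detector-response axis
residuals = y - (a * x + b)

# Residuals from a sound linear fit should scatter around zero with no
# obvious trend against concentration (plot them to see the shape).
print(a, b)
print(residuals)
```

With real data you'd plot `residuals` against `x` (e.g. with matplotlib) and judge whether the scatter is random around zero or forms a curve.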
The standard error of the slope is only meaningful for a linear calibration function, and even then only when you want to estimate the uncertainty due to lack of fit.
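For what it's worth, the standard error of the slope comes straight out of the same least-squares fit, using the textbook formula s_b = s / sqrt(Sxx), where s is the residual standard deviation on n-2 degrees of freedom. Again, the numbers here are invented just to show the arithmetic:

```python
# A hedged sketch of the standard error of the slope from an OLS fit.
# Formula: se(slope) = s / sqrt(Sxx), with s the residual standard
# deviation on n-2 degrees of freedom. Data are invented for illustration.
import numpy as np

x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

n = len(x)
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# Residual standard deviation (n - 2 degrees of freedom for a line)
s = np.sqrt(np.sum(residuals**2) / (n - 2))

# Standard error of the slope
Sxx = np.sum((x - x.mean())**2)
se_slope = s / np.sqrt(Sxx)

print(slope, se_slope)
```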
I may have oversimplified this a bit, but then I am not a metrologist, I am a normal person!