1/x vs. 1/x^2 weighting

Discussions about GC-MS, LC-MS, LC-FTIR, and other "coupled" analytical techniques.

2 posts Page 1 of 1
To determine the appropriate linear regression weighting factor to employ (if any), I once saw an example where the natural log of instrument response was plotted against the natural log of standard concentration. The slope of this line was then used to decide among unweighted linear regression, regression with a 1/x weighting factor, and regression with a 1/x^2 weighting factor, according to the following criteria: slope < 0.25: unweighted; 0.25 < slope < 0.75: 1/x weighting factor; slope > 0.75: 1/x^2 weighting factor.
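A minimal sketch of that decision rule in Python. The function name is mine, and the handling of slopes exactly equal to 0.25 or 0.75 is an assumption, since the stated criteria leave those boundary cases open:

```python
def weighting_from_slope(slope):
    """Map the slope of ln(response) vs. ln(concentration) to a weighting
    scheme, using the criteria quoted above. Treating slopes of exactly
    0.25 or 0.75 as falling into the next bracket is an assumption."""
    if slope < 0.25:
        return "unweighted"
    elif slope < 0.75:
        return "1/x"
    else:
        return "1/x^2"

print(weighting_from_slope(0.5))  # -> 1/x
```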

It is understandable that when the slope < 0.25, the variance might be considered sufficiently homogeneous to justify unweighted linear regression, and that if the slope of the above-mentioned line were sufficiently large, 1/x^2 might be the appropriate weighting factor. However, I cannot understand why these specific slope ranges were chosen, or whether there is any mathematical justification for them. Can anyone provide any insight? Thanks in advance! :?:

Let me preface this with the disclaimer that I am not a statistician. I don't know the answer to your question, but here's what I can glean from reading the NIST statistics handbook online (http://www.itl.nist.gov/div898/handbook ... pmd452.htm):

Unweighted least squares implicitly assumes that the absolute errors are approximately the same across the entire data range.

When this condition is not met, there are two approaches to getting better estimates of the regression relationship:
- transform the data such that the (transformed) errors are equally distributed.
- weight the contribution of each level in inverse proportion to the estimated error at that level (i.e., give the more precise measurements higher weight).

Transformation is the preferred approach if the general distribution of errors is known.

In the case of chromatographic data, the usual assumption is that the relative error (%RSD) is approximately constant over the entire range. That means that absolute errors increase with peak size. Transforming the data to log-log should equalize the error distribution. To my reading, this seems to be the best way to go.
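To illustrate why the log-log transform equalizes the error distribution, here is a small simulation (the data are hypothetical, generated with a constant 5% RSD): the absolute SDs grow with level, while the SDs of the log-transformed responses come out roughly equal.

```python
import numpy as np

rng = np.random.default_rng(0)
conc = np.array([1.0, 10.0, 100.0])

# Simulate 1000 replicate responses per level with constant 5% RSD
# (hypothetical data: response assumed proportional to concentration).
reps = np.array([rng.normal(c, 0.05 * c, size=1000) for c in conc])

abs_sd = reps.std(axis=1)          # grows roughly in proportion to the level
log_sd = np.log(reps).std(axis=1)  # roughly constant (~0.05) across levels

print(abs_sd)
print(log_sd)
```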

Weighted least squares explicitly assumes that the error distribution at each level is known. This is not true for chromatographic data.

Here's where my understanding gets sketchy: If we assume that % error is constant, 1/x weighting is appropriate (in other words, the error at each level is divided by the level). I'm not sure when (if ever) a 1/x^2 weighting would be preferred (how non-linear does the response have to be? and what assumptions do you have to make for error distribution?).
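To make the weighting options concrete, here is a sketch of weighted least squares for a straight line, where each squared residual is multiplied by a weight: 1 for the unweighted fit, 1/x, or 1/x^2. The helper function and calibration data are hypothetical, not from the thread:

```python
import numpy as np

def weighted_linfit(x, y, w):
    """Fit y = a + b*x by minimizing sum(w * (y - a - b*x)**2).
    Returns (a, b). Closed-form weighted least squares for a line."""
    xm = np.average(x, weights=w)
    ym = np.average(y, weights=w)
    b = np.sum(w * (x - xm) * (y - ym)) / np.sum(w * (x - xm) ** 2)
    a = ym - b * xm
    return a, b

# Hypothetical calibration data.
x = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
y = np.array([2.2, 10.1, 20.5, 99.0, 203.0])

a0, b0 = weighted_linfit(x, y, np.ones_like(x))  # unweighted
a1, b1 = weighted_linfit(x, y, 1.0 / x)          # 1/x weighting
a2, b2 = weighted_linfit(x, y, 1.0 / x ** 2)     # 1/x^2 weighting
print(b0, b1, b2)
```

In general, the statistically optimal weight at each level is the reciprocal of the response variance at that level, which is why the assumed error structure (constant absolute error vs. constant relative error) determines the choice of weighting factor.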
-- Tom Jupille
LC Resources / Separation Science Associates
tjupille@lcresources.com
+ 1 (925) 297-5374
