Let me preface this with the disclaimer that I am not a statistician. I don't know the answer to your question, but here's what I can glean from reading the NIST statistics handbook on-line (http://www.itl.nist.gov/div898/handbook ... pmd452.htm):
Unweighted least squares implicitly assumes that the absolute errors are approximately the same across the entire data range.
When this condition is not met, there are two approaches to getting better estimates of the regression relationship:
- transform the data so that the errors of the transformed values are roughly the same size across the whole range.
- weight the contribution of each level in inverse proportion to the estimated variance at that level (i.e., give the more precise measurements higher weight).
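Here's a quick numerical sketch (mine, not from the handbook) of what it looks like when that "equal absolute errors" assumption is violated. The concentrations, the 5% RSD, and the slope of 1000 are all invented for illustration; I'm using Python/numpy just because it's handy:

    import numpy as np

    rng = np.random.default_rng(0)
    conc = np.array([1, 2, 5, 10, 20, 50, 100, 200, 500], dtype=float)
    # constant 5% relative error, so the absolute noise grows with peak size
    area = 1000.0 * conc * (1 + 0.05 * rng.standard_normal(conc.size))

    slope, intercept = np.polyfit(conc, area, 1)   # ordinary (unweighted) least squares
    resid = area - (slope * conc + intercept)
    print("residuals at the low end: ", resid[:3])
    print("residuals at the high end:", resid[-3:])  # far larger in absolute terms

The absolute residuals fan out as the peaks get bigger, which is exactly the condition plain least squares doesn't handle well.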
Transformation is the preferred approach if the general distribution of errors is known.
In the case of chromatographic data, the usual assumption is that the relative error (%RSD) is approximately constant over the entire range. That means that absolute errors increase with peak size. Transforming the data to log-log should equalize the error distribution. To my reading, this seems to be the best way to go.
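In case it helps, here's the same invented data from the sketch above run through a log-log fit; again, just an illustration of the idea, not a validated procedure:

    import numpy as np

    rng = np.random.default_rng(0)
    conc = np.array([1, 2, 5, 10, 20, 50, 100, 200, 500], dtype=float)
    area = 1000.0 * conc * (1 + 0.05 * rng.standard_normal(conc.size))  # ~5% RSD, invented

    # fit log(area) = b*log(conc) + log(k); constant relative error becomes
    # roughly constant absolute error in log space
    b, log_k = np.polyfit(np.log(conc), np.log(area), 1)
    print("log-log slope:", b)                 # close to 1 for a linear response
    print("response factor:", np.exp(log_k))   # close to the true 1000
    resid = np.log(area) - (b * np.log(conc) + log_k)
    print("log-space residuals:", resid)       # about the same size at every level

The residuals in log space come out roughly the same size at every level, so an unweighted fit of the transformed data is back on solid ground.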
Weighted least squares explicitly assumes that the error distribution at each level is known, which is generally not the case for chromatographic data.
Here's where my understanding gets sketchy: if the weights are taken in inverse proportion to the variance, then a constant % error (standard deviation proportional to the level) actually points to 1/x^2 weighting, since the variance then grows as the square of the level. A 1/x weighting corresponds to the variance itself being proportional to the level, which makes it a sort of compromise between no weighting and 1/x^2. I'm still not sure which assumption is the more realistic one for chromatographic data in practice, or how you would check it without running replicates at each level.
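To make the 1/x vs 1/x^2 question concrete, here's a small comparison on the same invented constant-%RSD data. The weights below multiply the squared residuals, which is how I understand the 1/x and 1/x^2 labels are usually meant in chromatography software; take it as a sketch, not gospel:

    import numpy as np

    rng = np.random.default_rng(0)
    conc = np.array([1, 2, 5, 10, 20, 50, 100, 200, 500], dtype=float)
    area = 1000.0 * conc * (1 + 0.05 * rng.standard_normal(conc.size))  # ~5% RSD, invented

    def wls_line(x, y, w):
        # weighted least-squares straight line: minimizes sum(w * (y - a*x - b)**2)
        X = np.column_stack([x, np.ones_like(x)])
        W = np.diag(w)
        slope, intercept = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return slope, intercept

    for label, w in [("unweighted", np.ones_like(conc)),
                     ("1/x",        1.0 / conc),
                     ("1/x^2",      1.0 / conc**2)]:
        slope, intercept = wls_line(conc, area, w)
        print(f"{label:>10}: slope = {slope:.1f}, intercept = {intercept:.2f}")

In data like this, the unweighted intercept tends to wander much further from zero than the 1/x^2 one, because the big peaks dominate the fit; 1/x usually lands somewhere in between.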