Noob here and first post.
Can you brainiacs please advise on how to determine the performance margin of error for a chromatographic instrument and method? I’d like to be able to format our in-house Reports of Analyses with something like “Mass Fraction: 1.33% +/- 0.02%”, but I’m too dense to figure it out.
This is in-process analytical chemistry for production formulations where the analyte’s target mass fraction is approximately known, and I dilute the unknowns accordingly.
From a 5-point calibration, I’ve used the LINEST array function to get the standard error of the instrument response, and from that I’ve back-calculated a plus-or-minus analyte concentration that ostensibly would be the output error of subsequent assays. A rough sketch of what I’m doing is below.
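In case the spreadsheet description is too vague, here is roughly the same thing written out in Python instead of LINEST. The calibration numbers are made-up placeholders, not my real data; the point is just which quantities I’m pulling out of the fit.
[code]
import numpy as np

# Hypothetical 5-point calibration: concentration (ppm) vs. peak area (counts).
# These values are placeholders, not my real data.
conc = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
area = np.array([370_000, 735_000, 1_090_000, 1_450_000, 1_805_000])

# Ordinary least-squares line, same idea as LINEST: y = m*x + b
m, b = np.polyfit(conc, area, 1)

# Residual standard error of y, i.e. LINEST's "standard error of the y estimate"
resid = area - (m * conc + b)
sey = np.sqrt(np.sum(resid**2) / (len(conc) - 2))

print(f"slope m = {m:.0f}, intercept b = {b:.0f}, std error of y = {sey:.0f}")
[/code]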
But the standard error output is so small relative to the Y-intercept that the back-calculation returns a negative value:
Std err: 7,814
Slope: 7,172
Y-int: 15,530
y = mx + b
x = (y - b) / m
x = (7,814 - 15,530) / 7,172
x = -1.075
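In code, the step that goes negative is just this (using the LINEST outputs above; I’m plugging the standard error in as if it were an instrument response, which may well be where I’m going wrong):
[code]
# My current back-calculation: treat the std error of y as if it were a
# measured response and push it through the inverted calibration line.
sey = 7_814.0   # LINEST std error of y
m = 7_172.0     # slope
b = 15_530.0    # intercept

x_err = (sey - b) / m    # (7,814 - 15,530) / 7,172
print(x_err)             # about -1.08 ppm, negative because sey < b
[/code]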
Am I supposed to take the absolute value of this? If so, this figure is the margin of error only at the calibration concentration of 150 ppm. To produce a margin of error for assays of unknowns, do I need to scale up by the dilution factor between them?
But then how do I translate this value into a +/- % of the unknown concentration to put on the report? When I scale it all up, the margin of error for every assay, no matter what the original concentration, comes out identical (sketch of what I mean below). Is that right? Surely the margin should vary depending on how close the unknown is to the calibration concentration of 150 ppm.
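Here’s the scaling step as I’m doing it, to show what I mean by “identical”. The dilution factors are made-up examples; since every unknown gets diluted to roughly the 150 ppm calibration level, the percentage comes out the same every time:
[code]
# Scale the +/- at the instrument back up to the undiluted sample.
# Dilution factors here are hypothetical examples.
x_err_at_instrument = 1.08   # ppm, |value| from the calculation above
conc_at_instrument = 150.0   # ppm, every diluted unknown is aimed at this

for dilution_factor in (100, 250, 500):
    margin_in_sample = x_err_at_instrument * dilution_factor
    conc_in_sample = conc_at_instrument * dilution_factor
    pct = 100 * margin_in_sample / conc_in_sample
    print(f"1:{dilution_factor} -> +/- {margin_in_sample:.0f} ppm on "
          f"{conc_in_sample:.0f} ppm = +/- {pct:.2f}%")
[/code]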
I feel like I’m overlooking some fundamental identity here.
Please help; I am absolutely baffled.