I have a small issue I would be grateful for help with.
I use a calibration standard that is supposed to be 100mg/L. I account for standard purity from the batch certificate of analysis, so I can report 'true' concentrations of my analytes.
E.g. if my standard purity is actually 98mg/L from the CoA and my sample gives me a result of 1mg/L from my calibration curve, I do:
(1/100)*98 = 0.98mg/L.
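In Python terms, the correction I apply looks like this (just a scratch sketch; the function name is made up and the numbers are only the example above):

```python
def purity_corrected(curve_result_mg_l, nominal_mg_l=100.0, coa_mg_l=98.0):
    """Scale a result read from the calibration curve by the ratio of the
    calibration standard's CoA value to its nominal value."""
    return curve_result_mg_l * (coa_mg_l / nominal_mg_l)

# The example above: 1 mg/L from the curve -> 0.98 mg/L 'true' concentration
print(purity_corrected(1.0))  # 0.98
```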
I routinely run one of my calibration points (5mg/L) as a check throughout sequences to ensure my system is working properly (90-110% tolerance). I have recently been asked to run a sample preparation check at 5mg/L using a second 100mg/L reference standard of different origin. My calibration standards are bought in pre-derivatised, but this second standard will be derivatised in-house.
If this second standard batch has a purity of, say, 101mg/L, I am worried that the acceptance check may fail on paper due to the difference in purity, when in reality it is within specification.
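To put rough numbers on that worry (only a sketch, assuming a perfect measurement and no loss during the in-house derivatisation):

```python
# Sketch of the scenario with the numbers above (no analytical error assumed)
nominal_check = 5.0               # mg/L, nominal concentration of the prep check
check_std_purity = 101.0 / 100.0  # second standard: CoA 101 mg/L vs 100 mg/L nominal
cal_std_purity = 98.0 / 100.0     # calibration standard: CoA 98 mg/L vs 100 mg/L nominal

true_check = nominal_check * check_std_purity  # 5.05 mg/L actually prepared
curve_reading = true_check / cal_std_purity    # ~5.15 mg/L read from the curve
corrected = curve_reading * cal_std_purity     # 5.05 mg/L after my usual correction

print(f"curve reading:    {curve_reading:.2f} mg/L "
      f"({100 * curve_reading / nominal_check:.1f}% of nominal)")
print(f"after correction: {corrected:.2f} mg/L "
      f"({100 * corrected / nominal_check:.1f}% of nominal)")
```

So even with a perfect run the check would sit a percent or three high against its nominal 5mg/L purely because of the purity difference, and that offset is what I want to account for properly.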
My question is - how do I account for the 101mg/L purity compared to the calibration standard purity of 98mg/L?
Thank you so much for any help.