by
lmh » Tue Jul 07, 2020 5:44 pm
I'm not sure you need a paper. If you split the job into two separate parts and imagine two separate people doing it:
Person 1 takes a sample, dissolves and dilutes it. They give it to...
...You, Person 2, who analyses it using the well-known approach of standard addition. You don't need to worry about the sample before it reached you, just that you have your method validated and functional for a sample in the correct matrix, at the correct expected range of concentrations etc.; frequently an analyst won't actually know the details of how the sample was taken.
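For what it's worth, the arithmetic Person 2 does is simple: fit signal versus added concentration and extrapolate to zero signal. A minimal sketch, with made-up illustrative numbers (the concentrations and responses below are hypothetical, not from any real sample):

```python
# Standard addition: equal aliquots of the sample are spiked with
# known amounts of analyte, and the instrument response is measured.
added = [0.0, 1.0, 2.0, 3.0]       # added concentration (e.g. mg/L), hypothetical
signal = [0.20, 0.40, 0.60, 0.80]  # instrument response, hypothetical

# Ordinary least-squares fit: signal = slope * added + intercept
n = len(added)
mean_x = sum(added) / n
mean_y = sum(signal) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(added, signal))
         / sum((x - mean_x) ** 2 for x in added))
intercept = mean_y - slope * mean_x

# The analyte concentration in the (unspiked) aliquot is the magnitude
# of the x-intercept, i.e. where the fitted line crosses signal = 0.
c_unknown = intercept / slope
print(round(c_unknown, 3))  # 1.0 for the data above
```

The result is the concentration in the diluted aliquot Person 2 received; scaling back to the original solid is Person 1's dilution bookkeeping.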
Provided Person 1 has sufficiently good evidence that their dissolution and dilution are reliable, then the whole method, end-to-end, is sound.
Given that you're starting with a lot of solid and the analyte is expected at 1%, you'd physically see if anything went wrong during dissolution (left-over solid). If anything went wrong in dilution, you'd also see it at that sort of level (material sticking to surfaces, or precipitating). The sort of thing that frightens me in dilutions isn't the 1% sample; it's the 1 pM solution that manages to find an active site on a piece of glassware and disappears completely.
We have to be able to believe in our ability to dissolve and dilute a solid, otherwise we wouldn't be able to quantify anything that's sold only in solid form, because we couldn't even trust ourselves to make the calibration standard.