- Posts: 1
- Joined: Tue Jun 07, 2005 5:30 pm
In cases where the relative response factor (RRF) is sensitive to external factors, do we simply challenge the RRF during method robustness testing (and during development, of course!), or is there a way to incorporate RRF determination into the method validation itself? And what happens if we do find that a particular impurity has an RRF that is sensitive to these small changes?
FYI: an example of this problem came up during a method validation in which a particular impurity exhibited a change in RRF approaching 50% between two detectors with lamps of different ages. It is unlikely that lamp age caused this in and of itself; rather, a small shift in the actual detection wavelength (λ) resulting from the lamp age could have been responsible. Regardless of the actual cause, we need to be confident that the RRF isn't going to change significantly from run to run.
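For anyone following along, RRF is commonly taken as the ratio of calibration-curve slopes: the impurity's detector response per unit concentration divided by the API's. A minimal Python sketch with made-up example data (all concentrations and peak areas below are hypothetical, just to show the arithmetic):

```python
# Hypothetical illustration: RRF as the ratio of detector-response slopes.
# All calibration data below are invented for demonstration only.

def slope_through_origin(concs, areas):
    """Least-squares slope forced through zero: sum(x*y) / sum(x*x)."""
    return sum(c * a for c, a in zip(concs, areas)) / sum(c * c for c in concs)

# Calibration levels (e.g. ug/mL) and peak areas for the API and an impurity
api_concs = [10.0, 20.0, 40.0, 80.0]
api_areas = [1520.0, 3050.0, 6080.0, 12190.0]
imp_concs = [0.5, 1.0, 2.0, 4.0]
imp_areas = [60.0, 121.0, 243.0, 486.0]

api_slope = slope_through_origin(api_concs, api_areas)
imp_slope = slope_through_origin(imp_concs, imp_areas)

# RRF of the impurity relative to the API. A large shift between runs or
# detectors (like the ~50% case above) is what signals a sensitivity problem.
rrf = imp_slope / api_slope
print(round(rrf, 3))  # → 0.797
```

Recomputing this ratio under deliberately varied conditions (different detectors, lamp ages, small wavelength offsets) during robustness testing is one way to quantify how fragile the RRF actually is.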
I have found some helpful information posted online from a method development and validation performed by BAS Northwest Laboratory (we know we can trust their work!), but it still doesn't directly answer this question. This issue doesn't seem to get discussed much, yet it is quite important in analytical chemistry. Please share any insights you have!
