-
- Posts: 292
- Joined: Wed Jan 19, 2005 2:20 pm
We are debating internally at my company whether, when determining relative response factors (RRFs) by HPLC, the analytes necessarily need to be at the same concentration levels on the linearity plots (the RRFs are being calculated from the slopes of the linearity plots).
For instance, is it acceptable to compare the slope for the main analyte peak over, say, 10 to 150% of nominal loading with the impurity linearity slope over 0.1 to 1.0% (assuming 4-5 data points across each range)? Or is it preferable to run the linearity plots at equal concentrations, i.e. both main peak and impurity at 0.1 to 1.0%?
Statistically/mathematically, which is the preferred approach: equal concentrations for the main and impurity peaks, or does it not matter?
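To make the question concrete, here is a minimal sketch of the slope-ratio calculation being discussed, using entirely hypothetical concentration/area data (main analyte at 10-150% of nominal, impurity at 0.1-1.0%). The RRF is taken as the ratio of the two least-squares slopes:

```python
import numpy as np

# Hypothetical linearity data: concentration (% of nominal) vs. peak area.
# Main analyte run at 10-150% of nominal loading.
main_conc = np.array([10.0, 50.0, 100.0, 125.0, 150.0])
main_area = np.array([205.0, 1010.0, 2004.0, 2498.0, 3006.0])

# Impurity run at 0.1-1.0% of nominal loading.
imp_conc = np.array([0.1, 0.25, 0.5, 0.75, 1.0])
imp_area = np.array([1.6, 4.1, 8.0, 12.2, 16.1])

# Least-squares slope of each linearity plot (area per unit concentration).
main_slope = np.polyfit(main_conc, main_area, 1)[0]
imp_slope = np.polyfit(imp_conc, imp_area, 1)[0]

# RRF = impurity response per unit concentration relative to the main analyte.
rrf = imp_slope / main_slope
print(f"main slope = {main_slope:.2f}, impurity slope = {imp_slope:.2f}, RRF = {rrf:.3f}")
```

Note that the slope ratio is only meaningful if both responses are linear through their respective ranges and the intercepts are negligible; whether the two ranges must also coincide is exactly the question above.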
