We have developed and validated a method in which impurities are calculated with the usual formula:
%imp = (Atest / Aref) * limit
Comparing the percentage calculated this way for an unknown impurity at a specific RRT with the %area reported in the chromatogram shows a very large difference. Is this normal?
For example:
impurity calculated with the method formula: 0.015%
impurity as reported in the chromatogram (%area): 0.22%!
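To make the comparison concrete, here is a minimal Python sketch with hypothetical peak areas chosen only to reproduce the numbers above; Atest, Aref, and the total area are assumptions, not values from our data. It just illustrates that the formula normalizes against the limit-level reference injection, while %area normalizes against the total integrated area of the test chromatogram:

```python
# Hypothetical peak areas chosen to reproduce the numbers in the post;
# none of these values come from the actual data.

A_test = 1_500.0    # area of the unknown impurity peak in the test chromatogram
A_ref  = 10_000.0   # area of the reference solution injected at the limit level
limit  = 0.10       # specification limit, in %

# Method formula: result is expressed relative to the limit-level reference
pct_formula = (A_test / A_ref) * limit
print(f"formula result: {pct_formula:.3f} %")   # -> 0.015 %

# %area as reported by the data system: relative to the total integrated
# area of ALL peaks in the test chromatogram (dominated by the main peak)
A_total = 680_000.0
pct_area = 100.0 * A_test / A_total
print(f"%area:          {pct_area:.2f} %")      # -> 0.22 %
```

Because the two results are normalized against different quantities, they only agree when the response factors and the dilution scheme make the two denominators equivalent.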
Forced degradation under acidic conditions also produced a possible degradation product at the same RRT.
Is it possible that the formula "hides" impurities/degradation products? That would mean problems during stability studies.
I mean, maybe the detector's response is not accurate at the limit concentration. Note that linearity was acceptable during validation.
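A quick way to sanity-check that hypothesis, assuming the slope and intercept from the validation linearity curve are at hand, might look like the sketch below; every number in it is a placeholder, not a value from the method:

```python
# Hypothetical spot-check of the detector response at the limit level.
# slope/intercept would come from the validation linearity curve.

slope, intercept = 9.9e6, 150.0   # area per mg/mL and intercept (placeholders)
c_limit = 0.0010                  # limit-level concentration in mg/mL (placeholder)
A_measured = 10_000.0             # area observed for a limit-level injection

A_predicted = slope * c_limit + intercept
recovery = 100.0 * A_measured / A_predicted
print(f"response at limit level: {recovery:.1f} % of the curve prediction")
```

If that recovery is far from 100%, the response at the limit level does not match the calibration curve, even though the curve itself passed linearity criteria.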
Thank you for your help!
