by
ntruong » Tue Apr 18, 2006 12:41 pm
My only source of the impurity is in the degraded sample itself (since I do not have an authentic standard).
Since you cannot synthesize or identify this impurity, treat it as the standard. I mean you have to assume the absorbance of this impurity is the same as the standard's, so the RRF for this impurity is 1 with respect to the known standard. I do it all the time, and this is acceptable practice.
If I do serially dilute this sample, I can plot the main peak area versus concentration, since I can quantitate it against a standard. However, I can't plot area vs. concentration for the impurity, since I don't know how much is in the sample to begin with. I can't use the main peak either, because I suspect the impurity has a different extinction coefficient, based on my lack of mass balance for the degraded sample.
Make a serial dilution of your standard as you would if it were the impurity. I don't know the working concentration of your standard, but I would make it very concentrated in order to show the impurity. Let's assume your working standard is at 1 mg/mL (100%); you would typically prepare linearity solutions at 120%, 100%, 10%, 1%, 0.1%, 0.05%, 0.02% and 0.01% (you have to go down this low since the ICH guideline threshold for impurities is 0.1%, if I remember right; I have to look it up). You also need to determine the LOD and LOQ of this standard to show that you are able to detect it at a very low level.
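To make the dilution arithmetic concrete, here is a minimal sketch, assuming the 1 mg/mL working standard mentioned above. The sigma and slope values in the LOD/LOQ part are made-up numbers purely to show the calculation; the 3.3σ/S and 10σ/S formulas are the standard ICH Q2-style estimates from a calibration slope and the standard deviation of the response.

```python
# Sketch only: the stock concentration matches the 1 mg/mL example above,
# but the sigma/slope numbers below are hypothetical, not measured values.
stock_mg_per_ml = 1.0
levels_pct = [120, 100, 10, 1, 0.1, 0.05, 0.02, 0.01]

# Concentration of each linearity solution, as a fraction of the stock.
series = {p: stock_mg_per_ml * p / 100.0 for p in levels_pct}
for pct, conc in series.items():
    print(f"{pct:>6}%  ->  {conc:.4f} mg/mL")

def lod_loq(sigma, slope):
    """ICH Q2-style estimates: sigma is the standard deviation of the
    response (e.g. residual SD of the calibration regression), slope is
    the calibration slope. Returns (LOD, LOQ) in concentration units."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration numbers, just to illustrate the formula:
lod, loq = lod_loq(sigma=0.5, slope=1200.0)  # area units; area/(mg/mL)
print(f"LOD ~ {lod:.5f} mg/mL, LOQ ~ {loq:.5f} mg/mL")
```

With these made-up numbers the estimated LOQ lands near the 0.04% level relative to a 1 mg/mL working standard, which is why the series above has to extend well below 0.1%.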
Good luck,