I work in a cosmetic/pharmaceutical laboratory for a contract manufacturing company. I inherited the method validation protocols, and the FDA is currently here doing an investigation. We largely follow ICH guidelines and cover the typical parameters: LOD/LOQ, linearity, accuracy, precision, ruggedness, repeatability, and selectivity/specificity.

A question arose about our stress testing / forced degradation. In summary, our protocol reads: stress the standard, sample, and placebo in acid, base, oxidation (peroxide), UV light, and heat, then compare the peak areas of each stressed sample to the control. A difference of less than 5 % is considered acceptable (I do understand we should be shooting for 10 - 20 % degradation, and that may have been his first issue).

Our problem is that we only have UV detectors on our LCs. I understand that with a PDA detector it would be very easy to do the purity comparisons and ratios. My question is whether anybody has suggestions on how to go about this with only a UV detector. The lead chemist for the FDA mentioned it is 'accepted' to analyze at two wavelengths and do comparisons. But in response I asked how that would still assure me there is no interference at my wavelength of interest. At this point any suggestions would help. All of our upper management looks like this right now....
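For what it's worth, here is a minimal sketch of the arithmetic behind the two-wavelength approach the FDA chemist described: take the peak area at each of two wavelengths, form the ratio, and check whether the stressed sample's ratio stays close to the control's. If a degradant co-elutes under the main peak, its different spectrum will usually shift the ratio. All numbers, the 254/280 nm choice, and the 2 % ratio tolerance below are made-up assumptions for illustration, not values from any guideline.

```python
# Hypothetical two-wavelength peak-purity check (e.g. areas at 254 nm and 280 nm).
# Wavelengths, areas, and the 2 % relative tolerance are assumptions for illustration.

def area_ratio(area_wl1: float, area_wl2: float) -> float:
    """Peak-area ratio between the two detection wavelengths."""
    return area_wl1 / area_wl2

def ratios_match(control: tuple[float, float],
                 stressed: tuple[float, float],
                 tol: float = 0.02) -> bool:
    """True if the stressed peak's wavelength ratio stays within `tol`
    (relative) of the control's ratio, i.e. no obvious sign of a
    co-eluting species absorbing differently at the two wavelengths."""
    r_ctrl = area_ratio(*control)
    r_strs = area_ratio(*stressed)
    return abs(r_strs - r_ctrl) / r_ctrl <= tol

# Made-up example: ~15 % area loss after acid stress, but the ratio is preserved,
# so the main peak still looks spectrally homogeneous at these two wavelengths.
control  = (1520.0, 610.0)   # unstressed standard: (area @ 254 nm, area @ 280 nm)
stressed = (1290.0, 518.0)   # after acid stress

print(ratios_match(control, stressed))  # -> True
```

Note the obvious caveat, which is really the point of the original question: if an interferent happens to have nearly the same absorbance ratio at the two chosen wavelengths, this check will not catch it, so the choice of wavelengths matters.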
Thank You For Any Quick Suggestions!
Regards
Brian
