Reporting of impurities during developmental stage
Posted: Wed Nov 29, 2006 7:14 am
by Sunjay
Dear All,
Is it a good practice to report the impurities as per reporting threshold (as per ICH guidelines) during the developmental stages of a drug product (For ANDA or NDA) or the impurities should be reported as per LOQ?
I think it will be good practice not to follow the ICH guideline during the Research & Development phase of stability analysis, and better to report as per the LOQ of the method.
What are your suggestions or experiences?
Please share... by sharing we can learn more.
Regards
Posted: Wed Nov 29, 2006 8:50 am
by danko
Hi Sunjay
The LOQ is a characteristic of the method of analysis, and as such it has nothing to do with the quality of the product. It should be seen as a tool for quantifying and documenting the impurities one can encounter in the product.
A good practice is to identify and characterize impurities (i.e. molecular structure, potency, toxicity and so on) exceeding 0.1 % of the active substance.
Some people tend to postpone this step in the R&D phase, but in my opinion it is a bad practice, because much of this information might be needed in order to develop a good, stable product/formulation, which will facilitate the actual registration.
Posted: Wed Nov 29, 2006 9:56 am
by PJ8
Ideally you want to characterise major impurities as early as you can. Obviously there's a balancing act: you don't want to throw loads of work at a project that then dies, but you want to identify your key large recurring impurities early on.
If you know what an impurity is, it is easier to keep track of. Also, if you know what it is, it's more likely you can hypothesise on how it is formed, and ultimately you have more chance of controlling it and minimising its presence in the drug substance.
Think DFM folks!

Posted: Wed Nov 29, 2006 7:51 pm
by Bruce Hamilton
It depends on the environment, and different companies have their own guides for different stages. I doubt any company routinely considers the possibility of product failure; they are usually optimistic.
If the product manufacturing process is still undergoing development, the only rationale for allocating resources to impurity characterisation concerns effects on bioactivity or chemical property work.
In general, the most important aspect of an early-phase product is consistent product quality, not a fully characterised product profile. Obviously major impurities (e.g. > 0.1%) should be monitored, but perhaps not characterised.
The major effort at characterisation usually occurs once the manufacturing process has been defined; then the need for full impurity profiles (including reference standards for impurities) increases greatly. It always pays to interact with the process development people to ensure resources are appropriate.
An example is the use of reverse-phase HPLC to highly purify mg quantities at the start of development, replaced by normal-phase Biotage for gram quantities, replaced by other techniques (e.g. crystallisation, ion exchange) for kilo quantities. The final product impurity profile may change with the process.
Bruce Hamilton
Posted: Fri Dec 01, 2006 7:19 am
by jzt
Sunjay,
I think this depends on your "LOQ of the method". I assume you mean that the LOQ is less than the reporting threshold of 0.05%.
Do you have a real LOQ for that specific impurity defined in the method? A high-purity standard of that impurity is normally required to determine the true response and LOQ, and such a standard is hard to come by during early development. If you assume the impurity has the same response factor as the drug substance, you introduce more uncertainty here. In addition, isn't it true that the LOQ varies depending on the HPLC you used, the age of the UV lamp, the noise level of that particular run, etc.?
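The dependence of the LOQ on noise that jzt mentions is what the standard ICH Q2(R1) signal-to-noise estimates capture. As a minimal sketch (the sigma and slope values below are made-up numbers for illustration, not from any real method):

```python
def detection_limits(sigma: float, slope: float) -> tuple[float, float]:
    """Return (LOD, LOQ) using the ICH Q2(R1) estimates.

    sigma: standard deviation of the baseline noise (response units)
    slope: calibration slope (response units per % impurity)
    """
    lod = 3.3 * sigma / slope   # ICH Q2(R1): LOD ~ 3.3 * sigma / S
    loq = 10.0 * sigma / slope  # ICH Q2(R1): LOQ ~ 10 * sigma / S
    return lod, loq

# Hypothetical example: noisier baseline (older lamp, different HPLC)
# directly raises both limits.
lod, loq = detection_limits(sigma=0.15, slope=100.0)
print(f"LOD = {lod:.4f}%  LOQ = {loq:.4f}%")
```

Since sigma changes with the instrument, lamp age, and the particular run, the effective LOQ is a moving target unless it is re-verified for the method in use.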
If your intention is to keep track of impurities at low levels during a stability study, you may put such impurities into different classes: report an actual number for those above the reporting threshold; report those detected by the method (> LOQ) but below the reporting threshold as "< reporting limit" (e.g., < 0.05%); and report those not detected by the method (< LOQ) as "not detected". The "< 0.05%" class will give you a warning of a potential degradant.
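The three-class scheme described above can be sketched as a small lookup. The specific thresholds here (LOQ of 0.02%, reporting threshold of 0.05%) are illustrative assumptions, not values from the post:

```python
LOQ = 0.02                  # assumed method quantitation limit, % of API
REPORTING_THRESHOLD = 0.05  # assumed ICH reporting threshold, % of API

def report_impurity(level: float) -> str:
    """Map a measured impurity level to the value that gets reported."""
    if level < LOQ:
        return "not detected"                # below method capability
    if level < REPORTING_THRESHOLD:
        return f"< {REPORTING_THRESHOLD}%"   # early warning of a degradant
    return f"{level:.2f}%"                   # actual number above threshold

for x in (0.01, 0.03, 0.12):
    print(x, "->", report_impurity(x))
# 0.01 -> not detected, 0.03 -> < 0.05%, 0.12 -> 0.12%
```

The middle class is the useful one for trending: it records that something is forming without committing to a number the method cannot reliably quantify.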
What is your company's current policy on this? would you like to share?
Posted: Wed Dec 13, 2006 8:21 pm
by gtma
It depends on the stage of the project and whether you ask a statistician or a QC person. At the formulation development stage, I would recommend you have a sensitivity solution at the LOD and report impurities at or above the LOD so you can trend data more accurately (assuming the LOD is around 0.1 to 0.2%). Otherwise, I would propose reporting at the LOQ (assuming it is lower than the reporting threshold) for data trending. One reason I don't like the LOQ reporting strategy for formulation development is the situation where several new impurities below the LOQ form over time and would not be captured if the LOQ reporting strategy is used. For the commercial stage, the LOQ or reporting threshold should be sufficient for trending stability data.
Posted: Thu Dec 14, 2006 1:27 pm
by Yorkie31
gtma wrote: "(assuming the LOD is around 0.1 to 0.2%)"
I really would be looking for an LOD around 10 times less than these figures. Typically we would operate with an LOD of 0.02% and an LOQ of 0.05%.
Anything below the LOD is reported as "Not Detected".
You also need to consider what you are looking at. Are they process impurities, which in reality should not change, or are they degradants, in which case you may expect them to increase?
My thought is the more data you can get earlier on, the better.