Identifying impurities using Relative Retention Time (RRT)

Posted: Tue Jun 27, 2006 7:02 am
by michaelcarolus
Hi everyone

When identifying peaks using RRT, by how much can the RRT of the impurity peak vary from the stated RRT in the monograph? For example, the USP gives an RRT for an impurity of 0.4. Does that mean the RRT can vary by ±5% or by ±10%? In this example, should I use a range of 0.38 - 0.42 or a range of 0.36 - 0.44?
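The arithmetic behind the two candidate windows can be sketched as follows (a minimal illustration, not part of any USP requirement; the function name is my own):

```python
def rrt_window(rrt: float, tolerance: float) -> tuple[float, float]:
    """Acceptance window around a nominal relative retention time.

    tolerance is a fraction, e.g. 0.05 for ±5%.
    """
    return (rrt * (1 - tolerance), rrt * (1 + tolerance))

# For a monograph RRT of 0.4:
#   ±5%  gives roughly 0.38 to 0.42
#   ±10% gives roughly 0.36 to 0.44
low, high = rrt_window(0.4, 0.05)
print(round(low, 2), round(high, 2))
```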

Thanking you in advance.

Mike

Posted: Wed Jun 28, 2006 7:29 am
by Peps
Hi Mike

This is a very interesting question... At our lab we don't "trust" the USP values. We always try to obtain all the impurities and run them ourselves to establish the RRTs on our system (even for USP methods we always perform some kind of validation/verification...). Similar columns from different brands usually behave quite differently... Anyway, once we have determined the RRTs on our system, we usually don't allow more than ±5%.

/Peps

Posted: Wed Jun 28, 2006 10:33 pm
by Uwe Neue
I do not know what is stated in the particular USP method that you are referring to. In all the methods that I am familiar with, the resolution between different peaks (most often between the analyte and the internal standard) is specified only vaguely, e.g. "must be larger than..." or "is approximately...".

Posted: Fri Jun 30, 2006 9:37 pm
by ccyting
The USP only specifies the general type of column used in the monograph. I agree that we all have to establish our own RRTs using the standards. Once established, the RRT range is very reproducible from run to run. You can establish the range based on your validation data. I agree that ±5% is a good start. Good luck.
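Checking an observed peak against an in-house established RRT with the ±5% window suggested above could be sketched like this (function names and the example retention times are hypothetical, for illustration only):

```python
def relative_retention_time(rt_peak: float, rt_reference: float) -> float:
    """RRT of a peak relative to the main/reference peak."""
    return rt_peak / rt_reference

def within_window(observed_rrt: float, established_rrt: float,
                  tolerance: float = 0.05) -> bool:
    """True if the observed RRT falls within ±tolerance of the
    RRT established during validation (default ±5%)."""
    return abs(observed_rrt - established_rrt) <= established_rrt * tolerance

# Hypothetical example: impurity elutes at 3.2 min, main peak at 8.0 min.
rrt = relative_retention_time(3.2, 8.0)
print(within_window(rrt, 0.40))
```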

Posted: Mon Jul 03, 2006 6:10 am
by michaelcarolus
Hi

Thanks for all the suggestions. We inject all known impurities during validation to determine their RRTs. We will be using an RRT range of ±5% based on the retention times established during the validation.

Thanks again

Mike