Is there an accepted RT variation norm? Please indicate.

Posted: Tue Sep 08, 2009 11:25 am
by Mayank72
I see many in-house SOPs which mention 10% variation in RT as acceptable... I wonder what happens when the run times are long (imagine a 100 min run).

Is there a max acceptable limit for variations in RT?

Added (09.09.09): Let me try to rephrase my question to elicit a response!

During analytical development, column lot-to-lot variation is addressed by ruggedness testing. My question is: what is an acceptable % variation between columns/column lots, and is there any set norm? Please advise. Thanks! :)

Posted: Tue Sep 08, 2009 12:19 pm
by adeputy
The analytical test procedures that I am familiar with use an RSD of 2.0% for the retention time of the main peak.
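A %RSD criterion like this is straightforward to check from replicate injections. A minimal sketch, using illustrative retention times (the values below are hypothetical, not from this thread):

```python
import statistics

# Hypothetical retention times (min) of the main peak over six
# replicate injections in a system suitability test.
rts = [10.02, 10.05, 9.98, 10.01, 10.04, 9.99]

mean_rt = statistics.mean(rts)
# Sample standard deviation (n-1 denominator), the usual choice for SST.
rsd_pct = statistics.stdev(rts) / mean_rt * 100

print(f"mean RT = {mean_rt:.3f} min, %RSD = {rsd_pct:.2f}%")
# These illustrative values give a %RSD of about 0.27, well inside
# a 2.0% acceptance criterion.
```

The 2.0% limit itself is an acceptance criterion taken from the post above; your own SOP may set a different value.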

Posted: Tue Sep 08, 2009 3:19 pm
by Eric Moore
I believe the question is a matter of validation. Before you state how much variability is acceptable, you should demonstrate that your peak can move a certain percentage and still give acceptable accuracy, etc. This would be done as part of ruggedness/robustness, I'd say.

It also depends on whether you are talking about an impurity/related substances method, some other type of purity method, or a content method. I don't think there's a percentage that's acceptable for all method types and situations.

I had an early stage method that we didn't do full validation on, and we just looked up how much our standard peaks moved during a series of runs while still maintaining system suitability, and gave that value as the RT range.

Just my $0.02.

EM

Posted: Tue Sep 08, 2009 5:17 pm
by grzesiek
Are you talking about the width of the RT window, or about RT precision (SST)?

If the first, then "I believe the question is a matter of validation. Before you state how much variability is acceptable, you should demonstrate that your peak can move a certain percentage and still give acceptable accuracy, etc. This would be done as part of ruggedness/robustness, I'd say." is your answer.

If the second, then I must say you would have difficulty finding such a system (after qualification is done); see your system's qualification documents under system precision.

Posted: Wed Sep 09, 2009 5:53 am
by krickos
Hi

As already touched on, I agree that as a general analytical method requirement it is a poor tool; more traditional system suitability tests/calibration is generally better. A 5-10% RT difference in a lengthy impurity method may or may not have a critical impact on the resolving power of the method. However, as a sanity check that the RT is about where it should be, it is not wrong.

If we talk about instrument qualification/periodic control, then it could be a completely different matter. Provided you use a specific column dedicated to only this purpose, I have seen requirements for RT differences such as not more than 2%.

Posted: Mon Sep 14, 2009 1:48 am
by mohan_2008
This is a good question.

It all depends on the simplicity and ruggedness of your method.

1. If only a single analyte elutes in a method (no other eluting peaks), setting an RT limit does not make any sense, since even if the analyte peak shifts, it will not overlap or co-elute with other peaks (as there are no other peaks), as long as the peak elutes within the set run time.

2. There can be issues of mobile phase evaporation (especially with low organic modifiers, say 4%) that can shift your peaks drastically (especially if your analyte retention is sensitive to the organic modifier).

3. RT limits can be set when other peaks elute in the run. This is to ensure that the main (analyte) peak does not shift to overlap or co-elute with other peaks, or vice versa.

4. For some very specific methods, such as ion-pair or normal-phase methods, setting up RT limits is routine practice, but for reversed-phase it is rare that anybody sets RT limits. Again, it depends on the application.

5. But prior to setting up an RT limit, appropriate robustness testing must be done so other analysts do not run into system suitability problems.
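Once robustness data have established how far the peak can move, an RT limit of the kind described in points 3-5 amounts to a simple window check. A hypothetical sketch (the function name, nominal RT, and 5% window below are made up for illustration; the window would come from your own robustness study):

```python
def rt_within_window(observed_rt: float, nominal_rt: float,
                     allowed_pct: float) -> bool:
    """Return True if observed_rt is within +/- allowed_pct of nominal_rt."""
    shift_pct = abs(observed_rt - nominal_rt) / nominal_rt * 100
    return shift_pct <= allowed_pct

# Example: nominal RT of 12.5 min with a 5% window set from robustness data.
print(rt_within_window(12.9, 12.5, 5.0))  # 3.2% shift -> True
print(rt_within_window(13.3, 12.5, 5.0))  # 6.4% shift -> False
```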

Posted: Wed Sep 16, 2009 7:38 pm
by JGK
The analytical test procedures that I am familiar with use an RSD of 2.0% for the retention time of the main peak.
Same here.

The only time I've seen serious RT variations was when I ran a method with the column at room temperature in a non-air-conditioned lab overnight during an English summer.

RTs varied considerably as the lab cooled overnight and slowly returned as the temperature rose the next day.

Posted: Thu Sep 17, 2009 4:38 pm
by Uwe Neue
The acceptable changes depend on the method. If the method gives you acceptable results, independent of column lot etc., then you are fine.

When you do the testing for lot-to-lot reproducibility, you must be sure that you get different preparations of the material with which the column is packed, not columns packed with the same stuff on different days. Be sure that this is what you are getting from the manufacturer.

Column to column variation between columns made from the same packing material is close to nothing, if the columns are new. The only variation is a physical variation (column volume), and not a chemical one.

Posted: Fri Sep 18, 2009 9:09 am
by grzesiek
"During analytical development, column lot-to-lot variation is addressed by ruggedness testing. My question is: what is an acceptable % variation between columns/column lots, and is there any set norm?" - There is no set value; as long as the method works, you're OK.

Posted: Fri Sep 18, 2009 4:10 pm
by Jade.Barker
We use > 2% RSD -
because that's what is written in the validation procedure. The validation procedure is written with that parameter because that is the level required by the regulations in our area.

Posted: Fri Sep 18, 2009 4:31 pm
by grzesiek
or < 2% RSD?? :)

Posted: Fri Sep 18, 2009 6:48 pm
by Jade.Barker
or < 2% RSD?? :)
DOH! yes...what you said... :oops: