
Dispute between PDA & UV detector

Discussions about HPLC, CE, TLC, SFC, and other "liquid phase" separation techniques.

Dear experts

I would like to discuss a common problem that arises between two different laboratories or companies that use different kinds of detectors, PDA and UV.

1. How do you compare or correlate results obtained from a UV detector and a PDA detector for the same batch sample of the same product?

2. If one lab has a PDA detector and the other a UV detector, and they obtain different results for the same batch sample of the same product, i.e. one lab fails the batch and the other passes it, how can this be resolved?
SUNIL PANDYA

Assuming that both labs have the same reference standard and sample materials to work with, and there's nothing inherently wrong with the method (like it's using a wavelength from the analyte's UV spectrum where slopes are extremely steep), you can go ahead and check wavelength accuracy on the detectors and compare notes but I suspect that the problem lies with a difference in sample preparation between the labs. Time to buy some airline tickets and watch the other guys run it so you can spot the difference(s). Sometimes it's just a matter of being more specific about how you shake or sonicate something (these steps are rarely adequately covered by providing only a time or "until dissolved" instruction). Sonication is particularly prone to issues. Some sonicators impart a lot more heat to a sample than others do.

In other words, we'll need more details to be more helpful.
Thanks,
DR

Dear DR

Thanks for your prompt reply. Here are the details.

The testing method is the same: mobile phase, wavelength, sample concentration, injection, etc. (No column oven is required.) The difference is:

Lab 1 uses a PDA detector and gets 0.18% for impurity A (limit: <0.2%), so per this lab the batch passes.

Lab 2 uses a UV detector and gets 0.22% for impurity A (limit: <0.2%), so per this lab the batch fails.

The discussion now centers on the detector, because everything else is the same.

So my question is: does a PDA detector have lower sensitivity than a UV detector, and could it therefore be reporting a slightly lower impurity percentage?

If so, theoretically how much lower is a PDA's sensitivity than a UV detector's, in percentage terms? The difference between the two labs here is 0.04%.

I hope I have explained my problem in the detail you asked for.
SUNIL PANDYA

Did you or they compare the signal-to-noise on both instruments?
Are the integration events and bandwidth settings truly identical?
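If you want a quick, reproducible way to compare S/N between the two instruments, it can be estimated directly from the data points of each chromatogram. A minimal Python sketch on synthetic data (the slice positions, noise window, and the USP-style S/N = 2H/h convention here are illustrative assumptions, not anyone's actual method):

```python
import numpy as np

def signal_to_noise(trace, peak_slice, noise_slice):
    """USP-style S/N = 2H/h: H is peak height above the local baseline,
    h is the peak-to-peak noise in a blank region of the baseline."""
    baseline = np.median(trace[noise_slice])
    H = trace[peak_slice].max() - baseline                     # peak height
    h = trace[noise_slice].max() - trace[noise_slice].min()    # p-p noise
    return 2 * H / h

# Synthetic chromatogram: noisy flat baseline plus one small peak at t = 7
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)
trace = rng.normal(0, 0.01, t.size)               # baseline noise
trace += 0.5 * np.exp(-((t - 7) ** 2) / 0.02)     # impurity peak
sn = signal_to_noise(trace, slice(650, 750), slice(0, 300))
print(sn)
```

Running the same calculation on exports from both detectors, at the same wavelength and bandwidth, makes the sensitivity comparison concrete instead of anecdotal.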

I had a similar situation during an internal method transfer, where the total impurities seen on the variable-wavelength UV detector were about 50% higher than those reported from the PDA (same manufacturer, etc.). The problem was initially worked around by substituting a variable-wavelength detector for the PDA. The cause was isolated to a lack of sensitivity from the PDA at the given wavelength, together with the choice of reference wavelength. Having confirmed there was nothing wrong with the rest of the system, we reverted to the PDA (after cleaning the cell and checking all the intervening pipework), adjusted the reference wavelength for optimum sensitivity, and then obtained comparable results.

Worked for us but may not apply in every case!

Steve

A UV detector monitors a single fixed wavelength, whereas a PDA detector scans the spectrum range you define (e.g. 200-400 nm); the software then extracts a chromatogram at the channel specified in your method. Now consider the PDA:

It scans 200-400 nm in a given time, so its monochromator settings move in such a way that it can cover 200-400 nm, while a UV detector only has to monitor a single fixed wavelength. The number of scans at any one wavelength is therefore higher on the UV detector than on the PDA, hence the higher sensitivity.

One workaround: if your method wavelength is, say, 254 nm, give the PDA a narrow range such as 250-260 nm and then extract the channel at 254 nm. That can minimize the difference.

As a rule of thumb, if the UV detector is 100%, the PDA is about 80% in terms of sensitivity.
Avi patel
Analytical Research Scientist
Method Development & Formulation Support.

0.18 vs. 0.22? I'm assuming that these values are vs. ~100% for an API. If my assumption is correct, you should go back and look at the CV of the method (there's a validation report for it somewhere, right?).

Unless the CV<<0.02% for your intermediate precision runs (robustness, ruggedness - whatever you call it), your problem is not whether 0.18 or 0.22 is correct, your problem is how to get rid of the impurity or get the specification raised, because the impurity is too high.
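DR's point can be put in numbers. Suppose, purely for illustration (check your own validation report for the real figure), that the method's intermediate-precision RSD at the 0.2% level were 10%. Then the 0.04% gap between the labs is unremarkable:

```python
import math

rsd = 0.10           # assumed 10% RSD at this impurity level (hypothetical)
level = 0.20         # impurity level, % relative to API
sd_single = rsd * level                 # SD of one result: 0.02%
sd_diff = math.sqrt(2) * sd_single      # SD of the difference of two results
gap = 0.22 - 0.18
print(gap / sd_diff)                    # ~1.4 "sigmas": not a real difference
```

Only if the validation data show the method is far more precise than this would the 0.18 vs 0.22 disagreement point to an instrument problem rather than ordinary scatter.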
Thanks,
DR

I agree with DR. If these are two different samples analyzed in two different labs, that agreement is pretty good for a small impurity. We have seen two injections on the same chromatographic run have that much variability.
George Balock

I also agree with DR.

Depending on your method validation results, there may be no difference between 0.18 and 0.22. So, check the validation data and also check the S/N (and LOQ/LOD) as Noser222 suggested.

Also note that with a specification limit of <0.2, both 0.18 and 0.22 would be reported as 0.2 (The specification value has one decimal and the reported result must match with one decimal). Therefore, both results fail.

Perhaps your specification was meant to be <0.20. This would depend, at least in part, on your method's capability. Again, check your method validation data. Also, I recommend that you check out the ICH guidelines for specifications and impurities: Q6A, Q3A(R2) and Q3B(R2).
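Dan's rounding point can be made concrete. A tiny sketch, assuming the common convention that a result is rounded to the same number of decimals as the specification before comparison (confirm against your own SOP):

```python
def passes(result, limit, decimals):
    """Round the result to the spec's decimal places, then compare."""
    return round(result, decimals) < limit

for r in (0.18, 0.22):
    # Against "<0.2" (one decimal) both round to 0.2 and both fail;
    # against "<0.20" (two decimals) 0.18 passes and 0.22 fails.
    print(r, passes(r, 0.2, 1), passes(r, 0.20, 2))
```

So whether the batch passes at all hinges on whether the specification was meant as <0.2 or <0.20, quite apart from any detector question.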

Regards,
Dan

pharmason

You have a slight misunderstanding of how a PDA works.

In a PDA, all the light from the lamp passes through the sample and then hits the grating, which splits the light into its component wavelengths. Nothing moves, and setting a narrower wavelength range does not increase sensitivity on a PDA. However, you can increase sensitivity on a PDA by defining a wider bandwidth (which really means averaging over a larger number of diodes). This is often where the sensitivity difference between a PDA and a single-channel detector lies: many single-channel detectors have a fixed slit width of 5 nm, while PDAs are most often run at higher resolution, e.g. 1.2 nm (so you can see the spectral detail). Depending on the shape of the spectrum, the absolute absorbance difference at a given wavelength can be small or quite large.

Regarding the original question, one lab gets 0.18% and another gets 0.22% for the same batch; I would bet those two values are statistically the same. Besides that, with a cutoff of 0.20%, if you get 0.18% you had better have very high confidence (very good validation results) in your method to routinely pass batches that close to the fail cutoff. You did do some inter-laboratory comparisons during your validation, right?
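The bandwidth-averaging effect described here is easy to demonstrate with a toy simulation (pure synthetic noise, nothing detector-specific): averaging n adjacent diode channels reduces random noise by roughly the square root of n.

```python
import numpy as np

rng = np.random.default_rng(1)
# 10000 time points on 4 adjacent ~1.2 nm diode channels (~5 nm band total)
diodes = rng.normal(0.0, 1.0, (10000, 4))
wide_band = diodes.mean(axis=1)   # wider bandwidth = average over diodes

print(diodes[:, 0].std())   # single-diode noise, ~1.0
print(wide_band.std())      # averaged noise, ~0.5 (sqrt(4) lower)
```

This is why a PDA run at a 5 nm bandwidth can approach a single-channel detector's noise floor, at the cost of spectral detail.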

Assuming that you are calibrating peak area vs quantity of analyte using the actual substance whose concentration you want to measure, and with the same analytical conditions as are used to run the samples, then a UV and a PDA should give the same answer.

Peter
Peter Apps

If one calibrates and characterizes correctly, then any instrument should give the same answer. I have always considered using different analyte ranges for different instruments in medical quality control to be an analytical bankruptcy declaration.

Dear All,

If I understand Mr. Pandya's concern precisely, the problem is sensitivity.

1. Does a PDA produce less signal than a UV detector?

My answer to this is that a PDA produces 70 to 80% less signal than a UV detector,
provided all the hardware components are OK and the method is set up correctly.

2. The second concern is different results on the PDA and the UV detector.

My answer is ditto to all the experts, with one addition:
some PDAs require you to set a reference signal, which is subtracted from the acquired signal. If this reference interferes with the wavelength of some of the peaks of interest, it may result in a low signal or no signal at all.

Hope this gives a clearer picture.

Regards,

Hi, Everybody

As I understand the problem: comparison of impurity results for the same sample and same method between a UV detector and a PDA detector, right?

Per the validation guidelines, no one can compare such results from two detectors with different specifications.

You can use a different make, but not a different specification, for an official comparison in terms of precision.

I hope this is clear?
Avi patel
Analytical Research Scientist
Method Development & Formulation Support.

How are you estimating these impurities? Is it a qualitative estimation (area normalization), or quantitative against either an impurity standard or a diluted drug standard?

If it is a qualitative estimation, you are bound to see this kind of variation between UV and PDA.

If it is a quantitative estimation, there should not be any difference between UV and PDA.

JM

