by AA » Thu Jun 07, 2007 11:16 pm
						pharmason
You have a little misunderstanding of how a PDA works.
In a PDA, all light from the lamp passes through the sample and hits the grating, which divides the light into its component colors (wavelengths). Nothing moves, so setting a narrower wavelength range does not increase sensitivity on a PDA. However, you can increase sensitivity on a PDA by defining a wider bandwidth (which is really taking an average over a larger number of diodes). This is often where the sensitivity difference between a PDA and a single-channel detector lies: many single-channel detectors have a fixed slit width of 5 nanometers, while PDAs are most often run at higher resolution, like 1.2 nm (so you can see the spectral detail). Depending on the shape of the spectrum, the absolute absorbance difference at a given wavelength can be small or quite large.
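
As a rough illustration of the bandwidth point, here is a minimal sketch in Python (the diode spacing, wavelengths, and absorbance values are all made up, not taken from any instrument): widening the bandwidth just averages more diodes, and how much that shifts the reported absorbance depends on how flat the spectrum is across that window.

```python
# Hedged sketch: PDA bandwidth modeled as a simple average over the diodes
# that fall inside the window. All numbers below are hypothetical.

def bandwidth_average(spectrum, center_nm, bandwidth_nm):
    """Average absorbance over all diodes within +/- bandwidth/2 of center_nm."""
    half = bandwidth_nm / 2.0
    picked = [a for wl, a in spectrum if abs(wl - center_nm) <= half]
    return sum(picked) / len(picked)

# Hypothetical narrow absorbance band sampled at 1.2 nm diode spacing
spectrum = [(250.0 + i * 1.2, 1.0 / (1.0 + ((250.0 + i * 1.2) - 254.0) ** 2))
            for i in range(8)]

print(round(bandwidth_average(spectrum, 254.0, 1.2), 3))  # 0.862, one diode near the peak
print(round(bandwidth_average(spectrum, 254.0, 5.0), 3))  # 0.488, averaging over the slopes
```

The signal can drop when the window spans the sides of a narrow band, but the baseline noise averages down across those diodes as well, which is where the signal-to-noise gain from a wider bandwidth comes from.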
In regards to the original question, where one lab gets 0.18% and another gets 0.22% for the same batch, I would bet that those two values are statistically the same. Besides that, with a cutoff of 0.20%, if you get 0.18% you had better have very high confidence (very good validation results) in your method to routinely pass batches that close to the fail cutoff. You did do some inter-laboratory comparisons during your validation, right?
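
For the 0.18% vs. 0.22% comparison, a quick back-of-the-envelope check points the same way. The replicate counts and RSDs below are invented for illustration, since none were given in the thread:

```python
# Hedged sketch: n = 3 preparations per lab and ~10% RSD at this impurity
# level are assumptions, not anything reported in the thread.
from math import sqrt

def two_sample_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch-type t statistic for comparing two lab means."""
    return (mean1 - mean2) / sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)

t = two_sample_t(0.18, 0.018, 3, 0.22, 0.022, 3)
print(round(t, 2))  # -2.44: smaller in magnitude than the ~2.78 critical value
                    # at ~4 degrees of freedom, so no significant difference
```

With scatter anywhere near that size, the two labs' results are indistinguishable, which is exactly why passing batches at 0.18% against a 0.20% limit needs strong validation data behind it.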