Can impurity percentage differ from detector to detector?
Posted: Fri Jun 17, 2005 3:06 pm
by Narendra
Dear Chromatographer,
Keeping all chromatographic conditions the same except the detector, I am getting a difference in impurity percentage. When I run the same sample on both detectors I get a different % impurity. Is this possible, and is there any reason for it?
For your information, the chromatographic conditions are given below:
Column : Flexit C18 250x4.6, 5micron
Flow rate: 1.5ml/min
Mobile phase : Buffer:MeOH (80:20)
Detection at : 351nm
Detector : Waters 486 and Tosho UV8011
%Impurity : from waters 486 detector = 4.7%
%Impurity : from Tosho UV 8011 detector = 5.8%
Almost a 1.0% difference.
I am using the % area normalization method for the calculation.
For your information, only the detector was changed; every other component is the same.
Thanks in advance
Narendra
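[Editor's note: a minimal sketch of the area normalization calculation Narendra describes. The peak areas are hypothetical values chosen only to illustrate the arithmetic behind a ~4.7% impurity result.]

```python
# Hypothetical integrated peak areas (arbitrary counts); illustrative only.
peak_areas = {"main": 95300.0, "impurity": 4700.0}

total_area = sum(peak_areas.values())
# Area normalization: each peak as a percentage of the total integrated area.
area_percent = {name: 100.0 * a / total_area for name, a in peak_areas.items()}
print(area_percent["impurity"])  # 4.7
```

Note that this method has no calibration step: every peak's percentage depends on every other peak's area, so any distortion of the main peak (nonlinearity, saturation) shifts the reported impurity %.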
Posted: Fri Jun 17, 2005 4:33 pm
by HW Mueller
What is your error range within each of the detectors (intra-detector)?
What do you mean by "area normalization"?
Posted: Fri Jun 17, 2005 4:38 pm
by adam
Theoretically, the results should be the same. But I think there are a few practical issues that could cause differences like this.
I think the most likely reason would be differences in the linearity of the detectors over the range from the impurity level up to the active.
Another possibility is that the actual wavelengths are slightly different (even though they are set the same).
The last possibility that occurs to me is if there is difference in the sensitivity of the two detectors. Part of the impurity peak may be lost in the noise, with the less sensitive detector.
The safest bet may be to always use the same detector.
Can detectors differ?
Posted: Fri Jun 17, 2005 5:38 pm
by josebenjamin
Dear Narendra,
It seems to me that 351 nm is a wavelength not corresponding to an absorption maximum. Most likely the compounds you are analyzing have maxima at different wavelengths. Therefore small differences in wavelength accuracy can drastically change response factors.
I recommend the first thing you do is check the lambda accuracy of your units. For this you can follow the procedure described in:
B. Esquivel, "Wavelength Accuracy Testing of HPLC Detectors", Chromatographia, Vol. 26 (1988), pages 321-323
If you have problems with the procedure above, let me know.
Good Luck,
josebenjamin
Posted: Fri Jun 17, 2005 10:02 pm
by Uwe Neue
To follow up on the last post by josebenjamin:
if you are at the edge of the absorption band, it is also possible to get a difference in response from the spectral bandwidth of the detector. Check in your instrument manual how this can be changed so that both detectors look at the same bandwidth.
Posted: Sat Jun 18, 2005 2:27 am
by pawan ratra
The height of the main peak is the major factor in the % area difference. If the height is greater than 1.0 AU, the detector may not be linear. The resulting difference in the area counts of the main peak between the two detectors can then change the % area.
Moreover, as mentioned earlier, if the analysis is carried out not at the wavelength maximum but on the slope of the UV spectrum, a slight variation in wavelength accuracy may result in a large change in the % area.
Posted: Mon Jun 20, 2005 7:37 am
by HW Mueller
If one does this properly, namely runs a calibration standard and checks linearity first, then nearly all of the mentioned differences between two detectors cancel. Of course, if noise is a problem in only one detector, that will not be compensated.
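[Editor's note: a sketch of the calibration argument above, with hypothetical numbers. Because the standard and the sample are measured on the same detector, the detector's absolute sensitivity divides out of the final concentration.]

```python
# All values below are assumed for illustration.
std_conc = 0.05        # mg/ml, concentration of an impurity standard
std_area = 2350.0      # its peak area on this particular detector
response_factor = std_area / std_conc   # area counts per mg/ml, detector-specific

sample_imp_area = 4700.0                        # impurity area in the sample
imp_conc = sample_imp_area / response_factor    # mg/ml; sensitivity cancels
print(imp_conc)  # back-calculated impurity concentration
```

A second detector with, say, half the sensitivity would halve both `std_area` and `sample_imp_area`, leaving `imp_conc` unchanged, which is why a calibrated method is more robust than area normalization here.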
Posted: Mon Jun 20, 2005 1:56 pm
by Basil
Another option is a difference in bandwidth between the two detectors, or whether you are using a reference wavelength to subtract baseline drift.
A second way to get differences is if in one detector the main peak is saturated, so you don't have the real peak area. Then impurities calculated by normalization are overestimated...
Hope it helps !!
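[Editor's note: a toy numerical illustration of Basil's saturation point. Two hypothetical Gaussian peaks are integrated twice, once as-is and once with the main peak clipped at 1.0 AU to mimic a saturated detector; the normalized impurity % is inflated in the clipped case.]

```python
import numpy as np

# Synthetic chromatogram: time axis and two Gaussian peaks (values assumed).
t = np.linspace(0.0, 10.0, 2001)
dt = t[1] - t[0]
main = 1.877 * np.exp(-0.5 * ((t - 3.0) / 0.15) ** 2)  # apex at 1.877 AU
imp = 0.090 * np.exp(-0.5 * ((t - 6.0) / 0.15) ** 2)   # small impurity peak

def impurity_percent(main_sig, imp_sig):
    # Crude rectangle-rule integration of each peak, then area normalization.
    a_main = main_sig.sum() * dt
    a_imp = imp_sig.sum() * dt
    return 100.0 * a_imp / (a_main + a_imp)

true_pct = impurity_percent(main, imp)
# A saturated detector reports the main peak flattened at its 1.0 AU ceiling.
clipped_pct = impurity_percent(np.minimum(main, 1.0), imp)
print(true_pct < clipped_pct)  # True: clipping shrinks the main-peak area
```

With these made-up peaks the clipped impurity % comes out roughly a percentage point higher than the true value, the same direction and magnitude as the discrepancy in the original post.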
Posted: Mon Jun 20, 2005 4:40 pm
by Narendra
Thanks to each of you for your suggestions and guidance.
I repeated the experiment I mentioned in my previous post.
I found that when the main peak elutes, the Waters 486 detector shows a maximum absorbance of 1.877 (AUFS set to 1.0), while on the Tosho UV 8011 the maximum absorbance goes up to 1.000 and then the signal saturates; there is no further signal even while the peak is still eluting. Now I will check the linearity of both detectors at three levels above and three levels below the injected concentration, i.e. 1 mg/ml.
I would like to know whether I am headed in the right direction, or if you have any other suggestions.
Narendra
Posted: Mon Jun 20, 2005 10:59 pm
by Mark
Narendra,
You are on the right track. To get accurate area percent numbers, all peaks in the run must be within the linear range of the detector. Your peak at 1.877 AU is most likely not in the linear range, and this will change the apparent area % of the impurity. It is generally best to keep the largest peak to no more than about 1.2 AU. This should also help with the agreement between the detectors.
Regards,
Mark