This seems like a basic question, but I cannot quite figure out what is causing it.
When doing HPLC with a UV/Vis detector, we often set the wavelength to a flat part of the absorbance curve (usually the maximum). However, there are cases where we intentionally choose a less sensitive wavelength (for example, when we have multiple peaks and one absorbs so strongly that it would go off scale). What I have found is that using a detection wavelength that falls on a slope of the spectrum, rather than on a flat part, sometimes causes a non-linear response or a significant intercept in the calibration curve.
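For reference, here is a toy numerical sketch of one mechanism I have wondered about: the detector's finite spectral bandwidth. The idea is that the detector averages *transmittance* (not absorbance) over its bandpass; on a flat part of the band every wavelength in the bandpass absorbs about equally and Beer's law holds, while on a steep slope they do not, and the log of the averaged transmittance curves off at higher concentration. All of the values here (Gaussian band shape, 4 nm bandwidth, the wavelengths and absorptivity) are hypothetical, just to show the shape of the effect:

```python
import numpy as np

# Hypothetical Gaussian absorption band and detector parameters.
lmax, width = 254.0, 10.0     # band center (nm) and sigma (nm)
eps_max = 1.0e4               # peak molar absorptivity (L mol^-1 cm^-1)
bandwidth = 4.0               # detector spectral bandwidth (nm)
path = 1.0                    # flow-cell path length (cm)

def epsilon(lam):
    """Molar absorptivity of the assumed Gaussian band at wavelength lam."""
    return eps_max * np.exp(-0.5 * ((lam - lmax) / width) ** 2)

def apparent_absorbance(lam_set, conc):
    """Absorbance the detector reports: -log10 of the transmittance
    averaged over the bandpass centered at lam_set."""
    lams = np.linspace(lam_set - bandwidth / 2, lam_set + bandwidth / 2, 51)
    transmittance = 10.0 ** (-epsilon(lams) * conc * path)
    return -np.log10(transmittance.mean())

concs = np.linspace(0, 2e-4, 6)   # mol/L
for lam in (254.0, 268.0):        # at the maximum vs. on the slope
    A = [apparent_absorbance(lam, c) for c in concs]
    # If the response is linear, the response factor A/c is constant.
    rf = [a / c for a, c in zip(A[1:], concs[1:])]
    print(f"{lam} nm: A = {[round(a, 3) for a in A]}, "
          f"A/c (low conc)/(high conc) = {rf[0] / rf[-1]:.3f}")
```

With these made-up numbers, A/c stays essentially constant at 254 nm but drifts noticeably at 268 nm, which at least resembles the curvature we see. I do not know whether this is actually what is going on in our case, which is why I am asking.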
Now you may be thinking: there could be some slight drift in the position of the diffraction grating, and that could certainly cause it. But we are seeing this phenomenon with a diode array detector, which has a fixed grating and no moving wavelength-selection optics, so that cannot be the explanation.
Thanks in advance for any assistance. I would like to get a handle on why this is happening.
Sincerely
Adam