Non-linearity When Working on the Slope of the UV Spectra - Chromatography Forum

Hello

This seems like a basic question, but I cannot quite figure out what is causing it.

When doing HPLC with a UV/Vis detector, we often set the detection wavelength to a flat part of the spectrum (often a maximum). However, there are cases where we intentionally choose a less sensitive wavelength (for example, when we have multiple peaks and one absorbs so strongly that it would be off scale). What I have found is that using a detection wavelength on the slope of the spectrum, rather than on a flat part, sometimes causes non-linearity in the response, or results in a significant intercept.
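To make the effect concrete, here is a rough numerical sketch (all molar absorptivity values are made up, and a 10 nm detector bandwidth is assumed). A single-channel detector averages the *transmitted intensity* over its bandwidth before taking the log. On a flat spectral region that stays perfectly linear with concentration; on a steep slope the response factor drops as concentration rises:

```python
import numpy as np

# Hypothetical numbers for illustration only: a 10 nm bandwidth
# sampled at 101 points, with made-up molar absorptivities.
wavelengths = np.linspace(250.0, 260.0, 101)            # nm
eps_slope = np.linspace(2000.0, 200.0, wavelengths.size)  # steep spectral slope
eps_flat = np.full(wavelengths.size, eps_slope.mean())    # flat region, same mean

def measured_absorbance(eps, conc, path=1.0):
    """Absorbance the detector reports: average the transmitted
    intensity over the bandwidth, then take -log10."""
    transmittance = 10.0 ** (-eps * conc * path)
    return -np.log10(transmittance.mean())

concs = np.linspace(0.0, 1e-3, 6)  # mol/L
for label, eps in (("flat", eps_flat), ("slope", eps_slope)):
    a = [measured_absorbance(eps, c) for c in concs[1:]]
    # Response factor A/c should be constant if the response is linear.
    rf = [ai / ci for ai, ci in zip(a, concs[1:])]
    print(label, [round(x, 1) for x in rf])
```

On the flat region the response factor is constant; on the slope it falls steadily with concentration, i.e. the calibration curve bends, which matches the non-linearity described above.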

Now you may be thinking that slight drift in the position of the diffraction grating could cause this, and on a scanning detector it certainly could. But we are seeing this phenomenon with a diode array detector, so that cannot be the explanation.

Thanks in advance for any assistance. I would like to get a handle on why this is happening.

Sincerely
Adam
Dear Adam,

What bandwidth are you using on your diode array detector?
Kind regards,
Ade Kujore
Marketing
Cecil Instruments
Cambridge
United Kingdom

email:- ade.kujore@cecilinstruments.com
telephone:- +44 (0) 1223 420821
web site:- www.cecilinstruments.com
Registered Number 909536
Be sure you have turned off the reference wavelength (set at a region of minimum absorbance). A slight drift in that region can cause non-linearity.
Could it also be that you only do this with analytes that give very high absorbance, to bring them into a "linear range"? Maybe the linear range isn't as wide as it should be, so those measurements come out non-linear, while the analytes present at low abundance (which you therefore measure at a spectral maximum) sit at lower absorbance and give better linearity.
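As a rough illustration of why the linear range can end sooner than expected: stray light sets a ceiling on measurable absorbance. If a fraction s of unabsorbed light reaches the detector, the reported absorbance flattens out at high true absorbance. The 0.1% stray-light figure below is an assumed, typical-order value, not a measured one:

```python
import numpy as np

s = 0.001  # assumed stray-light fraction (0.1%), for illustration only

def observed_absorbance(a_true, s=s):
    # The detector sees I + s*I0 instead of I, so the measured
    # intensity ratio is (10**-A + s) / (1 + s).
    return -np.log10((10.0 ** (-a_true) + s) / (1.0 + s))

for a in (0.5, 1.0, 2.0, 3.0):
    print(a, round(observed_absorbance(a), 3))
```

With these numbers the readings track the true absorbance closely up to about 1 AU, then fall increasingly short, which is exactly the "linear range isn't as high as it should be" behaviour.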
adam wrote:
When doing HPLC with a UV/Vis detector, we often set the wavelength to a flat part of the curve (often the maxima). However, there are cases where we intentionally chose a less sensitive wavelength (for example, when we have multiple peaks and one has too much absorbance, and so would be off scale).


Wouldn't it be easier to dilute before injecting or to inject smaller volume?

We never used a "slope" wavelength because the absorbance at the maximum was too large, but we occasionally did when there was a matrix interference at the maximum wavelength and little or none on the slope. Just saying we didn't do that; I don't know whether that part is relevant to you.
Maybe what you are seeing is related to the fact that bands in UV spectra broaden as concentration increases.
