- Posts: 3
- Joined: Mon Jun 29, 2009 2:03 pm
I recently started a summer project measuring tocopherols extracted from barley samples. All was well until we recently changed our procedure to deal with some co-eluting peaks.
Here are the basic specs to give you an idea of our set-up:
RF10A detector
LC6A chromatograph
Gilson 234 autoinjector
Running class VP software
We were originally using a pure hexane mobile phase, but two months ago we switched to a 2% dioxane / 2% ethyl acetate solution, which helped to separate some important peaks.
At first, a good chromatogram would show the solute peak at 700-1000 mV, with our variables coming out at about half that.
Currently, all of our samples are coming out odd. The best chromatograms still contain all the peaks at their proper elution times, but the solute peak is relatively small at ~25 mV, with the variables coming out at ~100 mV. The baseline also shifts to -10 mV after the solute peak.
We also randomly get chromatograms with a larger solute peak (~250 mV), a flat baseline at exactly 0, and greatly reduced variable peaks. These look almost as if you took an old "normal" chromatogram, raised the baseline, and then set that value to zero. These data are useless, since this eliminates important peaks and fails to give us even relative values to work with. (I can post pictures if anyone thinks it will help formulate an opinion.)
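To make the second symptom concrete, here is a minimal sketch (my own illustration, not anything from our method) of how a fixed negative offset followed by zero-clipping of the signal could turn a normal trace into exactly what I'm seeing: a baseline pinned at 0, a solute peak shrunk to ~250 mV, and the smaller variable peaks gone entirely. All peak positions, heights, and the offset value below are hypothetical numbers chosen to match the mV values described above.

```python
import math

def gaussian(t, center, height, width):
    """Idealized chromatographic peak (hypothetical shape)."""
    return height * math.exp(-((t - center) / width) ** 2)

# Hypothetical trace: one solute peak (~700 mV) plus two smaller
# "variable" peaks (~350 mV and ~300 mV), sampled every 0.1 min.
times = [i * 0.1 for i in range(300)]
raw = [gaussian(t, 5.0, 700.0, 0.3)
       + gaussian(t, 12.0, 350.0, 0.4)
       + gaussian(t, 20.0, 300.0, 0.4)
       for t in times]

offset = -450.0  # assumed constant offset, in mV
# Clipping negative values to zero reproduces the symptom:
clipped = [max(v + offset, 0.0) for v in raw]

print(max(clipped))        # solute peak reduced to ~250 mV
print(min(clipped))        # baseline sits exactly at 0
print(max(clipped[100:]))  # smaller peaks after t=10 min vanish
```

The point of the sketch is just that any peak shorter than the offset disappears completely, which would explain why relative quantitation fails on those runs rather than merely being scaled down.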
We have retested several samples with varied results on identical samples, meaning that the same sample would kick out both kinds of baselines described above.
My best guesses are a bad detector (lamp?) or something odd in the software method. If you have had similar problems, please let me know!
Thanks