I am having a problem with my chromatography.
I make one parent prep of a solution and then aliquot it into 6 different vials. I run 3 of the vials at the beginning of the run and the last three at the end of the run (for system suitability), with about 10 injections in between those two sets. When we normally run this, we get RSDs for the peak area around 0.8%, but all of a sudden the RSDs are up at 15%. However, when I look at the first group of three and the last group of three on their own, the RSDs are tight. When the peak areas are compared, the injections at the beginning of the run have a higher count than the injections at the end.
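To put numbers on what I'm seeing, here is a quick sketch (made-up peak-area counts, Python) of how a step change between the two groups blows up the pooled RSD even though each group of three is tight on its own:

```python
# Illustrative only: hypothetical peak-area counts. A shift between the first
# three and last three injections inflates the pooled RSD even though each
# group of three is tight by itself.
import statistics

def rsd(values):
    """Percent relative standard deviation (sample stdev / mean * 100)."""
    return statistics.stdev(values) / statistics.mean(values) * 100

first_three = [1000.0, 1002.0, 998.0]   # injections at the start of the run
last_three  = [750.0, 752.0, 748.0]     # injections at the end of the run

print(f"first three RSD: {rsd(first_three):.2f}%")                # ~0.2%
print(f"last three RSD:  {rsd(last_three):.2f}%")                 # ~0.3%
print(f"pooled RSD:      {rsd(first_three + last_three):.2f}%")   # ~15%
```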
There is another solution prep that we make (a mixture of 3 components), which is also aliquoted into 6 different vials: 3 toward the beginning and 3 toward the end of the run, this time with about 5 injections in between. Same story as the first set: the first three have a higher area count than the last three, the first 3 RSDs look good, the last 3 RSDs look good, but together the RSDs are too high. When looking at the relative peak area, though, the RSD is 0.8%.
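A small sketch of why the relative-area RSD can stay tight while the absolute areas drop (again with made-up numbers): if less sample reaches the column, every peak in the mix scales down by roughly the same factor, and that factor cancels when you take the ratio.

```python
# Illustrative only: hypothetical areas for two components of the mix.
# Scaling all injections at the end of the run by a common factor (less
# sample delivered) wrecks the absolute-area RSD but not the relative-area RSD.
import statistics

def rsd(values):
    return statistics.stdev(values) / statistics.mean(values) * 100

early = [(1000.0, 500.0), (1004.0, 501.0), (996.0, 499.0)]       # (A, B) areas
late  = [(a * 0.75, b * 0.75) for a, b in early]                 # ~25% less delivered

all_injections = early + late
abs_a = [a for a, b in all_injections]            # absolute area of component A
rel_a = [a / (a + b) for a, b in all_injections]  # relative area of component A

print(f"absolute-area RSD: {rsd(abs_a):.1f}%")    # large: two populations
print(f"relative-area RSD: {rsd(rel_a):.2f}%")    # small: common factor cancels
```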
I have replaced the needle, septum, injection liner, jet, column, and gas tanks (helium and nitrogen). This instrument has an autosampler on it. I have checked for leaks. If it were just the sample prep, the chromatograms would look bad across the board; but since they all come from the same parent solution, there shouldn't be any variability between the vials.
Does anyone have ideas for troubleshooting this? I've done all I can, and everyone I've asked has suggested things I have already tried.
Any help is greatly appreciated.

