I am new to chromatography and just started working on a 5890 Series II with FID/ECD. I'm not sure how much deviation to expect from injection technique alone.
I found a Hamilton digital syringe (with Series 700 syringes) new on eBay, but cheap, to try to minimize variation in my sample introduction. It measures the fluid to 0.05 ul with 0.038% accuracy (per the calibration certificate) and has a plunger brake to help smooth sample introduction. I'm injecting 1 ul of sample and have been able to get under 1% deviation when injecting a known reference. Then, out of the blue, I'll see 100% or even 1000% deviation. When I inject and re-inject actual samples, I can't seem to get a straight answer out of the machine.
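In case it helps, this is roughly how I've been calculating that run-to-run deviation, as percent relative standard deviation (%RSD) of replicate peak areas. It's just a quick Python sketch; the peak areas in it are made-up placeholders, not my real data:

```python
# Quick sketch: %RSD across replicate injections of the same standard.
# The peak areas below are hypothetical placeholder numbers.
import statistics

peak_areas = [10523.4, 10498.1, 10551.7, 10509.9]  # e.g. FID peak areas

mean_area = statistics.mean(peak_areas)
stdev_area = statistics.stdev(peak_areas)  # sample standard deviation

rsd_percent = 100.0 * stdev_area / mean_area
print(f"Mean area: {mean_area:.1f}")
print(f"%RSD: {rsd_percent:.2f}%")
```

When things are behaving, this comes out well under 1% for me; the bad runs are the ones that blow it up.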
The technique that seemed to give the most reproducibility was to start with a completely evacuated syringe and needle, draw 1 ul of sample (leaving the air that was in the barrel as a plunger cushion), then draw up 1 ul of air to sandwich the sample, in an attempt to minimize discrimination. I am as careful as I can be, but my results still vary widely. I feel I have eliminated parallax errors, am certain there is no bubble in my measurement, and try to time everything exactly the same each run.
Sorry for noob'ing up the place, but I really am so new that I don't even know how much technique will skew my quantification, or how to tell whether my GC is operating within its intended parameters. Any tips or advice will be greatly appreciated. If I end up saying things that don't make sense, I apologize; I am just now starting to learn the terminology.
Warm regards,
Mike