- Posts: 9
- Joined: Thu Jan 19, 2006 12:41 pm
I've recently been running an assay to determine levels of triethylamine (TEA) in an API product we make. The method calls for acetone as an internal standard; the solvent is DMF. Sample prep: 50 mg of sample, 3 mL of internal standard solution, mix, and sonicate until dissolved.
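For context, here is a minimal sketch of how internal-standard quantitation works in general (not the validated method itself), which is why the erratic acetone areas matter: the TEA result is taken from the TEA/acetone area ratio, so any loss of acetone area pushes the reported result high. All function names and numbers below are hypothetical.

```python
# Generic internal-standard quantitation sketch (illustrative values only).

def response_factor(area_analyte_std, conc_analyte_std, area_is_std, conc_is_std):
    """Relative response factor from a calibration standard."""
    return (area_analyte_std / conc_analyte_std) / (area_is_std / conc_is_std)

def analyte_conc(area_analyte, area_is, conc_is, rrf):
    """Analyte concentration in a sample from its area ratio to the internal standard."""
    return (area_analyte / area_is) * conc_is / rrf

# Hypothetical numbers for illustration only
rrf = response_factor(area_analyte_std=12000, conc_analyte_std=0.05,
                      area_is_std=45000, conc_is_std=0.20)
print(analyte_conc(area_analyte=11500, area_is=44000, conc_is=0.20, rrf=rrf))
```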
Conditions: oven at 70 °C, injector and detector at 150 °C. Column: Unisole 10T + KOH (20 + 4%) on Uniport C, 80–100 mesh, 3 mm i.d. × 3 m glass column.
We are observing random low peak areas for the acetone peak, sometimes only 10% lower than expected, other times up to 60% lower. It happens in blanks, standards, and samples. The TEA peaks don't show the same behaviour when these anomalies occur, so we've ruled out injector/instrument repeatability problems (and the %RSD check before the run is always excellent).
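In case it helps anyone picture the pattern, here is a hypothetical screen of the kind we effectively apply by eye: flag any injection whose acetone (IS) area drops more than a chosen tolerance below the mean IS area of the standards. The threshold, function name, and area values are all made up for illustration.

```python
# Hypothetical screen: flag injections whose internal-standard area falls more
# than `tolerance` (fractional) below the reference mean from the standards.

def flag_low_is(areas, reference_mean, tolerance=0.10):
    """Return indices of injections with an IS area more than `tolerance` below the mean."""
    return [i for i, a in enumerate(areas)
            if (reference_mean - a) / reference_mean > tolerance]

std_is_mean = 45200.0                        # mean acetone area of the standards
run_is_areas = [44800, 40100, 45500, 18900]  # acetone areas across the sequence
print(flag_low_is(run_is_areas, std_is_mean))  # -> [1, 3]
```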
Just wondering if anyone out there has come across this problem and, if so, what causes it and what steps can be taken to stop it.
