I have been puzzled by this for weeks now and hope someone can offer some insight.
We run a headspace GC method for residual acetone in a 1 gram product (lactide-collagen polymer). It's a P.E. HS-GC, and we use 22 mL vials with 1 gram of material and 5 mL of NaOH to solubilize the product (the headspace method itself has not changed).
Currently we calibrate at a single point of 1,000 ppm with acetone in MeOH. We use 1.0 mL of cal standard in 5 mL of NaOH and no product.
We would like to change to a two-point calibration at 1,000 and 5,000 ppm. The problem is that the 5,000 ppm standard spikes the detector. We can fix that by changing the standard volume from 1.0 mL to 0.1 mL. We also learned that the material itself interferes with the acetone recovery, so we have added 0.9 g of "clean" product to the cal sample. When we do this we get a nice cal curve with perfect checks at 100, 500 and 3,500 ppm.
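To put numbers on the volume change, this is the back-of-the-envelope math I have been using (my assumption, not something stated in the method: the standard's ppm is w/v, i.e. ug of acetone per mL of MeOH, and the product result is ug of acetone per g of sample):

```python
# Back-of-the-envelope spike math. Assumptions (mine, not from the method):
# standard "ppm" = ug acetone per mL of MeOH, product "ppm" = ug per g of sample.

def acetone_in_vial_ug(std_ppm, spike_volume_ml):
    """Absolute mass of acetone delivered to the headspace vial."""
    return std_ppm * spike_volume_ml  # ug/mL * mL = ug

def ppm_vs_sample(acetone_ug, sample_mass_g=1.0):
    """What that mass corresponds to against the 1 g product sample."""
    return acetone_ug / sample_mass_g

old_cal  = acetone_in_vial_ug(1000, 1.0)   # old single point: 1000 ug in the vial
new_low  = acetone_in_vial_ug(1000, 0.1)   # proposed low point: 100 ug
new_high = acetone_in_vial_ug(5000, 0.1)   # proposed high point: 500 ug

for ug in (old_cal, new_low, new_high):
    print(ug, "ug in the vial ->", ppm_vs_sample(ug), "ppm against a 1 g sample")
```

If those units are right, the vial only ever sees the absolute mass of acetone in it, so the 0.1 mL spikes deliver quite different amounts than the old 1.0 mL spike did, and that is part of what I can't reconcile.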
The question I have is: what does the change from 1.0 mL of standard to 0.1 mL of standard mean? Is it still relevant to my product? My older data will not correlate, and when we reprocess the data through the new cal curve the acetone content increases by 60%. This makes me think that there is something fundamentally wrong with the changes. However, if I run samples from the same lot through both methods (I can't rerun samples since the dissolution is destructive), I get the same ppm value for each sample in each method.
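For the reprocessing part, this is the kind of toy calculation I have been doing to convince myself what running old peak areas through the new curve actually means. The areas below are made up, not our data; the point is only that the same raw area maps to a different ppm once the slope and intercept of the calibration change:

```python
# Toy illustration only (made-up areas, not our data): the same raw peak area
# gives a different ppm once the calibration slope/intercept change.

def one_point_ppm(area, cal_area, cal_ppm):
    """Old style: a single response factor, i.e. a line forced through zero."""
    return area * cal_ppm / cal_area

def two_point_ppm(area, a_low, ppm_low, a_high, ppm_high):
    """New style: a straight line through two calibration points."""
    slope = (ppm_high - ppm_low) / (a_high - a_low)
    intercept = ppm_low - slope * a_low
    return slope * area + intercept

sample_area = 40_000  # hypothetical raw area for one sample

print(one_point_ppm(sample_area, cal_area=100_000, cal_ppm=1000))
print(two_point_ppm(sample_area, a_low=100_000, ppm_low=1000,
                    a_high=480_000, ppm_high=5000))
```

That mechanism alone doesn't tell me whether a 60% shift is reasonable, which is part of why I'm asking.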
The bigger problem is: how do I explain to my QC group that the change from 1.0 mL to 0.1 mL is nominal and that the results still represent the actual value?
Am I on the right path or have I totally screwed this up?
