I am not a regular GC user, but lately I have been trying to measure the activity of my catalyst using GC-MS. I want to follow conversion vs. time for several substrates. I was hoping I could skip making calibration curves for all the products (which I would otherwise have to synthesize and purify) by the following method:
-Inject the reaction mixture at t = 0, containing my reactant and an internal standard (hexadecane).
-Follow the decrease of the reactant peak relative to the internal standard as time goes by and the reactant is consumed. I don't do anything with the emerging product peak in my chromatogram.
I would think this method gives pretty reliable results, as long as the concentrations are in the linear regime of the MS detector.
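To be explicit, this is the arithmetic I have in mind (just a sketch; the function name and the numbers are made up, not real data):

# Conversion from internal-standard-normalized reactant peak areas.
# ratio_t0: (reactant area / hexadecane area) at t = 0
# ratio_t:  the same ratio at time t
def conversion(ratio_t0, ratio_t):
    return 1.0 - ratio_t / ratio_t0

# Illustrative values only:
print(conversion(1.25, 1.20))  # ~0.04, i.e. 4 % conversion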
My conversions vary between roughly 4 and 30 %, and the low end is where the actual problem lies.
To test the reproducibility of our system with an internal standard, I injected a solution containing tetradecane and hexadecane ten times. The mean ratio of the two peak areas is 0.79 (which is fine), but the standard deviation is 2.1 % of this ratio.
Of course this deviation is unacceptable if I want to quantify the decrease in the reactant peak at 4 % conversion. I always thought the reproducibility was much higher than this.
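To put numbers on why this worries me, here is roughly how I am judging the spread (the ten ratios below are made up around 0.79 to mimic my run, not my actual data):

import statistics

# Ten illustrative replicate C14/C16 area ratios around 0.79
ratios = [0.82, 0.76, 0.81, 0.77, 0.80, 0.78, 0.79, 0.79, 0.79, 0.79]

mean = statistics.mean(ratios)
rsd = 100 * statistics.stdev(ratios) / mean
print(f"mean ratio = {mean:.3f}, RSD = {rsd:.1f} %")
# prints: mean ratio = 0.790, RSD = 2.2 %

# At 4 % conversion the reactant/IS ratio only drops by 4 % of its
# t = 0 value, so a ~2 % RSD per injection is about half the size
# of the effect I am trying to measure.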
I detect in selected ion monitoring (SIM) mode, using the three most abundant ions in the mass spectra of C14 and C16.
What could be the cause of this rather low reproducibility? Or is this about the best I can get, and should I look for other methods?
Thanks in advance,
Bob