Sorry for the delay in getting back to you all, but I've been discussing the issue with the technicians and scientists who are using the system.
First, and I apologise, but we aren't using surrogates (someone in SOP production wasn't using the terminology correctly). We take 10 mL of sample and meter in internal standard prior to the trap. We've checked the sampler, and it's taking up and using the full 10 mL.
Periodically, in longer sequences, we will see an injection where the analyte responses in a STD or QC sample are 1.5-2x the expected values. The vial (10 mL sampled from a 40 mL vial) has not been over-sampled (visual check), and the IS responses are in line with the IS responses of all other QCs and STDs in the sequence.
At present we have only checked STD and QC solutions on this injection system as we get it ready for the production environment. However, we have not seen excessive variation in the IS area counts from the MS detector (the variation in area counts is, at worst, ~13%). Consequently, we don't suspect an error in IS delivery, and based on the lack of variation we don't suspect a trap issue (although we have tested a second trap, which did not cure the problem).
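For reference, the variation figure above is just the percent RSD of the IS area counts across the sequence; a minimal sketch in Python (with made-up area counts, not our real data):

```python
from statistics import mean, stdev

# Hypothetical IS area counts from one sequence (illustrative only)
is_areas = [152300, 149800, 155100, 148900, 151700, 131000]

rsd = 100 * stdev(is_areas) / mean(is_areas)
print(f"IS area %RSD across the sequence: {rsd:.1f}%")
```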
We have re-run the same sequence using the same vials (now containing 30 mL in a 40 mL vial), and the vial which previously exhibited the analyte response increase no longer shows the same effect: peak responses are as expected and IS responses are comparable to the other vials within the sequence.
When it occurs, it appears to be random and not associated with a specific STD/QC level, vial position, or injection number in a sequence. It may or may not occur in a given sequence.
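To make the symptom concrete, here is a rough sketch of the pattern we see, with hypothetical numbers: the analyte/IS response ratio for one injection jumps 1.5-2x while the IS areas stay flat.

```python
from statistics import median

# Hypothetical repeat injections of one STD level; the third analyte
# response is the ~2x outlier, while the IS areas stay in line.
analyte_areas = [82000, 81500, 165000, 80900]
is_areas = [151000, 150200, 152400, 149600]

ratios = [a / i for a, i in zip(analyte_areas, is_areas)]
typical = median(ratios)

for n, r in enumerate(ratios, start=1):
    flag = "  <-- 1.5-2x high" if r > 1.5 * typical else ""
    print(f"injection {n}: analyte/IS ratio = {r:.3f}{flag}")
```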
Since the area counts for the internal standard are consistent and the volume of sample removed is consistent, I would begin to look beyond the autosampler as the problem. Changing the trap helps eliminate the concentrator as well. With the overall stability you seem to be getting, it is difficult to point at any one piece of hardware as causing the problem, which then leads to software problems.
This falls under the first rule of troubleshooting as I was taught it: #1, make sure it is plugged in! Of course it is, since you have results, but check the simple stuff just in case. Look closely at the sequence and the data to be certain that no multiplier has been entered that would alter the calculation of the results. I have encountered this especially with new operators: someone puts something other than 1 into the multiplier field (which can also be named "sample amount" or "dilution" or any number of other names depending on the manufacturer and software). This would easily explain how the calculated values for the QC and standard concentrations would change while all other things remain the same.
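To make that concrete, here is the arithmetic in sketch form; the formula and numbers are generic and hypothetical rather than any particular vendor's, but they show how a stray multiplier scales the reported concentration while the raw areas are untouched:

```python
# Generic internal-standard quantitation (field names and numbers are
# illustrative, not any particular data system's):
#   conc = (analyte_area / is_area) / response_factor * is_conc * multiplier
analyte_area = 82000
is_area = 151000
response_factor = 0.054  # hypothetical mean RF from the calibration
is_conc = 10.0           # concentration of added IS, ng/mL (hypothetical)

base = (analyte_area / is_area) / response_factor * is_conc

print(f"multiplier = 1: {base:.1f} ng/mL")      # what you expect to see
print(f"multiplier = 2: {base * 2:.1f} ng/mL")  # same raw areas, reported 2x high
```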
If the actual area counts for the target compounds are increasing 2x while the internal standard area counts remain the same between two injections of the same standard level, then it is a more complicated problem. In the last 20 years I have had one instrument that had a problem providing consistent voltage to the electron multiplier, which caused similar problems. I first noticed it when a single peak would seem to drop 10x in area counts compared to what it should have been. It was totally random at first. I began to examine the baseline carefully and noticed there would be steps all along the baseline where it would rise and fall; it looked a bit like a random square-wave trace. The longer the instrument ran, the more frequently it would happen. If I vented it, shut it down, and restarted it, the problem would go away for a few days and then begin again. It was a 5970, and the Agilent tech replaced pretty much every part in it, but we still couldn't fix it, so it was the first one I traded up for a 5973. Never did figure that one out.