I was wondering if I could get some help understanding internal standards. I am trying to develop a method to quantify psilocin in whole blood, using d10-psilocin as my internal standard (ISTD). In my LC-MS/MS method, both psilocin and d10-psilocin elute at around 2.2 min.

While estimating my linear range, I noticed that as the analyte concentration increased, the ISTD signal also increased, especially at the higher calibrators. I saw this even when I prepared the entire calibration curve in MeOH rather than in whole blood matrix, so it does not appear to be a matrix effect. It would have made sense if the ISTD response had shown slight decreases due to ion suppression at higher analyte concentrations, but I cannot explain this positive correlation.

Theoretically, if I am spiking the same amount of ISTD into every calibrator, I should expect roughly the same ISTD response across the whole calibration curve, right? Can anyone help me understand this?
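To put a number on the trend I'm describing, here is a minimal sketch of the kind of check I ran. The peak areas and concentrations below are made-up illustrative values, not my actual data; it just shows how a flat ISTD response differs from a trending one (correlation near zero and a small %CV vs. a strong positive correlation):

```python
# Hypothetical check: does the ISTD response trend with analyte concentration?
# All numbers below are illustrative, not real calibration data.

conc = [1, 5, 10, 50, 100, 250, 500]           # calibrator conc, ng/mL (hypothetical)
istd_area = [10200, 10350, 10400, 11100,
             11900, 13500, 16200]               # d10-psilocin peak areas (hypothetical)

n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(istd_area) / n

# Pearson correlation between analyte concentration and ISTD area
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, istd_area))
var_x = sum((x - mean_x) ** 2 for x in conc)
var_y = sum((y - mean_y) ** 2 for y in istd_area)
r = cov / (var_x * var_y) ** 0.5

# %CV of the ISTD area across the calibration curve
sd = (var_y / (n - 1)) ** 0.5
cv = sd / mean_y * 100

print(f"ISTD area vs conc: r = {r:.3f}")
print(f"ISTD area %CV across calibrators = {cv:.1f}%")
# A constant ISTD spike should give r near 0 and a small %CV;
# a strong positive r is the unexpected behavior I'm seeing.
```

With these example numbers the correlation comes out strongly positive, which is the pattern I observe in my real runs.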