- Posts: 3
- Joined: Wed Sep 08, 2010 9:18 pm
I have been working on a method that has just given very odd results. I have plasma from rats that were dosed with 50 mg of mesalamine (5-aminosalicylic acid), which I have extracted with acetonitrile.
My system seems to check out just fine: I run a gradient of 0.4% phosphate buffer (pH 2.7) and methanol on a Waters HSS C18 column, monitoring at 254 and 320 nm. In my standards, both the analyte and its degradant show roughly twice the absorbance at the upper wavelength.
The problem is, when I inject the extracted rat plasma samples, the absorbance at the upper wavelength drops well below my 1 µg/mL standard, too low to quantitate. The peaks at the lower wavelength seem unaffected, and end up larger than the upper-wavelength values.
My retention times are very consistent between the two channels, and I am getting really good separation, so I don't think there is co-elution. I have run a few system suitability and standard reinjections, and those don't seem to be affected. So I am at a loss as to what would cause this. I think something in the plasma is killing my absorbance, but I am not sure what that would be or how to get rid of it.
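One way to put numbers on what I'm seeing is to track the 320/254 peak-area ratio for each injection: the standards sit near 2, so a sample whose ratio collapses while retention time and 254 nm response hold steady points at interference in the 320 nm channel rather than loss of analyte. Here is a minimal sketch of that check; the peak areas are hypothetical placeholders, not real data:

```python
# Sanity check on the wavelength ratio: for a clean peak the
# A320/A254 area ratio should match the standards (~2 here).

def wavelength_ratio(area_320: float, area_254: float) -> float:
    """Return the 320 nm / 254 nm peak-area ratio for one peak."""
    return area_320 / area_254

# Hypothetical standard injection: ratio ~2, as expected.
std_ratio = wavelength_ratio(area_320=2050.0, area_254=1000.0)

# Hypothetical plasma extract: 320 nm response suppressed,
# 254 nm response roughly unchanged.
sample_ratio = wavelength_ratio(area_320=400.0, area_254=1100.0)

# A large drop in the ratio relative to the standard flags
# matrix interference at 320 nm rather than a quant problem.
print(f"standard ratio: {std_ratio:.2f}")
print(f"sample ratio: {sample_ratio:.2f}")
print(f"suppressed: {sample_ratio < 0.5 * std_ratio}")
```

If the ratio is stable across standards but collapses only in extracted plasma, that supports a matrix effect at 320 nm.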
Any ideas?
