Oh, if you're getting decreasing response through a sequence in electrospray LC-MS, that's also quite normal (sadly). Unfortunately, even if you're not gunging up your column at all, you may be gradually depositing a bit of dirt in the spray chamber, and ionisation efficiency will decline. With some types of sample it doesn't decline (or not too badly), and depending on the sort of environment in which you work (level of QC) you may be able to use external calibration. In other situations the drop in efficiency is too serious, and you could consider internal standard calibration instead.

If you can include a good internal standard (heavy-isotope labelled analogues are best, as they will very nearly coelute and have nearly identical chemistry to your analyte, meaning they are measured at the same efficiency), your problem will go away. Your calibration curve is then amount, or amount ratio, plotted against area ratio (analyte area : internal standard area), so if both signals decline together, the ratio stays the same. Incidentally, internal standards also correct for losses during sample prep, which is a nice bonus.
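To make the ratio idea concrete, here's a minimal sketch in Python (all numbers invented for illustration, not from any real run): every calibration standard carries the same fixed amount of internal standard, you fit amount ratio against area ratio, and an unknown whose signals have both drifted down still comes out right because the ratio is preserved.

```python
import numpy as np

# Calibration standards: known analyte amounts (ng) with a fixed spike of
# internal standard (IS) in each, plus the measured peak areas (made-up data).
analyte_amount = np.array([1.0, 2.5, 5.0, 10.0, 25.0])    # ng on column
is_amount = 5.0                                            # ng of IS in every sample
analyte_area = np.array([1.1e4, 2.6e4, 5.3e4, 1.04e5, 2.6e5])
is_area      = np.array([5.2e4, 5.1e4, 5.3e4, 5.0e4, 5.2e4])

# Calibration: amount ratio plotted against area ratio, fitted as a straight line.
amount_ratio = analyte_amount / is_amount
area_ratio = analyte_area / is_area
slope, intercept = np.polyfit(area_ratio, amount_ratio, 1)

# Unknown sample: both signals have drifted down, but the ratio holds,
# so the back-calculated amount is unaffected.
unknown_analyte_area, unknown_is_area = 3.9e4, 3.8e4       # suppressed signals
unknown_ratio = unknown_analyte_area / unknown_is_area
estimated_amount = (slope * unknown_ratio + intercept) * is_amount
print(f"Estimated analyte amount: {estimated_amount:.2f} ng")
```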

If you can't include an internal standard, there are still things you can do: for example, I have resorted to paired runs, running the sample twice, with and without a spike of a known amount of analyte. This tells me the instrument's efficiency for that sample. I still use external calibration in this case, but correct the final amount by scaling it up according to the measured efficiency of the spike. It's a bit of a fudge, but in a research environment it's been a helpful approach from time to time. It's a sort of trimmed-down "method of standard addition".
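For what it's worth, the correction arithmetic for that paired-run trick looks something like the sketch below (again with invented numbers; the response factor and spike size are assumptions, not values from any real method):

```python
# External calibration, assumed already established: area counts per ng
# at full ionisation efficiency (hypothetical value).
external_response_factor = 1.0e4        # area counts per ng

# Paired runs of the same sample (made-up areas).
area_unspiked = 2.0e4                   # sample alone
area_spiked = 5.0e4                     # sample + known spike
spike_amount = 5.0                      # ng of analyte added

# Efficiency for this sample: response actually produced by the spike,
# compared with what the external calibration says that spike should give.
observed_spike_response = (area_spiked - area_unspiked) / spike_amount
efficiency = observed_spike_response / external_response_factor   # e.g. 0.6 = 60 %

# External-calibration result, then multiplied up by 1/efficiency.
uncorrected_amount = area_unspiked / external_response_factor
corrected_amount = uncorrected_amount / efficiency
print(f"Efficiency: {efficiency:.0%}, corrected amount: {corrected_amount:.2f} ng")
```

The implicit assumption is that whatever suppresses the native analyte suppresses the spike equally, which is why it's only a trimmed-down version of standard addition rather than the full multi-level method.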