- Posts: 1
- Joined: Thu Jun 10, 2021 7:43 pm
I have an issue with a method that I am currently validating.
The method is the following:
Pre-mixed 60:40 ACN:0.1% TFA in Water, isocratic.
Flow 0.6 mL/min
Column: Cortecs UPLC Shield RP18, 90 Å, 1.6 μm, 2.1 × 100 mm
Detection: dual channel 220/255 nm; injection volume 0.7 µL; column temperature 35 °C
I have seen this on both a Waters Acquity and a Thermo Vanquish UHPLC-PDA.
Issue:
I started developing this method by mixing A and B on the instrument, but the common ripple/wavy baseline associated with TFA began to appear. To avoid this I decided to pre-mix the mobile phase, which solved that issue.
However, since switching to the pre-mixed mobile phase, I see a retention time shift in my system suitability standards (all peaks affected in the same way and in the same direction).
Example retention times (min) for one peak across consecutive injections:
1.6582
1.6523
1.6480
1.6430
1.6393
1.6351
1.6326
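For scale, the drift in the retention times listed above can be quantified as a net shift and a %RSD. A quick sketch, assuming the values are in minutes from consecutive injections (the variable names are mine, not from the method):

```python
# Quantify the retention-time drift from the injections listed above.
import statistics

rts = [1.6582, 1.6523, 1.6480, 1.6430, 1.6393, 1.6351, 1.6326]

total_drift = rts[-1] - rts[0]                   # net shift across the set (min)
mean_rt = statistics.mean(rts)                   # mean retention time (min)
rsd_pct = 100 * statistics.stdev(rts) / mean_rt  # sample %RSD

print(f"total drift: {total_drift:.4f} min")
print(f"mean RT:     {mean_rt:.4f} min")
print(f"%RSD:        {rsd_pct:.2f}%")
```

This works out to roughly a 0.026 min (about 1.5%) monotonic shift over seven injections, so the drift is systematic rather than random scatter.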
This shift is enough to prevent me from properly identifying components in my botanical extracts (complex extracts in MeOH). My system suitability statistics show no loss of resolution between peaks.
After each sample set I flush the column with ACN, as per the column care recommendations. When resuming analyses, I give the system ample time to equilibrate (close to 60 min).
I am aware that some ACN evaporation from the bottle is possible, but I see retention time shifts between consecutive runs (i.e., if I re-inject immediately after the first injection, I already see a shift). Also, the shift is to the left, i.e., toward earlier retention times; I would expect ACN evaporation to weaken the mobile phase and therefore increase retention, not decrease it.
I was wondering whether this could be related to some stationary phase degradation, although the columns are relatively new. The method was based on an application note that used the same mobile phase, so I would be surprised if degradation were the cause - and, again, I don't see a dramatic loss of resolution between the standards.
I am a bit lost on this one, any help would be greatly appreciated.
Thank you!