I've got a curious case here that I'm hoping someone can help me diagnose. I'm working with a client to develop a purity method for a small-molecule API. The method they've developed is quite conventional: a C18 column running an ACN/water gradient with phosphate buffer.
MP A: 20 mM phosphate buffer, pH 3.0, in 5% ACN
MP B: 20 mM phosphate buffer, pH 3.0, in 70% ACN
Column: Waters Acquity BEH Premier C18, 2.1 x 150 mm, 1.7 µm
Flow rate: 0.25 mL/min
Injection volume: 1 µL
Detection: VWD @ 230 nm
Gradient: shown as the overlaid red trace in the chromatogram pasted below
The client is exclusively using HPLC-grade reagents, and I've instructed them to check the pH of the mobile phase on a separate aliquot to avoid contaminating the bulk. We can't figure out why the baseline drifts downward as %MP B increases. Theoretically, MP B should have a higher background absorbance at 230 nm than MP A, right?
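
To show why we expected the opposite, here's a minimal Python sketch of the baseline you'd predict from ideal linear mixing of the two premixed mobile phases. The absorbance values are assumed placeholders (in the ballpark of typical HPLC-grade ACN specs at 230 nm), not measurements from this system:

# Illustrative only: ideal linear-mixing estimate of the gradient baseline.
# Both absorbances are ASSUMED placeholder values (AU, ~1 cm path), not data.
A_MP_A = 0.004  # assumed: 5% ACN in 20 mM phosphate, pH 3.0, at 230 nm
A_MP_B = 0.020  # assumed: 70% ACN in 20 mM phosphate, pH 3.0, at 230 nm

def baseline_au(frac_b):
    """Expected baseline (AU) at a given fraction of MP B, ideal mixing."""
    return frac_b * A_MP_B + (1.0 - frac_b) * A_MP_A

for pct_b in (0, 25, 50, 75, 100):
    print(f"{pct_b:3d}% B -> {baseline_au(pct_b / 100):.4f} AU")

Under those assumptions the baseline should creep upward as %B climbs, which is exactly why the downward drift has us stumped.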
Note that they have been running this method for months, and it has always looked like this (i.e., this is not a new issue). Any ideas what might be causing the downward drift as %MP B increases?
