
Valley to Valley?

Posted: Sat Aug 13, 2016 10:17 pm
by EmpowersBane
When is the most appropriate time to use valley-to-valley when using ApexTrack to integrate a gradient run with lots of impurities? My Data Reviewer insists that it should always be used on a set of peaks integrated on an upward-sloping baseline, but I disagree, because ApexTrack compares the internal slopes of a peak to extrapolate a suitable, accurate baseline even if the peak "lands" on a drifting baseline. When v/v is used on these peaks, the peak areas are massively increased by the extra area picked up from dragging the peak limits down to the earlier zero-slope (straight-line) baseline, and I don't believe that peak area and height are calculated accurately in this scenario.

I DO use v/v for a cluster of peaks that are badly under-integrated, since it's obvious once applied that the peak areas and heights are appropriate, but that's not always the case. My Data Reviewer insists we should integrate to account for the "worst-case scenario". She also says that, when overlaid, the blank injection doesn't rise in the same place, so the peaks should be brought back down to the original baseline, but I disagree.

Any thoughts?

Re: Valley to Valley?

Posted: Wed Aug 31, 2016 6:04 pm
by bundy5555
Valley-to-valley itself doesn't add more area to your peak. You have to fine-tune your other integration parameters to stop the software from using time 0 as your baseline.

Whether or not to use v/v really depends on the scenario. Follow your company's SOP on integration if you have one. Otherwise, if you have a group of peaks that partially overlap each other, use a drop line instead of v/v; if two peaks are well separated, use v/v.
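To make the drop-line vs. valley-to-valley distinction concrete, here is a toy numeric sketch (my own illustration, not Empower's actual math): two partially resolved Gaussian peaks on a rising baseline, integrated both ways. The valley-to-valley baselines are lifted up to touch the valley point, so both peak areas come out smaller than with a single drop-line baseline under the whole cluster.

```python
# Toy illustration (NOT Empower's actual algorithm): how drop-line vs.
# valley-to-valley baselines partition the same two-peak cluster
# riding on a rising gradient baseline.
import numpy as np

t = np.linspace(0.0, 10.0, 2001)                       # time axis, min
y = (0.5 * t                                           # drifting baseline
     + 100.0 * np.exp(-(t - 4.0) ** 2 / 0.05)          # peak 1
     + 80.0 * np.exp(-(t - 5.0) ** 2 / 0.05))          # peak 2

i0 = np.searchsorted(t, 3.5)                           # cluster start
i1 = np.searchsorted(t, 5.5)                           # cluster end
ia, ib = np.searchsorted(t, [4.0, 5.0])                # apex positions
iv = ia + int(np.argmin(y[ia:ib]))                     # valley between apices

def trapz(yv, xv):
    """Trapezoidal integration (avoids NumPy version differences)."""
    return float(np.sum((yv[1:] + yv[:-1]) * np.diff(xv)) / 2.0)

def area(lo, hi, blo, bhi):
    """Area between the signal and a straight baseline drawn from blo to bhi."""
    seg = slice(lo, hi + 1)
    base = y[blo] + (y[bhi] - y[blo]) * (t[seg] - t[blo]) / (t[bhi] - t[blo])
    return trapz(y[seg] - base, t[seg])

# Drop line: one baseline under the whole cluster, vertical drop at the valley.
drop1, drop2 = area(i0, iv, i0, i1), area(iv, i1, i0, i1)

# Valley-to-valley: each peak's baseline is lifted to touch the valley point,
# so the area under the valley is excluded from both peaks.
vv1, vv2 = area(i0, iv, i0, iv), area(iv, i1, iv, i1)

print(f"drop line:        {drop1:.1f} / {drop2:.1f}")
print(f"valley-to-valley: {vv1:.1f} / {vv2:.1f}")
```

Note this toy case shows the opposite of the original poster's complaint: here v/v shrinks the areas, because the valley sits above the cluster-wide baseline. The inflation the poster describes happens when the integration events make the v/v baseline fall to an earlier, lower baseline level instead.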

Re: Valley to Valley?

Posted: Thu Sep 01, 2016 5:14 pm
by DR
1) A picture is worth 1000 words.

2) Your reviewer is failing to allow for changes in baseline due to excipients etc. in your samples. Attributing all of the area from a baseline rise to the sample is not a good idea, and reviewers should refrain from using words like "always". That said, there are times when a clump of poorly integrated peaks on a rising baseline is better integrated using a Gaussian skim, drop line, or some other means to include the extra area between the start and end of the clump.

3) You should be able to integrate consistently, reasonably and without much (if any) manual input. Do that, and regulators won't bother you much over integration details. While it is important not to under-report degradants and impurities, the worst case is when you exaggerate them to the point that you're failing batches that should pass.

PS - ApexTrack looks for a change in sign of the second derivative of the signal to establish the presence of a peak.
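For intuition only (Waters' actual ApexTrack implementation is proprietary), a curvature-based detector can be sketched in a few lines: it flags the region around an apex where the second derivative of the signal is strongly negative, a test that a linear baseline drift cannot trigger because a straight line has zero curvature.

```python
# Illustrative sketch only; the real ApexTrack algorithm is proprietary.
import numpy as np

t = np.linspace(0.0, 10.0, 1001)                      # time axis, min
y = 0.3 * t + 50.0 * np.exp(-(t - 5.0) ** 2 / 0.1)    # one peak on linear drift

d2 = np.gradient(np.gradient(y, t), t)                # numerical 2nd derivative
# A linear drift has zero curvature, so only the peak region survives the test.
mask = d2 < -0.05 * np.max(np.abs(d2))                # strongly negative curvature

apex = t[np.argmin(d2)]                               # most negative curvature
print(f"apex ~ t = {apex:.2f} min, curvature-flagged region "
      f"{t[mask].min():.2f}-{t[mask].max():.2f} min")
```

This is why a drifting baseline doesn't fool the detector: the drift contributes slope but no curvature, so the apex and peak limits come out the same whether or not the peak "lands" on a rise.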

Re: Valley to Valley?

Posted: Wed Oct 12, 2016 1:23 am
by Mr. G
I do not understand the problem. v/v, drop line, forced peak, Gaussian skim, exponential skim, traditional integration, or ApexTrack: really, who cares? As long as consistent integration is applied and the sample fails or passes consistently, what difference does it make? In the end one cannot legally force a result. What are integration SOPs based on? "Because I said so" has no credibility. The USP allows for generous GC chromatographic variation, and most do not understand GC applications. Integration algorithms are proprietary and not divulged for legal reasons. In the end, as long as one is consistent and reporting legally, integration parameters may be applied as the analyst wishes; otherwise, the governing authority should employ someone with the credibility to impose limits on the math. As the population expands, so does bureaucracy. SCIENCE!