
GPC Calibration

Posted: Mon Dec 17, 2007 7:25 pm
by Jaries
What are some ways to do internal calibration on Gel Permeation Chromatography? We run narrow molecular weight standards every once in a while to generate a calibration curve, but is there a way to do instrumental calibration, just to make sure the instrument itself is running properly? Would reconditioning the column be part of it? Any help will be greatly appreciated. Thanks

Posted: Mon Dec 17, 2007 7:42 pm
by Noser222
Other than making sure your column is clean, which will be covered by your column manual, running standards is about all there is.

As for the instrument, do the proper preventive maintenance and performance checks. Check the pump flow rate periodically. Check that the injector is precise. Check that the detectors are performing properly. Also, make sure that the tubing volume on your instrument is appropriate; this usually isn't a concern with GPC columns that are 300 x 7.8 mm with 7-10 micron particles.
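As one rough illustration of the flow check - essentially a timed collection of eluent weighed against the pump set point. Every number below is made up; use the tolerance from your own SOP:

# Hypothetical pump flow-rate check: collect eluent for a fixed time,
# weigh it, and compare against the nominal flow rate.
nominal_flow = 1.0       # mL/min, pump set point (assumed)
collected_mass = 9.93    # g of eluent collected (made-up reading)
density = 0.997          # g/mL, water near 25 C (assumed mobile phase)
collection_time = 10.0   # min

measured_flow = collected_mass / density / collection_time  # mL/min
deviation_pct = 100.0 * (measured_flow - nominal_flow) / nominal_flow
print(f"Measured flow: {measured_flow:.3f} mL/min ({deviation_pct:+.1f}% of nominal)")
if abs(deviation_pct) > 1.0:  # example tolerance only
    print("Flow outside tolerance - service the pump before trusting GPC results.")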

Posted: Mon Dec 17, 2007 10:22 pm
by Jaries
Thank You Noser222!

Would performing a plate count to check column efficiency fall under this? Is that something typically done on a relatively regular basis to ensure column efficiency, or would just cleaning the column suffice? I am currently using Waters Ultrahydrogel columns.

Posted: Tue Dec 18, 2007 12:01 am
by Noser222
Yes, tracking the plate count of a standard, along with the system pressure, is a good idea.
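For reference, the usual half-height formula is N = 5.54 * (tR / w1/2)^2. A minimal sketch; the retention time and peak width below are illustrative values you would take from your data system:

# Half-height plate count: N = 5.54 * (t_r / w_half)**2
t_r = 7.85      # min, retention time of the plate-count marker (example)
w_half = 0.12   # min, peak width at half height (example)

N = 5.54 * (t_r / w_half) ** 2
print(f"Plate count: {N:.0f}")
# Trend N and the column back-pressure over time; a steady decline in N
# or a rising pressure usually means the column is fouling or degrading.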

Posted: Tue Dec 18, 2007 2:02 am
by Uwe Neue
Since in GPC all data are based on accurate retention times, which means accurate flow, you can check the system by regularly injecting the plate-count marker - not so much for the plate count, but for the retention time.
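A minimal sketch of that idea: because the marker's elution volume is fixed in GPC, a shift in its retention time translates directly into an apparent flow error. All numbers below are examples, and the acceptance window is something you would set yourself:

# Marker retention-time check; a shift maps onto an apparent flow error.
t_expected = 9.50   # min, marker retention time when the system was known-good
t_observed = 9.62   # min, marker retention time today (example)
flow_set = 1.0      # mL/min, pump set point

apparent_flow = flow_set * t_expected / t_observed
print(f"Apparent flow: {apparent_flow:.3f} mL/min")
if abs(t_observed - t_expected) > 0.05:   # example window only
    print("Marker retention time has drifted - check the pump and for leaks.")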

Posted: Wed Dec 19, 2007 3:02 pm
by Jaries
Thanks guys... really appreciate it

Posted: Thu Dec 20, 2007 5:41 pm
by mbicking
One final comment: as noted, retention time is the primary separation variable in GPC. You could simply re-analyze your standard as a sample, and verify that the retention times of the components have not changed. You could put fairly tight specs on the retention time windows, and that would give you more confidence in your results.

But you should also realize that GPC calibrations are log-based, and a 0.1 min change in retention time can mean a MW change of about 20,000 units in some cases. If that difference is important, then you need to be more careful with your system.
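To make that concrete: with a linear calibration log10(M) = a - b*tR and an assumed, typical slope b of 0.45 decades/min, a 0.1 min shift moves log10(M) by 0.045, about an 11% error - roughly 20,000 units on a 200,000 MW polymer. A short sketch with those assumed numbers:

# Sensitivity of a log-linear GPC calibration, log10(M) = a - b * t_r,
# to a retention-time shift. The slope b is an assumed, typical value.
b = 0.45        # decades of log10(MW) per minute (assumed slope)
delta_t = 0.1   # min, retention-time shift
M = 200_000     # apparent MW at the original retention time (example)

M_shifted = M * 10 ** (b * delta_t)   # earlier elution -> higher apparent MW
print(f"Apparent MW error: {M_shifted - M:,.0f}")  # about 21,800 here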

Finally, the best way to minimize this problem is to add an internal standard to your sample. For example, dissolve your sample in solvent that contains a small retention-time marker (a small molecule that elutes at the end of the chromatogram, after your peak(s) of interest). Most data systems have an internal standard feature that will calibrate and calculate the "ratio" of retention times (peak/internal standard), which corrects for minor flow rate changes.

The only concern is that the internal standard be sufficiently resolved from any other baseline changes (common in THF systems, for example) that occur in that time region. But you only need a defined peak maximum, not a fully resolved peak.
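As a minimal sketch of that ratio correction (all retention times below are made-up examples; a data system's internal-standard feature does the equivalent automatically):

# Retention-time ratio correction with an internal standard (marker).
t_is_cal = 12.40   # min, marker retention time during calibration (example)
t_peak = 7.85      # min, sample peak retention time today (example)
t_is_now = 12.52   # min, marker retention time today (slight flow drift)

# Rescale today's retention time onto the calibration time axis:
t_corrected = t_peak * (t_is_cal / t_is_now)
print(f"Corrected retention time: {t_corrected:.3f} min")
# Feed t_corrected into the log(M)-vs-tR calibration curve; the ratio
# cancels out minor flow-rate changes between runs.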

Posted: Fri Jan 11, 2008 5:39 pm
by pss-usa
Do you work in a regulated testing environment, like a pharmaceutical company? If so, our firm uses an approach that monitors instrument performance and documents it over time, in order to assure regulators that all is working well with your system. See link:

http://pssgpcshop.com/easyvalid.htm