
effect of internal diameter of column??

Posted: Sat May 17, 2008 10:59 am
by rick1112
Hi

Can anyone tell me what effect a column's internal diameter has on its sensitivity and performance?

thanks

Posted: Sat May 17, 2008 3:59 pm
by danko
Both are inversely proportional to the internal diameter, i.e. the smaller the diameter, the less diffusion (van Deemter A-term, "Eddy diffusion").
The drawbacks to the smaller diameter are higher backpressure (unless the flow rate is reduced in accordance with the diameter) and lower load capacity.

Best Regards
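The two drawbacks mentioned above scale in a simple way with the diameter, which can be sketched numerically. This is a rough Python illustration; the 4.6 mm and 2.1 mm dimensions are just example values, and real backpressure also depends on particle size, column length, and mobile-phase viscosity:

```python
# Rough scaling of the two drawbacks when column i.d. shrinks
# (illustrative only; pressure also depends on particle size,
# column length, and mobile-phase viscosity).

def relative_backpressure(d1_mm, d2_mm):
    """Backpressure ratio at CONSTANT flow rate: linear velocity,
    and hence pressure drop, scales with 1/d^2."""
    return (d1_mm / d2_mm) ** 2

def relative_load_capacity(d1_mm, d2_mm):
    """Load capacity scales with column volume, i.e. with d^2
    at the same length and packing material."""
    return (d2_mm / d1_mm) ** 2

# Going from 4.6 mm to 2.1 mm i.d. at unchanged flow rate:
print(f"{relative_backpressure(4.6, 2.1):.1f}x pressure")      # ~4.8x higher
print(f"{relative_load_capacity(4.6, 2.1):.2f}x load capacity") # ~0.21x
```

Reducing the flow rate by the same (d2/d1)^2 factor restores the original linear velocity and backpressure, which is the "unless" in the post above.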

Posted: Sat May 17, 2008 10:10 pm
by tom jupille
Danko, I'll quibble with the comment about the Van Deemter A-term (while agreeing with the rest!). The A-term is related to particle diameter, not column diameter. In principle, the column diameter should have no impact on efficiency (assuming the same linear velocity and particle size). In practice, narrower columns are trickier to pack and more prone to "wall effects" (that's the bad news), but offer better heat dissipation (that's the good news -- which is why they are used with very small particles at high linear velocities).

Posted: Sun May 18, 2008 8:19 am
by Bryan Evans
Benefits of smaller i.d. (for 3 & 5 um particles):
- increase in sensitivity (for the same injection volume)
- can be run at lower flow rates (useful for LC-MS sensitivity)
- less solvent consumption

Drawbacks:
- column performance (e.g. efficiency) is usually decreased due to extra-column volume effects.

Below are chromatograms comparing different i.d. on Cadenza CD-C18:
http://www.silvertonesciences.com/files/TI089E.pdf

Posted: Sun May 18, 2008 9:38 am
by danko
The example Bryan links to speaks for itself. Just looking at it with the naked eye, one can conclude that the first chromatogram (2 mm i.d. column) demonstrates superior performance to the 3 mm column. The third chromatogram is irrelevant here, as it represents another brand's performance.
Where does van Deemter's A-term come into the picture? Although it's called "Eddy diffusion"…

Posted: Sun May 18, 2008 3:35 pm
by Bryan Evans
This is probably better data for this discussion -

Fungicides separated on Cadenza CD-C18 (3um):
http://www.silvertonesciences.com/files/TI069E.pdf

Peak height increases with decreasing i.d., but resolution between peaks No. 2 & 3 decreases.

Posted: Sun May 18, 2008 4:13 pm
by Kostas Petritis
I'll back Tom's and Bryan's statements: if you use a system where extra-column effects are minimized, and you use well-packed columns, you shouldn't see any difference between columns of different i.d. (especially when you are not even in the capillary range).

Posted: Sun May 18, 2008 4:20 pm
by Uwe Neue
Let me go back to the original question first.

The column diameter is in principle not relevant for the performance of a column. If the flow rate is scaled to the column cross section, then the chromatogram should look absolutely the same.

This is the principle. In reality, the bandspreading of the instrument often does not let you do that. To run 2 mm columns with the same performance as 4.6 mm columns, you need an instrument with small bandspreading. If you do not have this, a 2 mm column will look worse than a 4.6 mm column, even if it is packed perfectly.

This is the story about performance. Sensitivity is slightly different. If the injected sample volume is decreased with the column cross section, then the sensitivity will remain the same (provided your detector can handle it). If I inject the same sample volume (= the same mass of analyte), the sensitivity will increase with the smaller column volume. This assumes of course that the column performance is not destroyed from extra-column bandspreading.

All this holds independent of the particle size...
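The scaling rules above can be put into a quick back-of-the-envelope sketch. The dimensions, flow rate, and injection volume below are example values, not figures from the thread:

```python
# Scaling flow rate and injection volume with column internal diameter.
# Rule of thumb from the post above: scale both with the column
# cross-section, i.e. with (d2/d1)^2, to keep the chromatogram the same.

def scale_by_cross_section(value, d1_mm, d2_mm):
    """Scale a flow rate or injection volume from a d1-mm to a d2-mm i.d. column."""
    return value * (d2_mm / d1_mm) ** 2

# Example: moving a method from a 4.6 mm to a 2.1 mm i.d. column.
flow_46 = 1.0   # mL/min on the 4.6 mm column
inj_46 = 10.0   # uL injected on the 4.6 mm column

flow_21 = scale_by_cross_section(flow_46, 4.6, 2.1)
inj_21 = scale_by_cross_section(inj_46, 4.6, 2.1)

print(f"Scaled flow rate:  {flow_21:.3f} mL/min")   # ~0.208 mL/min
print(f"Scaled injection:  {inj_21:.2f} uL")        # ~2.08 uL

# If instead the SAME mass is injected on the narrower column, the
# expected peak-height (sensitivity) gain is the inverse ratio:
gain = (4.6 / 2.1) ** 2
print(f"Sensitivity gain at same injected mass: ~{gain:.1f}x")  # ~4.8x
```

As the post notes, the sensitivity gain only materializes if extra-column bandspreading does not destroy the column's performance.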

Posted: Sun May 18, 2008 4:36 pm
by danko
Hi Bryan,

This example shows clear sample overload (chrom 1), marginal overload (chrom 2) and maybe optimum load (chrom 3). Especially looking at the first chromatogram one doesn’t need any kind of calculations to see the huge fronting.
As I see it, the data is meant to show different options for this application and certainly not a performance comparison. If it was a performance benchmarking the sample load should have been adjusted - just like the flow rate - in order to achieve equal optimum conditions for each column.
It is not obvious what the sample solvent is, but there might be a strong-solvent effect as well as sample overload (e.g. if the solvent is pure ACN). Strong-solvent effects have a greater impact on small-diameter columns.
Finally, if the conclusion is that band spreading is less on larger diameter columns, how come the first example, you linked to, showed the opposite? And what makes the latest example more reliable?

Best Regards

Posted: Sun May 18, 2008 5:42 pm
by Bryan Evans
Hi Danko -

The second chromatogram shows 2 things:
- increase in peak height (for smaller column i.d.) using the same injection volume
- some peak broadening of early-eluting peaks due to (system-related) extra-column effects

I think the analyte in the first chromatogram is well retained.
So - there wasn't any distortion when the i.d. was reduced from 4.6mm to 2.0mm

Posted: Mon May 19, 2008 6:38 am
by danko
Hi Bryan,
You wrote: "I think the analyte in the first chromatogram is well retained. So - there wasn't any distortion when the i.d. was reduced from 4.6mm to 2.0mm"
I’m not quite sure I understand your reasoning, but the fronting on the first chromatogram is plain to see and I’d certainly call it distortion.
Whether the asymmetry is due to overload or extra-column effects is not that important. The important thing is that the conditions were not optimized in order for the 2 mm column to perform optimally. So a potential performance comparison in this case is not valid.

And now to all the guys that disagree with me – so far.

Let’s do a simple mental exercise:

Assume that the system and everything else performs according to the specifications for a 2 mm column. We load, say, 5 μg of analyte (flow rate 0.2 mL/min), resulting in a peak height of ca. 1000 μV.
Then we switch to a 3 mm column (flow rate 0.4 mL/min, so the same retention time is achieved) and again load 5 μg of analyte. Wouldn't we agree that the resulting peak height would be ca. 500 μV? I believe we'd all agree on that.
But what happened? Where did half of the analyte go? Did it disappear somewhere in the system? Of course not. The peak has simply become correspondingly wider!
But then we calculate the plate counts and observe that the number is roughly the same for both the 2 mm and the 3 mm columns!? And there is the devil. While the flow rate is adjusted so that the linear velocity in both columns is the same, what happens after the column is far from the same. On the contrary, the mobile phase's velocity after the column (in the capillary tubing and, most importantly, in the flow cell) is doubled in the case of the 3 mm configuration (remember, 0.4 mL/min), which in turn reduces the peak's time-width to half of that observed with the 2 mm column. And that is why many chromatographers are not aware of the performance differences caused by the column internal diameter.
So if we summarize all this, the message is: smaller internal diameter leads to less band spreading ("Eddy diffusion")…
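The thought experiment above can be written out numerically. This sketch simply reproduces the arithmetic of the post, using its 5 μg / 0.2 mL/min / 1000 μV figures plus an assumed 0.10 min peak width; whether the interpretation is right is exactly what the thread is debating:

```python
# Danko's thought experiment in numbers (figures from the post above).
# The same mass (5 ug) is injected on a 2 mm column at 0.2 mL/min and
# on a 3 mm column at 0.4 mL/min (same retention time in both cases).

mass_ug = 5.0

# 2 mm column: observed peak height ~1000 uV
flow_2mm = 0.2        # mL/min
height_2mm = 1000.0   # uV

# 3 mm column: the same mass is diluted into twice the mobile-phase
# volume (flow doubled, elution time unchanged), so the peak height
# halves while the peak VOLUME doubles.
flow_3mm = 0.4        # mL/min
height_3mm = height_2mm * flow_2mm / flow_3mm   # ~500 uV

# The time-domain peak width is width_volume / flow, so doubling the
# flow halves the time-width again, which is why the plate count
# (computed from time-widths) looks unchanged.
width_time_2mm = 0.10                         # min, assumed example value
width_volume_2mm = width_time_2mm * flow_2mm  # mL
width_volume_3mm = 2 * width_volume_2mm       # twice as wide in volume
width_time_3mm = width_volume_3mm / flow_3mm  # ~0.10 min again

print(round(height_3mm))          # 500
print(round(width_time_3mm, 2))   # 0.1
```

In other words, the volume-domain broadening is masked in the time domain because the faster flow sweeps the wider peak through the flow cell proportionally faster.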

Posted: Mon May 19, 2008 8:12 am
by HW Mueller
So Danko, are you rewriting the equations for plate count, etc.?
Incidentally, Bryan's first example uses different linear velocities in the different-diameter columns; also, I didn't see anything on particle size.

Posted: Mon May 19, 2008 9:17 am
by danko
Hi Hans,

I’m not rewriting the equation! I’m just explaining why the peak broadening caused by diameter enlargement is not obvious/determinable (using the plate-count equation) when different flow rates are used. While the flow rate needs to be adjusted in order to achieve the same mobile-phase velocity in the column, one tends to forget that the detector's flow-cell dimensions are constant; thus the time an analyte spends in it varies with the flow rate, which in turn influences the peak width.
So if benchmarking is performed at different flow rates, one needs to compensate the peak width for the flow rate. And as we all know, the peak width is inversely proportional to the flow rate.

If you read my previous post more thoroughly you’ll see it. Maybe you should forget Bryan’s examples for a moment and focus on the mental experiment I described.

Actually, a couple of years ago I posted an SST I’d thought of (I called it “peak sharpness”), which would have detected this anomaly for the simple reason that peak height is included in the calculation – which is not the case with the N equation. But this is a different story.

Best Regards

Posted: Mon May 19, 2008 11:31 am
by Bryan Evans
Hi Danko -

Thank you for your discussion. I think we can all agree that there are benefits to using a smaller column i.d. We have lots of LC-MS users using our 2 mm i.d. - their detectors work best at lower flow rates - and our 3 um particle size (2 mm i.d.) works very well at lower flow rates. (Some of the newer MS detectors can handle higher flow rates - so some users have switched to our 4.6 mm i.d.)

The point I wanted to make is that decreasing the i.d. will do nothing to increase separation. For example: if an HPLC user wants to go from 150x4.6mm down to 150x3.0mm to increase resolution, we advise against it. It will do nothing to increase alpha (in other words, just because 2 bands are more concentrated doesn't mean their migration rates will change).

For conventional HPLC systems, the 4.6 mm i.d. will give you the best opportunity for separation.

Posted: Mon May 19, 2008 11:45 am
by Bryan Evans
Hi Danko -

Sorry - 4.6mm because extra-column volume effects are minimized (for conventional HPLC).