
Posted: Fri Nov 05, 2004 1:51 am
by Kostas Petritis
HWM,

I think that Bert covered it... high-throughput techniques are not the reason why combinatorial chemistry seems to be failing as an approach.
Furthermore, very high-throughput analytical techniques are not needed only for the reasons you mentioned. Industrial pharmacokinetics studies, and everything else related to the introduction of a new drug to the market, need very high-throughput quantitative methods.

Furthermore, there are significant efforts toward the discovery of biomarkers, at the protein and small-molecule level, for the early detection of different diseases. Once these biomarkers are discovered, ultra-fast methods will need to be developed to screen the thousands of patients who come in just for a check-up...

An area with several established biomarkers is neonatal screening for inborn errors of metabolism. I copy-paste below two sentences from the conclusions of Health Technology Assessment 2004, Vol. 8, No. 12, titled "Clinical effectiveness and cost-effectiveness of neonatal screening for inborn errors of metabolism using tandem mass spectrometry: a systematic review": "New technological approaches for automated processing, coupled with computer-assisted software, would allow the analysis of hundreds of samples on a daily basis and minimise labour costs. Tandem MS has the potential for simultaneous multi-disease screening using a single analytical technique."

US national security is also seeking ultra-high-throughput analytical techniques, as they want to screen large numbers of different compounds.

Finally, in the food industry, in cases like mad cow disease it was not possible to screen all the animals going to market, for cost, time, and labour reasons, which always leaves the possibility that sick animals are introduced into the market (only statistical controls are made). Ultra-high-throughput methods bring the cost down in cases like this, allowing better and more global screening...

I am not saying that UPLC is here to resolve all of the above; I am just arguing for the need for high-throughput techniques...

Posted: Fri Nov 05, 2004 9:11 am
by HW Mueller
Kostas,
Having some contacts among clinical chemists, I know of the huge number of determinations being done. Much of it is unnecessary; maybe more is necessary but not done. Certainly a large number is just bad analytics. The markers you mention are perhaps a very good example. Many of the past ones, if not most, have been proven faulty, yet millions of $ have been put into developing mass analyses for these. I am not saying to stop high throughput, or UPLC, or even combinatorial chemistry (I am sure, though, that the latter will occupy a small back seat in the future); I just believe that more resources should be put elsewhere (presently the pharmaceutical industry seems to be extending feelers, somewhat timidly, to small firms established on the basis of the techniques I mentioned above).

Right now I would just love to see more done in the direction of much simpler automatic multistep (strangely called multidimensional) low-pressure LC or HPLC, with a completely new affinity LC included.
Incidentally, I just noticed that I can't recall having seen multistep UPLC. Is it done... with multi-switching?

Posted: Fri Nov 05, 2004 4:17 pm
by HW Mueller
Darn, the "developing mass analyses" above was meant to mean "developing high numbers of analyses".

Posted: Fri Nov 05, 2004 10:04 pm
by Kostas Petritis
HWM,

About your question concerning multistep/multidimensional UPLC: it is possible. We have systems with up to 8 valves that we use for other purposes, but they could be used for multidimensional UPLC (all the valves operate at very high pressure, 10,000 psi... I reserve the term "ultra" for higher pressures)...

re: UPLC pricing

Posted: Mon Nov 08, 2004 6:18 pm
by DR
In the interest of fairness, it should be mentioned that a UPLC system requires Empower software. For those not already running Empower, this makes a considerable difference in overall system pricing.

I have found this to be a very good thread and I hope its participants can continue to be civil while debating their issues.

When I first read that the UPLC systems require Empower, I wondered whether calling them UPLCs instead of [insert marketing term here]-HPLCs allows Waters to drop CDS compatibility deals with the other CDS vendors and force UPLC users to move to Empower.

While I look forward to using a UPLC and I understand the motivation for limiting CDS options with UPLC, I have some reservations about this approach. If I were Agilent, I'd be annoyed.

Re: re: UPLC pricing

Posted: Mon Nov 08, 2004 7:11 pm
by tom jupille
If I were Agilent, I'd be annoyed.
Not really; they probably do the same thing when a new system comes out. :wink:

Re. UPLC software compatibility

Posted: Tue Nov 09, 2004 12:56 am
by Hugh
It should be pointed out that the UPLC instruments are fully supported on MassLynx software as well as on the Empower platform.

Re: Re. UPLC software compatibility

Posted: Tue Nov 09, 2004 4:50 pm
by DR
It should be pointed out that the UPLC instruments are fully supported on MassLynx software as well as on the Empower platform.
Right. If you want a UPLC, a Waters CDS is a prerequisite. That's what I'm less than enthusiastic about, despite my positive impressions of their software.

Posted: Wed Nov 10, 2004 9:04 am
by Alex Buske
Quite an interesting thread. Just a few thoughts on the topic.
HPLC instrumentation has matured over the years, and there is reliable instrumentation on the market:
there are PDAs that are as sensitive as single-wavelength detectors,
there are fluorescence detectors that monitor different wavelengths simultaneously,
there are autosamplers that do sample preparation like dilution and derivatisation and have a cooling option, excellent injection linearity, and no carryover,
there are column thermostats that provide cooling and/or higher temperatures,
there are pumps that show excellent accuracy, that are easy to maintain, that have just a few µl of delay volume; there are even pumps with two low-pressure gradient pumps that can recycle one column after a gradient while the second column runs the analysis, and
there are sophisticated data systems that are easy to use and/or can control every single parameter, and/or are secure, and/or provide every thinkable option.
Many of these achievements (except in software) were made with simple solutions.

However, in the real world it is impossible to get a system with top-performance modules from one vendor, controlled by one suitable software package. Usually you will get one or two top-performance modules and the rest is just standard performance or less. I do think integration will be a major issue in the coming years.

On the MS vs. PDA issue: our single-quad instrument is quite seldom used for routine quantification analysis because:
- sensitivity is far too low for our applications compared to single-wavelength UV or fluorescence detection, and it is even lower in full-scan mode,
- setting up and adjusting is a pain,
- even if I wanted to, I couldn't develop a method on it, because I couldn't transfer it to QC.
I don't think MS is the ultimate detection choice. It surely will not replace UV, fluorescence, or other special detectors.

From what I have read above and what I've read before, achievable resolution (or speed) on a column depends on the pressure. Smaller particles give better resolution and higher back pressure; longer columns give higher resolution and higher back pressure...
However, when plotting efficiencies of different columns vs. back pressure (same test application, same system, similar column lengths) I get a strange picture: the majority of columns fall in a group that goes from low pressure / low efficiency (5 µm particles) to higher pressure / higher efficiency (3 µm, in different qualities). But there are some columns that perform like 3 µm and have back pressures like 5 µm. Is it a narrower particle size distribution, polished column surfaces, perfectly round particles? The next thing I noticed is the efficiency on an unretained peak (actually peak width, and thus dispersion in the column, but it is easier to express it as N with our software): for most of the columns it is around 1000-1500, for some around 3000-3500. There are obviously differences in column design. So even with "conventional" particle sizes (3, 4, and 5 µm) there are improvements possible that lead to higher resolution or shorter run times. Maybe someone could comment on that.
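For anyone who wants to reproduce this kind of efficiency-vs-pressure plot, the textbook relations behind it can be sketched in a few lines. The numbers below are illustrative assumptions, not Alex's actual data: N is taken from the half-height peak width, and the particle-size scalings are the usual well-packed-column approximations (N proportional to L/dp, back pressure proportional to L/dp² via Darcy's law).

```python
# Sketch of the textbook relations behind an efficiency-vs-pressure plot.
# All numbers are illustrative assumptions, not measured data.

def plates_from_halfwidth(t_r, w_half):
    """Plate count from half-height width: N = 5.54 * (tR / w_half)^2."""
    return 5.54 * (t_r / w_half) ** 2

def relative_backpressure(length_mm, dp_um):
    """Darcy's-law scaling: dP proportional to L / dp^2 (arbitrary units)."""
    return length_mm / dp_um ** 2

def relative_plates(length_mm, dp_um):
    """Well-packed column: N roughly proportional to L / dp (arbitrary units)."""
    return length_mm / dp_um

# Example: a peak at tR = 10 min with 0.5 min half-height width.
print(plates_from_halfwidth(10.0, 0.5))   # N = 2216 plates

# A 150 mm column packed with 3 um vs 5 um particles: same length,
# about 1.7x the plates but about 2.8x the back pressure.
for dp in (5.0, 3.0):
    print(dp, relative_plates(150, dp), relative_backpressure(150, dp))
```

Columns that sit off the main trend (3 µm efficiency at 5 µm pressure) are exactly the ones these simple scalings cannot explain, which is Alex's point.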

Working in a development lab, UPLC is actually not an option for me, because there is just one column chemistry available (two or three other companies sell sub-2 µm particles) and method transfer to a QC lab with conventional HPLC equipment would be problematic.
(Silica-gel) monolithic columns are interesting, but there has been little development since 1999, the number of chemistries is limited, there seems to be an issue with peak symmetry, and the available dimensions are suited only for fast chromatography. I've heard of people who coupled 10 columns (100 mm each) without exceeding the system pressure limit. The efficiency must be great, but don't think about the price!

alex

Posted: Wed Nov 10, 2004 7:08 pm
by Kostas Petritis
Alex,

I will comment on your MS vs PDA comments.

For routine quantification methods, a triple quadrupole is much more suitable due to its higher sensitivity and specificity (depending on the molecule, you may gain several orders of magnitude in sensitivity), so in general it is much more sensitive than UV and sometimes more sensitive than fluorescence detection (again, it depends on the molecule, but most of the time this is the case). For full-scan sensitivity, an ion trap or ToF is the more suitable choice.

About your comment that "setting up and adjusting is a pain": this is directly related to the fact that you "quite seldom use your MS". There are labs that use their MS 24 hours a day, 7 days a week... There has been a tendency lately to make MS instruments into "black boxes" (autotune capabilities, very limited access to ion-optics values, which are auto-optimized, ion sources that let you use conventional 1 mL/min flow rates without sensitivity loss, etc.).

There are plenty of methods that use MS and are validated. I guess that most of the published methods nowadays are using MS.

History will judge the present thread... :wink:

Posted: Wed Nov 10, 2004 8:34 pm
by tom jupille
Kostas, I'll jump back in here (with both feet :wink: )
I guess that most of the published methods nowadays are using MS.
I just did a quick "brain pick" with a former editor of a major chromatography journal; I'll paraphrase his response as "probably true of Analytical Chemistry, certainly not true of J. Chrom. A, maybe half and half for J. Chrom. B". In any case, published methods are likely to overrepresent new technology and underrepresent old, established technology.

I tend to think in analogies:

MS is a Porsche 911 GT2. It's fast, it has superb handling, it's cutting-edge technology... but it's expensive, and a good deal of expertise is required to exploit its full potential.

UV is a Toyota Camry SE. It's not particularly fast, and it won't win any races, but it's rugged, reliable, and reasonably priced. And there are a lot more Camrys than 911s on the road.

You're involved with cutting-edge technology, but I'd venture a guess that the bulk of HPLC runs involve things like dissolution testing, content uniformity, formulation potency assays, etc.: things where the sample size is large enough to be effectively unlimited, where concentrations are high enough that the real question is how much to dilute, and where chromatograms have anywhere from one to six peaks. For those applications, MS detection, at current price and complexity levels, is overkill.

Ultimately, what we produce is not a chromatogram, but information; chromatography is merely a means to an end. Over time, there will be a much bigger gulf between chromatographers and users of chromatography (to go back to analogies: think of personal computers; there are a lot more PC users than programmers). There will be a cadre of chromatographers working on the leading-edge stuff (as you are now), but most users will have a small benchtop beige box with a one-button interface (the button will read "Analyze").

Take a look at some of the stuff Dionex is doing with ion chromatography hardware as an example of that direction.

Your turn! :)

Posted: Wed Nov 10, 2004 11:17 pm
by Kostas Petritis
Tom,

You are about right on the publications, validated MS methods vs. other detectors.

So, here is a quick look at the 2003-2004 publications, with keywords such as
"validation", or "validation" and "mass spectrometry", for the three journals you mentioned:

J. Chromatogr. A MS: 35, other detectors 62
J. Chromatogr. B MS: 57, other detectors 60
Anal. Chemistry: MS: 18, other detectors 15

For all journals for the last two years:
Keywords here: "validation" and "chromatography" for all detectors, and "validation" and "chromatography" and "mass spectrometry" for MS (I subtract the MS count from the total to arrive at the "other detectors" number).

MS: 336, Other detectors: 452

The numbers are very close; my point is that mass spectrometry is widely used for quantitative method validation and not just for structural determination.

I also gave several examples of MS instrument price decreases (e.g. ToFs for less than $100K) and of mass spectrometers that are much easier to use ("black boxes"), where you just infuse your calibrant and push the button "calibrate", infuse your compound of interest and push "autotune" and it gives you the best MS-MS conditions to use, and ion sources where you just plug in your LC and push "Analyze". The need of biologists and biochemists for such devices has pushed MS companies to greatly simplify their instruments.

One of the best examples is the LTQ-FT MS from Thermo. People who have worked with ion traps, and especially with FTICRs (!), know what I am talking about.

If you want a car analogy: the MS of the future won't be a Porsche 911 GT2 but more likely an Audi TT (still kind of expensive, but not as expensive as two Camrys... I think...). It will be equipped not with a manual but with an automatic transmission, so it will be easier to drive (it won't require high skills to exploit its full potential), etc. etc...

I agree that there will be a much bigger gulf between chromatographers and users of chromatography, just as there will be between mass spectrometrists and users of mass spectrometry (among them, a lot of chromatographers).

Again, this thread is about the future, so I guess time is working for me ;-).

Posted: Thu Nov 11, 2004 12:35 am
by Uwe Neue
Alex,

Measuring column efficiency with an unretained peak is not good practice in general, since under these circumstances the measurement is dominated by extra-column effects. You can get very misleading results.

The UPLC system discussed above has one advantage over classical HPLC systems: it has much lower extra-column bandspreading than classical systems. This is forced by the higher performance of UPLC columns.
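Uwe's point can be illustrated numerically. The variances below are made-up assumptions for illustration only: observed peak variance is the sum of the column and extra-column contributions, so for an early-eluting (unretained) peak the fixed instrument contribution dominates and the apparent plate count collapses, even on a good column.

```python
# Illustration (with assumed numbers) of why an unretained peak mostly
# measures the instrument, not the column: variances add, and the column
# variance of an early peak is tiny compared to the extra-column variance.

def apparent_plates(t_r, sigma2_col, sigma2_extra):
    """Apparent N = tR^2 / (column variance + extra-column variance)."""
    return t_r ** 2 / (sigma2_col + sigma2_extra)

N_true = 10000.0        # assumed intrinsic plate count of the column
sigma2_extra = 0.0004   # assumed extra-column variance, min^2

for t_r in (0.5, 5.0):                  # unretained vs. well-retained peak
    sigma2_col = t_r ** 2 / N_true      # column variance for this tR
    print(t_r, round(apparent_plates(t_r, sigma2_col, sigma2_extra)))
```

With these assumed numbers the unretained peak reports only a few hundred apparent plates while the retained peak reports close to the true 10000, which is why Alex's 1000-3500 "unretained N" values say more about system design than about the packing.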

Posted: Thu Nov 11, 2004 9:53 am
by Victor
Tom makes a good point about "chromatographers" and "users of chromatography"; the former is someone who really understands what is going on, above and beyond knowing how to press the button that says "Analyze".

My question is: what does everyone think the market is for new PhD graduates in the former category? Do you think the employability of such people is declining, whereas the market for button-pushers is holding fast, or maybe increasing? If all the development work is being done by, say, Dionex or an equivalent company in another field, will they become the only employers of (a few) chromatographers? What is the job market like in Europe now for these people? And in the USA?

My impression is that much of the research in mass spectrometry has been this way for many years, but perhaps for a different reason. The technology was too complex and expensive for university people to mess around with, and chemists did not have engineering departments (or willing engineering departments!) available to cooperate on such projects. There are of course notable exceptions!

Posted: Thu Nov 11, 2004 4:24 pm
by Ron
Victor,

I think the job market for people who know how to do more than simply push the buttons will always be good, as long as they do not limit themselves to being specialists in a very narrow area. What most employers are looking for is people with good problem-solving skills and the technical background to understand the problems the employer is trying to solve. If a new graduate is promoting theoretical understanding of chromatography, it might be hard to get a job offer, whereas if the focus is on understanding separations in order to develop methods for difficult analyses, job offers will probably be plentiful.

In today's world very few people will be employed to do basic research; almost everyone will be doing goal-oriented research, unfortunately including many university programs that are highly dependent on outside funding for survival. The chromatographers who can use chromatography to solve immediate problems will thrive, while those caught up in theory more than practical applications will have a harder time of it.

My view of the future is a few people, with a good understanding of what they are doing, developing the methods used by many button-pushers who don't really understand everything that is going on. I know this sounds a little pessimistic, but isn't that what is happening today, for the most part?