So many published protocols have unpublished problems?


24 posts Page 1 of 2
An ongoing trend I've noticed: I find a paper with a seemingly simple protocol and beautiful results, try it in my lab, and spend weeks banging my head against the bench. It happens to me over and over again.

Some examples:

I found a paper doing 3-MCPD analysis by derivatizing with PBA: make a 10 mL sample solution in saline and add 0.2 mL of derivatizing solution. I do that and don't get the published sensitivity. I run experiments all week and find I need to add a full 1 mL to reach maximum sensitivity. I then find three older papers using the same or a similar proportion of reagent to sample volume (1-2 mL per 10 mL).

I find a couple of great papers showing that ethyl chloroformate derivatization, which works well for most amino acids, can be extended to the citric acid cycle acids with lovely chromatograms. So this week I try that, and I find that fumaric acid forms diethyl fumarate plus its cis isomer diethyl maleate, and that citric acid forms three peaks due to isomers and the hindered tertiary alcohol. Then I come across a thesis where the author says, in effect, "yes, I saw the same thing", yet none of the papers mention it.

Has anyone else seen this?
It appears that you don't have hands-on experience with the USP assays. It's enough to make you tear out your hair (if you have any).

Almost everything written in the USP is open to interpretation. Take "standardize frequently" for a titrant: is there any guidance on what interval that means? Or whether to do such standardization singly, in duplicate, or in triplicate? Or how close to each other such values need to be? Or you can use a non-USP standardization procedure of "equivalent accuracy", but no one knows what equivalent accuracy is.


Yes, and since this is a chromatography forum: we periodically have issues with published procedures. Most often we actually buy the exact same column used in the publication to start with, even if we have a similar type of column here, just to rule out any discrepancies.

I've published myself, and a reviewer once asked whether a filtration was done by gravity or with suction; I hadn't included that detail because I wasn't reading the manuscript as an outsider would.

My company was acquired a decade ago, and we had "typically horrible" raw material test methods shoved down our throats as "perfect". One of my "favorites" uses a 20 mg sample of an amine raw material, titrated with 0.1 N acid on an autotitrator, against a super-tight specification range: a 0.0001 g weighing difference can put the result out of specification. One could easily use a much larger sample size, a stronger titrant, and a colorimetric endpoint (even a USP indicator) and improve the results. Or weigh a sample, take it to volume, and titrate an aliquot. But HQ won't admit that ANYTHING is wrong with any of "their" procedures; some read as if no one ever actually tried them out in the laboratory. Let's just say that by comparison, the USP looks good!
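A quick sketch of why that 20 mg sample size is so fragile (the numbers below are my own illustration, not from the method): the percent error a fixed weighing uncertainty contributes scales inversely with sample size, so the tenfold-larger sample cuts the error tenfold.

```python
# Illustrative only: percent of the assay result attributable to a
# single fixed weighing uncertainty, at two sample sizes.
def relative_error_pct(uncertainty_g: float, sample_g: float) -> float:
    """Percent error contributed by one weighing uncertainty."""
    return 100.0 * uncertainty_g / sample_g

# A 0.0001 g (0.1 mg) difference on a 20 mg sample is 0.5% of the
# result; on a 200 mg sample the same difference is only 0.05%.
print(relative_error_pct(0.0001, 0.020))
print(relative_error_pct(0.0001, 0.200))
```

With a specification range tighter than about one percent, that 0.5% from the balance alone leaves essentially no room for the titration itself.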
I don't think that I have ever had a literature method work exactly as published. In some cases I developed a strong opinion that the authors never got it to work either. I am always profoundly suspicious of results from a chromatographic method when the paper does not include any chromatograms.

Peter
Peter Apps
Yep, I can definitely relate to inheriting third-rail methods. We've used this procedure forever; touch it and you die.

"But the dilution calculations are incorrect..."
"Don't change it."

"The resolution on the GC column is very poor..."
"Don't change it."

"But the recovery of the analyte off the SPE column is 50%..."
"Don't change it."

"A more modern method came out that would save us a ton of time and money..."
"OK, let's look at it."
Only yesterday someone brought me an LC-MS method where the authors have specified the exact instrument they used, but not the column or solvent-system.

Yes, there are two quite separate things that aggravate me horrendously. (1) Methods that are so incomplete you can't repeat them. Really that's terrible reviewing and editing, but a lot of editors seem to care only about the page count and whether the article looks super-attractive, and chemistry papers don't seem to feel that chromatography is worth describing. (2) Methods that don't work.

It's really hard to deal with methods that don't work, because there's no obvious way to mark a published faulty method as faulty. Quite often there's a divergence: a group of researchers pass a paper-copy of their method from Mother to Son, and Father to Daughter, and they all cite the original publication where the first work was done, but no one actually uses the method described in that first paper, and no one ever goes back to look at it - they're all using a grubby photocopy of a Word document that's long since lost.

Once something's been published, it's set in stone. If the initial authors made a huge mess of their solvent preparation, or used a column so ancient that it no longer operated even vaguely as it should have done, then everyone still cites that original paper forevermore, even though it's wrong, wrong, wrong.

Another thing that happens is that people repeat something 6 times because the first 5 times it goes wrong (i.e. doesn't work as they expected), so you end up with a method/set of results that reflect one experiment (which only "worked" because someone made a mistake) rather than 5 experiments that "failed" (because actually the hypothesis was wrong).

All this, of course, is before we even start to discuss corner-cutting under pressure, and outright fraud.

I'm utterly convinced that if there were a big red button that would make all papers containing seriously mistaken, unrepeatable data vanish, and were we to press it, librarians would die in the inrush of air into the vast vacuums created. Shelves would clear themselves. Most libraries would be reduced to a handful of bound volumes (containing my work, of course!)
lmh wrote:
Only yesterday someone brought me an LC-MS method where the authors have specified the exact instrument they used, but not the column or solvent-system.

Like you, I operate in the misty swamplands along the border between chemistry and biology, and this kind of omission is a dead giveaway for biologists having run their samples by loading the autosampler on an instrument that they have no clue about. The other mark of the beast is when a method carefully specifies something that makes no difference at all to the results. I can understand how a biologist writing up the paper might get it wrong (as a chemist writing about biology might), but how does it not get spotted by the chemist co-authors, the editors, or the reviewers?


I'm utterly convinced that if there were a big red button that would make all papers containing seriously mistaken, unrepeatable data vanish, and were we to press it, librarians would die in the inrush of air into the vast vacuums created. Shelves would clear themselves. Most libraries would be reduced to a handful of bound volumes (containing my work, of course!)
:lol: :lol: :lol:

Peter
Peter Apps
Peter Apps wrote:
I don't think that I have ever had a literature method work exactly as published. Peter


I guess I'm fortunate in my business: I'd say more than half of the published applications I've tried seem to work out, and then I modify them to be more straightforward.

Peter Apps wrote:
I am always profoundly suspicious of results from a chromatographic method when the paper does not include any chromatograms. Peter


And the same goes for chromatographic test procedures from raw material suppliers. My company's test procedures ALWAYS include chromatograms from standards and real-life samples, and sometimes a blank. We also give part numbers for columns, reagents, and anything we think may be harder to find.

We figure out how many grams or mL of acid or base need to be added to reach a specified mobile-phase pH (checked in robustness studies), so the iffy task of adjusting to a pH with a pH meter (which includes qualifying that pH meter) can be avoided. Then we do it that way in the validation and the written test procedure. We know that employees at "some" contract manufacturers may not be rocket scientists, so we try to make our preparation steps simple and straightforward and build the accuracy into the procedures themselves, even for steps as "simple" as weighing and mixing.
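The weigh-everything-instead-of-metering idea can be sketched with the Henderson-Hasselbalch relation for a monoprotic buffer: the target pH fixes the acid/base mole ratio, so both components can simply be weighed. The acetate buffer, pKa, and molar masses below are illustrative assumptions on my part, not anyone's actual procedure.

```python
# Hypothetical sketch: masses of acid and conjugate base that give a
# target pH, from pH = pKa + log10([A-]/[HA]).
def buffer_masses(total_mol: float, target_ph: float, pka: float,
                  mw_acid: float, mw_base: float) -> tuple:
    """Return (grams of acid, grams of base) for a weighed-out buffer."""
    ratio = 10.0 ** (target_ph - pka)      # [A-]/[HA] at the target pH
    mol_acid = total_mol / (1.0 + ratio)
    mol_base = total_mol - mol_acid
    return mol_acid * mw_acid, mol_base * mw_base

# Example: 0.05 mol total acetate at pH 4.76 (taken equal to the pKa),
# acetic acid 60.05 g/mol, sodium acetate 82.03 g/mol.
g_acid, g_base = buffer_masses(0.05, 4.76, 4.76, 60.05, 82.03)
print(g_acid, g_base)
```

This ignores ionic-strength effects on the effective pKa, which is exactly why the post says the weighed recipe has to be confirmed in robustness studies rather than derived on paper.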
For sure - the easier you make it for people to do it the way you want it done, the more likely they are to do it that way.

Peter
Peter Apps
Oh, don't get me started on (a) how many skilled post-docs can get a 3 mM solution wrong (probably including me), and (b) how many different solutions you'll get (purely from different interpretations, excluding blatant mistakes) if you ask for 10% X.
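The "10% X" ambiguity is easy to put numbers on. Glycerol and the densities below are my own illustrative assumptions; the point is only that the three common readings (w/v, v/v, w/w) imply different amounts of solute whenever the densities aren't 1.00 g/mL.

```python
# Sketch: grams of solute in 100 mL of "10% X" under three readings.
def grams_per_100_ml(reading: str, solute_density: float,
                     solution_density: float) -> float:
    if reading == "w/v":                     # 10 g per 100 mL of solution
        return 10.0
    if reading == "v/v":                     # 10 mL of solute per 100 mL
        return 10.0 * solute_density
    if reading == "w/w":                     # 10 g per 100 g of solution
        return 0.10 * (100.0 * solution_density)
    raise ValueError(f"unknown reading: {reading}")

# "10% glycerol", assuming solute density ~1.26 and solution density
# ~1.02 g/mL: three readings, three different recipes.
for r in ("w/v", "v/v", "w/w"):
    print(r, grams_per_100_ml(r, 1.26, 1.02))
```

For this example the three readings span roughly 10.0 to 12.6 g per 100 mL, a 25% spread from interpretation alone, before anyone makes an actual mistake.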
Heck, one of my predecessors thought that if you take 3 g of sample and 9 g of water you have a dilution factor of 3.
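Sketched out, the arithmetic that tripped up the predecessor: a gravimetric dilution factor is total mass over sample mass, not diluent mass over sample mass.

```python
# Dilution factor for a mass-based dilution: total / sample.
def dilution_factor(sample_g: float, diluent_g: float) -> float:
    return (sample_g + diluent_g) / sample_g

# 3 g of sample plus 9 g of water is 12 g total:
print(dilution_factor(3.0, 9.0))   # 4.0, not 3.0
```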

Some of the papers I've seen are demonstrably wrong, covering up flaws to get published, like showing a nice clean chromatogram of diethyl fumarate with no diethyl maleate. I think, gosh, I must really be doing something wrong. Then I find a thesis where the author admits, "yes, I got isomerization too and wasn't able to do anything about it."
I'm cleaning up my office. I found this application from Merck SeQuant AB in Sweden: "Rapid identification of diethylene glycol in a mass poisoning epidemic using ZIC-HILIC Chromatography", which uses MS detection. Not only do I believe that GC-MS is a better and easier identifier for DEG after trimethylsilyl derivatization; the funny thing is that the structure drawn for diethylene glycol in the application is simply incorrect.
This reminds me of the Twitter hashtag #overlyhonestmethods

http://storify.com/BeckiePort/overlyhonestmethods

Some of them are pretty good.
MichaelVW wrote:
This reminds me of the Twitter hashtag #overlyhonestmethods

http://storify.com/BeckiePort/overlyhonestmethods

Some of them are pretty good.


As a matter of interest, if this were a survey, how many of them have we all come across in real life (committed ourselves, or seen in friends and coworkers)? It would have made a great survey!
Well, if I'm fortunate enough to have several papers on a given method, my way of developing a method is usually to look for consensus, pick out the bits and pieces of each that I like, and combine them with ideas of my own. However, it would be a revelation if I ever downloaded a method that was well designed and actually worked as published.
MichaelVW wrote:
This reminds me of the Twitter hashtag #overlyhonestmethods

http://storify.com/BeckiePort/overlyhonestmethods

Some of them are pretty good.


:lol: :roll: oh science and its researchers, you never cease to amaze me... I'm going to have to start following #overlyhonestmethods as it is a great source of humor!
BHolmes

Any problem worthy of attack, proves its worth by hitting back...never give up!
