Do you still use Excel for calculations?

Discussions about chromatography data systems, LIMS, controllers, computer issues and related topics.

16 posts Page 1 of 2
Even with Empower 3 offering a huge number of built-in calculations and custom fields, many companies, even now in 2018, transfer values from Empower into Excel to do calculations that could very easily be done in Empower. Does your company still do that? Mine still uses the Amount function for assay tests and transfers the results into Excel for the conversion, dilution, and label-claim calculations, when a single custom field would do the whole thing.
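For concreteness, the kind of calculation I mean is tiny. Here is a sketch in Python (names, units, and numbers are all made up for illustration; this is not Empower syntax or API):

```python
# Illustrative sketch of the assay arithmetic described above.
# Function name, units and numbers are invented for the example.

def percent_label_claim(amount_mg_per_ml: float,
                        dilution_factor: float,
                        flask_volume_ml: float,
                        label_claim_mg: float) -> float:
    """% label claim for one dosage unit prepared in `flask_volume_ml`
    and diluted by `dilution_factor` before injection."""
    mg_found = amount_mg_per_ml * dilution_factor * flask_volume_ml
    return mg_found / label_claim_mg * 100.0

# e.g. Amount = 0.0985 mg/mL, 10x dilution, 100 mL flask, 100 mg claim:
print(percent_label_claim(0.0985, 10.0, 100.0, 100.0))  # roughly 98.5
```

A one-line formula like this is exactly the sort of thing a single validated custom field could do in place of a transcribed spreadsheet.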

It's strange that, with so much information out there on Empower, companies are still going backwards to spreadsheets. Not all of them, of course (I'm sure many companies are 100% electronic), but there is still a way to go.
Agilent ChemStation & OpenLab 1 and MS Excel. Empower is too expensive.
Best regards,
Dmitriy A. Perlow
Yep - still in the late 20th century. Integrators gone, Excel in use, as are pocket calculators(!). Change is coming for us...
Thanks,
DR
Since the last century, our UniChrom has offered the following extended calculation model.

1. Install the special add-in unichrom.xla in Excel (shipped with UniChrom in the ./Reports directory).

2. On a clean Excel sheet you can write:
=GetPropValue(-1;"Name") - returns the name of the UniChrom workbook. Attributes exist per-chromatogram and per-workbook.

=PeakIndex(1;"C22:0") - returns the index of the peak named "C22:0" in layer #1.

=PeakParam(1;23;4096) - returns the area of the peak with index 23 in layer (chromatogram in batch) #1; 4096 is the constant for peak area.
=PeakParam(1;23;1) - returns the name of the 23rd peak in layer #1.

There are about 30 peak attributes which can be obtained this way.

=InsertPicture(-1;180;90) - inserts a 180 x 90 mm chromatogram graphic from layer #1.

So there is no data export - you just get live data from the open documents. To reproduce a report, open the UniChrom raw-data workbook and the Excel workbook.

More recent developments, unichromw.xla (for Excel) and unichromw.oxt (for LibreOffice Calc), use JSON-RPC calls to UniChrom's built-in web server, with mostly the same functionality.

Also, I forgot to mention: UniChrom has its own single-page spreadsheet named "Calculator" (stored with the data workbook) which can do essentially the same thing without any external programs.
So much depends on your environment. In an academic rather than industry environment, we've got a customer base who expect to be able to move their data into R, Matlab, or proprietary 'omics software, not to mention getting quantitative data into whatever their favourite graphing package happens to be - or possibly trying to make the whole thing visible on a web interface. Excel is just one of many calculating tools they'll use along the way. There's no way that a CDS will ever offer all the functionality they require. Waters and others have tried it, and create very nice pipelines, but you always get someone who remembers that some part of someone else's pipeline does their particular job better. Also, we can't afford to tie ourselves to a single manufacturer, so we don't really want to commit to a software solution that only works for one family of instruments.
But even when our users just want to do the stuff that Empower and other packages can do, you have to look at the likely sources of error. An experienced Excel user with only a dabbling interest in chromatography is less likely to mess up a calculation in Excel than in Empower custom calculations, and his colleagues are more likely to spot the problem in a workbook that contains the calculations. I encourage those who use my lab to make use of the various systems' abilities to create calibration curves and carry out quantification, but I don't think it would be time-efficient for them to do much more in the CDS.

Also, if their QC checks and sample dilutions are embedded in software that we abandon in a few years when we change instruments, no one can decode later what they did (yes, we can keep the old software, but then Microsoft helpfully discontinue support for WinXP or whatever it is, and we find we can't actually run the software we kept to open the data we kept, so in effect our archive is gone). Excel may be universally distrusted (even the R people loathe it), but it's been around for a while, and it doesn't look like it's vanishing soon. The R people would far rather we got into comma-separated values as soon as possible, and stayed there!
For what it's worth, I've still got a slide-rule under the desk...
lmh, a thoughtful post and I found myself nodding along in agreement with each of your points.
lmh wrote:
For what it's worth, I've still got a slide-rule under the desk...


I'm retired, but have two yellow plastic slide rules at home... not sure that I still know how to use them either... not sure I even remember what a logarithm is!

Bet these get listed but don't sell that often on eBay...
Thanks for the replies everyone, very interesting to read different viewpoints.

I don't agree with not using CDS software to its absolute maximum, as I believe that unless we embrace change and increase efficiency with validated calculations, we will never move with the times. I have had to argue my case with former managers for using custom fields to calculate assay tests instead of transcribing tonnes of data from Empower into a spreadsheet with lots of formulas and a high chance of error, when Empower only has to be tested once to see that it works, and it's done! Some managers were on board; others were too set in their ways.

Personally I think you should use 100% of the features Empower offers.
The problem with maxing out a given CDS's features and calculations is that you quickly lose the ability to explain it all to the auditors and regulators whose blessings are required, once you start going overboard with custom fields and boolean logic.
Thanks,
DR
There's also a vast difference between CDS features such as custom fields/calculations, and CDS features such as straight-forward fitting of regression lines to calibration data.

If you use your CDS to carry out calibration, it is validated out of the box. It's also more powerful and flexible than Excel (because it can handle weightings, and it's much easier to do quadratic fits if necessary, etc.).
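To illustrate the point about weightings and quadratic fits, the same job is also straightforward outside the CDS. A sketch with numpy, using invented data generated from a slightly curved response (note that np.polyfit's `w` argument weights residuals before squaring, so w = 1/sqrt(x) gives the usual 1/x weighting of squared residuals):

```python
import numpy as np

# Invented calibration data: concentration vs peak area, generated
# from a slightly curved response (area = 100*x - 0.25*x**2).
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([49.9375, 99.75, 199.0, 493.75, 975.0, 1900.0])

# Unweighted straight line (what an Excel trendline gives by default).
lin = np.polyfit(conc, area, 1)

# 1/x-weighted quadratic: np.polyfit minimises sum((w * residual)**2),
# so w = 1/sqrt(conc) weights the squared residuals by 1/conc.
quad = np.polyfit(conc, area, 2, w=1.0 / np.sqrt(conc))

def back_calculate(measured_area, coeffs):
    """Concentration for a measured area via the roots of the quadratic
    fit, keeping the root closest to the naive linear estimate."""
    c2, c1, c0 = coeffs
    roots = np.roots([c2, c1, c0 - measured_area])
    candidates = roots[np.isreal(roots)].real
    return float(min(candidates, key=lambda r: abs(r - measured_area / c1)))

print(back_calculate(493.75, quad))  # close to 5.0
```

The point is not that this replaces the CDS; it is that calibration with weights and curvature is a solved problem in any decent tool, whereas Excel's built-in trendline makes it awkward.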

If you use your CDS to do custom calculations, it's not validated - or rather, it's no more validated than Excel. It's validated in the sense that if you ask it to calculate a+b, someone will have proven that the result really is a+b and numerically correct; but it's not validated in the sense that you've still got to prove that a+b was the correct calculation, and that the syntax you've used is correct under all circumstances - and that is exactly the same situation as Excel.
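A concrete (invented) illustration of that last point: the engine faithfully computes whichever formula you wrote, and a thin test data set can pass both formulas even when they are genuinely different calculations.

```python
# Two ways an analyst might define "average recovery" for replicate
# preparations. The software computes each one correctly; only one
# of them is what the SOP actually meant.

def mean_of_ratios(found, expected):
    ratios = [f / e for f, e in zip(found, expected)]
    return sum(ratios) / len(ratios) * 100.0

def ratio_of_means(found, expected):
    return sum(found) / sum(expected) * 100.0

# Validation test data where every replicate has the same expected
# value: both formulas return 100.0 and the "validation" passes.
found, expected = [9.9, 10.1], [10.0, 10.0]
print(mean_of_ratios(found, expected))   # 100.0
print(ratio_of_means(found, expected))   # 100.0

# Real data with unequal expected values: the two disagree, and no
# amount of engine validation tells you which one was intended.
found, expected = [5.5, 19.0], [5.0, 20.0]
print(mean_of_ratios(found, expected))   # 102.5
print(ratio_of_means(found, expected))   # 98.0
```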

For this reason, I'm strongly inclined (in a general academic environment) to make the transition from CDS to other software happen just after you've got calibrated values from the calibration curve, and just before you start calculating means and standard errors, or carrying out more complex calculations to allow for strange sample-specific treatments during sample prep (though I'll admit simple dilutions may be easier to handle in the CDS). A lot depends on the people involved and the environment. In a very analytical environment with skilled analytical staff, it might make sense to do more in the CDS.
DR wrote:
The problem with maxing out a given CDS features and calculations is that you quickly lose the ability to explain it to auditors and regulators whose blessings are required, once you start going overboard with custom fields and boolean stuff.
Draper and Smith have an algorithm for doing regression by steps which works great for spreadsheets and maintains sufficient precision to calculate geological age dates. The built-in Excel functions do not have sufficient precision. I don't have this CDS software - what guarantees that the accuracy and precision of what is put in the custom fields is any better than a spreadsheet's?
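As a stand-in illustration of the precision point (this is not Draper and Smith's stepwise regression scheme, but it shows the same failure mode): the textbook one-pass variance formula cancels catastrophically when values are large relative to their spread, which is exactly the geological-dates situation, while Welford's one-pass update keeps full precision. Either can be laid out step by step in a spreadsheet.

```python
# Naive one-pass variance: var = E[x^2] - E[x]^2. Squaring large
# values burns all the significant digits before the subtraction.
def naive_variance(xs):
    n = len(xs)
    sum_x = sum(xs)
    sum_sq = sum(x * x for x in xs)
    return sum_sq / n - (sum_x / n) ** 2

# Welford's one-pass update: numerically stable, same single pass.
def welford_variance(xs):
    mean, m2 = 0.0, 0.0
    for n, x in enumerate(xs, start=1):
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    return m2 / len(xs)

data = [1e9, 1e9 + 1, 1e9 + 2]      # true population variance = 2/3
print(naive_variance(data))          # 0.0 -- all precision lost
print(welford_variance(data))        # 0.666... -- correct
```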
Empower comes validated out of the box, whereas with Excel, to make a validated spreadsheet you have to put time into design and testing, and then ensure that the users are trained to operate it correctly. With custom fields, once your company has a robust SOP covering how to set up and validate calculated custom fields, it will work every time. Yes, of course it depends on whether the analyst puts the right label or figure into CConst1 etc., but the overall benefit far outweighs the time wasted transcribing figures from Empower into Excel and having them checked and reviewed by up to three more people.
With a custom field, the calculation is checked, signed off, and then released for routine use. It can then be dropped into a report and signed off - done, no more paper and no more time wasted checking. It does need a strong second check before a run that depends on custom fields is activated, but again, the bigger picture is what matters here. Not every lab manager wants to leave the past behind, though!
Hello,

I heard through the grapevine that Excel spreadsheets, even validated and locked, are no longer permitted by the FDA. I don't know if anyone else has heard of this. I'm looking on the web and the FDA webpage for information but can't find anything - could this be a bad rumor?

Thanks,
boby
This has been rumored for years. I have no first-hand experience either way with FDA reviewers and spreadsheets.

As with part 11 compliance, things are way behind schedule as far as I can tell.
It all depends on how they're used and whether they're audited.

One lingering concern is that labs do not have access to source code in the case of Excel. MS will not show it to the agency for you if they are asked.

Practically speaking, if you use spreadsheets, you should be willing to have calculations checked or come up with bullet-proof validation and security protocols that are repeated with every update to the spreadsheet software.
Thanks,
DR
>One lingering concern is that labs do not have access to source code in the case of Excel. MS will not show it to the agency for you if they are asked.

What impresses me most in this "validation game" is the attempt to lock down one software component completely while closing one's eyes to all the others.

1. CDSs run on general-purpose office computers.
2. Even if you have validated a particular version of the CDS, obtained its source code and audited it - which versions of the underlying components were used during validation?

* CPU - the processor model and microcode version. Has the MELTDOWN or SPECTRE v1/v2/vX vulnerability been patched? Does swapping the CPU chip (burned out after a cooler failure) require revalidation?
* OS version and service pack. What if a recent security update changed the CPU driver or FPU exception handling, which can change the floating-point rounding mode?
* The CDS is built on a framework (.NET, Qt, etc.). Which framework version was used for validation? Do framework updates and security fixes require CDS revalidation?
* The CDS uses hardware drivers and a network stack. Does a system validated with network card X have to be revalidated with network card Y (they definitely use different drivers - card X burned out and was replaced with card Y)?
* The CDS is a process in OS memory. Does antivirus software - known to intercept network traffic and even replace certificates on the fly - have to be validated together with your CDS, given its daily AV database and scanning-engine updates?

To me, as a technical person, treating a CDS like an electrical power meter or a petrol pump counter looks like a cargo cult: we observe the ceremonies without having any clue what they are for.

In terms of information theory, a "black box" model better fits the validation of the instrument + CDS combination:

Known input -> known output; check the "->" functional relation.
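That black-box relation is also directly executable. A minimal sketch (names and tolerance are invented; in reality `measure` would stand for an injection of a certified standard through the actual instrument + CDS, not a lambda):

```python
# Black-box check: feed the whole pipeline a known input and verify
# the known output, regardless of what CPU microcode, OS patches or
# framework versions are inside the box.

def black_box_check(measure, certified_value, tolerance_pct=2.0):
    """Run the instrument + CDS pipeline on a known standard and
    check the reported result against the certificate."""
    reported = measure()
    error_pct = abs(reported - certified_value) / certified_value * 100.0
    return error_pct <= tolerance_pct

# Simulated "injections" of a 100.0 ug/mL certified standard:
print(black_box_check(lambda: 99.2, certified_value=100.0))   # True
print(black_box_check(lambda: 95.0, certified_value=100.0))   # False
```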
