> Excel: Why using Microsoft's tool caused Covid-19 results to be lost
> https://www.bbc.co.uk/news/technology-54423988
> The badly thought-out use of Microsoft's Excel software was the
> reason nearly 16,000 coronavirus cases went unreported in England.

Spreadsheet errors are common. Microsoft has been pushing its software for sales purposes, yet it gives no warranty for that software; perhaps it would for a well-paying customer. Organizations nevertheless tend to assume that software works as it should.

> And it appears that Public Health England (PHE) was to blame, rather
> than a third-party contractor.

I did not read the article, but I know many other articles on spreadsheet damages; the damages run into millions or billions. Just type "spreadsheet damages" into a search engine. I do not know whether Microsoft was a third party here; normally its software comes without warranty. I also think that ultimate responsibility does not rest with the software maker or provider, unless a very specific contract regulated their obligations. Responsibility for an organization rests with its managers, and managers decide which software to purchase. Could managers have used search engines to find reviews of the software? Yes, they could. Could they have hired a professional to support the internal operators in handling spreadsheets? Yes, they could. Could they have invested in the computer education of their staff? Yes, they could.

A spreadsheet as such can be a great tool for processing information. I remember using spreadsheets since the Apple //c, and many organizations depend on them. But spreadsheets have always had errors and are prone to human error; it is very easy to make a typing mistake. That is why a spreadsheet should be used by people educated in how to use it properly, not merely at a beginner or intermediate level. Thus I would blame the lack of education and the lack of responsibility in management.

> They filed their results in the form of text-based lists - known as
> CSV files - without issue.

CSV file imports are prone to many mistakes. I have received government databases from US states, and those databases had errors inside.
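To illustrate (this is my own minimal sketch, not anything from the article; the sample data and record counts are invented): a field that legitimately contains a comma breaks a naive import, and a simple record-count check after import is the least verification one should do before trusting the data.

```python
import csv
import io

# A CSV exported by one party: one field contains a comma,
# so it is quoted according to the CSV conventions (RFC 4180).
exported = 'case_id,name,lab\n1,"Doe, Jane",Leeds\n2,"Roe, Richard",York\n'

# Naive import: splitting on "," ignores the quoting and
# silently shifts every later column.
naive_rows = [line.split(",") for line in exported.strip().splitlines()[1:]]

# Proper import: the csv module honours the quoting rules.
proper_rows = list(csv.reader(io.StringIO(exported)))[1:]

print(len(naive_rows[0]))   # 4 columns -- data silently corrupted
print(len(proper_rows[0]))  # 3 columns -- parsed correctly

# The minimal verification step: the number of records after import
# must equal the number of records the sender reported.
records_sent = 2
assert len(proper_rows) == records_sent, "records lost during import"
print("record count verified")
```

The same count check applies at every hand-off: rows in the source file, rows after import, rows after upload. Had such a comparison been made, a silent truncation would have been caught immediately.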
One has to describe the procedures for export, import and quoting very precisely for the other party to succeed in importing. CSV files are simply not reliable for large-scale data export and import. The error may not have been on the labs' side, yet relying on CSV at all was a mistake.

> PHE had set up an automatic process to pull this data together into
> Excel templates so that it could then be uploaded to a central
> system and made available to the NHS Test and Trace team, as well as
> other government computer dashboards.

For me it is unthinkable that people's data is kept in spreadsheets. Such data has to be in highly reliable databases with well-structured information and verification of data input.

> The problem is that PHE's own developers picked an old file format
> to do this - known as XLS.

> As a consequence, each template could handle only about 65,000 rows
> of data rather than the one million-plus rows that Excel is actually
> capable of.

So bad! But the blame does not fall on Microsoft or on proprietary software as such; it falls on developers who lacked the skills to handle their data professionally. At every database upgrade it would be quite normal to verify at least the number of records in each table, if not to compute a hash for each record and verify those hashes. Recently I upgraded PostgreSQL, and of the three upgrade methods, the first one looked just fine, but some tables were missing. I was surprised. I then used the second method and succeeded. Why did the developers not verify at least the number of records before putting such spreadsheets into actual use? That is unthinkable to me.

> Recently I needed to send some photos to city hall. I sent Email with
> a tar file containing 12 jpeg images as an attachment but it bounced.
> So I asked the person in charge why this happened. She said that they
> had a limit in their mail server. I asked what was the maximum
> message size allowed and she answered: "Five giga (bytes)." I said
> "Now five gigabytes is very large. Are you sure?"
> To this she
> replied: "I heard that it was five something and I can only recall the
> number five." This came from a city official known to be highly
> competent and well respected by co-workers. Assuming that the actual
> limit was five megabytes, I re-sized the images so that the entire
> Email would be smaller than that, and sent it out. This time the mail
> did not bounce.

There are many examples like that. When I wish to share files, I upload them to a server, set a password if necessary, and give the links to the remote users. That way no large files are transmitted over email at all. When I do use email, I always optimize JPEG files before sending and almost never send the original images.

Back to spreadsheets: I do not think that LibreOffice is reliable either. I have known that software since its inception as StarOffice, a proprietary program, and I actually used it back before 1999 and my encounter with GNU. In fact I created many files in StarWriter, whose formats were not free, and it was difficult to switch to free formats. It was also difficult to find spreadsheet programs for GNU/Linux distributions back then. There were errors then and there are errors in LibreOffice today. That is the nature of software and of human error.

https://ask.libreoffice.org/en/question/64895/how-do-i-fix-a-libreoffice-document-that-is-corrupted/

The Internet is full of references showing that spreadsheets are not reliable for larger-scale data processing:

https://incisive.com/spreadsheet-error-horror-stories/
(readable with LibreJS)

https://www.lifehack.org/articles/technology/10-common-spreadsheet-mistakes-youre-probably-making.html
(readable with LibreJS)

https://riskonnect.com/risk-management-information-systems/why-managing-risk-with-spreadsheets-is-risky/

--
Thanks,
Jean Louis
⎔ λ 🄯 𝍄 𝌡 𝌚