On 2 Sep 2012, at 8:15am, Keith Medcalf <kmedc...@dessus.com> wrote:

> On Saturday, 01 September, 2012, at 20:28 Ted Rolle, Jr. wrote:
> 
>> Remember Y2K?  That was caused by a three-letter blue company.  They
>> wanted to save 1 (one!) byte by not storing the century in critical
>> operating system fields.  The comments were (1960s) "Well, we won't be
>> around to fix it...wink, wink, nudge, nudge."  I was.  Most companies
>> got through it with few problems --- a tribute to the programming staff.

Only slightly true.  IBM was supplying the hardware, but IBM Global Services wasn't 
so big at the time; a lot of the remediation work went to Arthur Andersen and EDS, 
and they were both terrible.  It's also worth remembering that after all the fuss 
absolutely nothing terrible happened when Y2K rolled around.  Everyone had been 
appropriately alert and diligent, and almost all problems had been taken care of 
in time.  A couple of things slipped through, none of which (as far as I know) 
caused a single death.

> Insistence on using two-digit years survived up to the early 90's, and the 
> Operating System wasn't really the problem.  The real problem was 
> applications and data being manipulated and stored "century-free" by 
> people who ought to have known better.  This still persists today with people 
> who insist on recording and displaying dates in formats such as 06/07/02.  
> Does this mean 06 July xx02; June 7, xx02; or 2 July xx06 (with 
> guess-the-century)?

Take it from someone who was there at the time.  By 1990 companies were 
cynically leaving out century numbers in the hope that they'd be able to charge 
somebody some money somehow when it became necessary to make their code Y2K 
safe.  I raised a stink when a terribly well-known programming company left out 
centuries in a job they were starting in 1989 for an international bank.  They 
backed down only when I reminded them that my side had final wording on the 
contract and would write in whatever penalty clauses I thought appropriate.

I did see a company put century numbers in.  For a customer with whom they had 
a twenty-year contract.

At the time we had 40 Meg disks on mundane home PCs, and 250 Meg disks on 
servers for small businesses.  You could put together multi-gigabyte disk 
arrays for two grand a gig.  There really wasn't any reason to save two bytes 
per date.
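To put some rough numbers on that (the record count here is purely illustrative, 
not from anything above), even ten million dated records would only need about 
20 MB more to hold full four-digit years:

    # Rough cost of storing the century, assuming (hypothetically)
    # ten million dated records at the two-grand-a-gig price above.
    records = 10 * 1000 * 1000      # illustrative record count
    extra_bytes = records * 2       # two extra digits per date
    extra_gb = extra_bytes / (1024.0 ** 3)
    print("extra storage: %.3f GB, cost about $%.0f" % (extra_gb, extra_gb * 2000))
    # -> extra storage: 0.019 GB, cost about $37

A rounding error next to the cost of the disk array itself.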

> The next most disgusting behaviour is failure (or rather refusal) to display 
> timezones. Extremely annoying are those folks who store datetime in localized 
> time rather than storing in UT1 and converting on input/output.

Annoying now.  Back then, converting to local time took a lot of CPU time.  
Doing the conversions necessary for showing a screenful (probably 20 lines) of 
transactions with one date each involved a significant slowdown.  In fact, many 
formats for data storage were governed not by what format was fastest for 
maths, but by the format wanted for displaying data on the screen.  All 
significant data processing was done in an overnight batch run anyway, so 
processing speed for maths wasn't as important.  I think almost all the systems 
I was aware of stored dates as YYYYMMDDHHMMSS in the local time of the client's 
HQ.
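
These days storing in UTC and converting only on display costs next to nothing.  
A minimal sketch with SQLite (the table and column names are invented for the 
example):

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE tx (id INTEGER PRIMARY KEY, happened_at TEXT)")

    # SQLite's datetime('now') is already UTC, so store it as-is.
    db.execute("INSERT INTO tx (happened_at) VALUES (datetime('now'))")

    # Convert to the viewer's local time only when displaying.
    for utc, local in db.execute(
            "SELECT happened_at, datetime(happened_at, 'localtime') FROM tx"):
        print("stored (UTC):", utc, "  shown (local):", local)

Keeping the stored value in UTC also means sorting and range queries stay correct 
no matter where the viewer happens to be.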

Simon.
