Re: Pythonic Y2K
On Fri, 18 Jan 2019, Gene Heskett wrote: I had one client, a hedge fund, that I fixed literally 1000's of Y2K issues for. When Y2K came and there were no problems, the owner said to me "You made such a big deal about the Y2K thing, and nothing happened." -- I would quite cheerfully have bought a ticket to watch and hear your reply, Larry. My response would have been, "That's because of all the time and effort I devoted to fixing all the issues that would have put you out of business. Perhaps a bonus is due me?" This is a common situation for all consultants, including environmental scientists like me. Clients don't know how to fix the problems they face, nor do they have that capability in-house, so they need external assistance. This lack of knowledge means they don't understand why the project took so long, nor do they recognize the effort put in by the consultant. People tend to take it for granted when things work smoothly but notice when there are glitches. Ask any major conference or annual meeting director about this. :-) A similar situation is faced by those of us who are expected to "prove" a negative. Carpe weekend, all, Rich -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On Friday 18 January 2019 16:55:28 Avi Gross wrote: > Larry, > > I keep hearing similar things about the Flu Vaccine. It only works 40% > of the time or whatever. But most of the people that get the flu get a > different strain they were not vaccinated against! > > There are hundreds of strains out there and by protecting the herd > against just a few, others will flourish. So was it worth it? > > Your argument would be that your work found lots of things related to > Y2000 that could have been a problem and therefore never got a chance > to show. I wonder if anyone did a case study and found an organization > that refused to budge and changed nothing, not even other products > that were changed like the OS? If such organizations had zero > problems, that would be interesting. If they had problems and rapidly > had their software changed or fixed, that would be another and we > could ask if the relative cost and consequence made such an approach > cheaper. > > But in reality, I suspect that many of the vendors supplying products > made the change for all their clients. I bet Oracle might have offered > some combination of new and improved products to replace old ones or > tools that could be used to say read in a database in one format and > write it out again with wider date fields. > > The vast difference some allude to is realistic. Y2K swept the globe > in about 24 hours. No easy way to avoid it for many applications. > Someone running python 2.X on their own machines may be able to > continue living in their bubble for quite a while. If you sell or > share a product with python frozen into an app, it makes no > difference. But asking some clients to maintain multiple copies of > python set up so one app keeps running as all others use the newer > one, may not remain a great solution indefinitely. > > Has anyone considered something that may be at the edges. How well do > cooperating programs work together? 
I mean if program one processes > and saves some data structures using something like pickle, and > program two is supposed to read the pickle back in and continue > processing, then you may get anomalies of many kinds if they use > different pythons. Similarly, processes that start up other scripts > and communicate with them, may need to start newer programs that use > the 3.X or beyond version as no back-ported version exists. The bubble > may enlarge and may eventually burst. > > -Original Message----- > From: Python-list > On Behalf Of > Larry Martell > Sent: Friday, January 18, 2019 10:47 AM > To: Python > Subject: Re: Pythonic Y2K > > On Fri, Jan 18, 2019 at 10:43 AM Michael Torrie wrote: > > On 01/16/2019 12:02 PM, Avi Gross wrote: > > > I recall the days before the year 2000 with the Y2K scare when > > > people worried that legacy software might stop working or do > > > horrible things once the clock turned. It may even have been scary > > > enough for some companies to rewrite key applications and even > > > switch > > from languages like COBOL. > > > Of course it wasn't just a scare. The date rollover problem was > > very real. It's interesting that now we call it the Y2K "scare" and > > since most things came through that okay we often suppose that the > > people who were warning about this impending problem were simply > > being alarmist and prophets of doom. We often deride them. But the > > fact is, people did take these prophets of doom seriously and there > > was a massive, even heroic effort, to fix a lot of these critical > > backend systems so that disaster was avoided (just barely). I'm not > > talking about PCs rolling over to 00. I'm talking about banking > > software, mission critical control software. It certainly was scary > > enough for a lot of companies to spend a lot of money rewriting key > > software. The problem wasn't with COBOL necessarily. > > I had one client, a hedge fund, that I fixed literally 1000's of Y2K > issues for. 
When Y2K came and there were no problems, the owner said > to me "You made such a big deal about the Y2K thing, and nothing > happened." -- I would quite cheerfully have bought a ticket to watch and hear your reply, Larry. Or better yet, silently reached into your briefcase and brought out an invoice, listing what and where you patched, and what you would normally charge to find and fix each one individually when the gun went off for real 36 hours back and his fund was losing 1% an hour. Sometimes the truth shuts them up but there's usually some yelling involved. > https://mail.python.org/mailman/listinfo/python-list Cheers, Gene Heskett -- "There are four boxes to be used in defense of liberty: soap, ballot, jury, and ammo. Please use in that order." -Ed Howdershelt (Author) Gene's Web page <http://geneslinuxbox.net:6309/gene> -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On Fri, Jan 18, 2019 at 4:56 PM Avi Gross wrote: > > Larry, > > I keep hearing similar things about the Flu Vaccine. It only works 40% of > the time or whatever. But most of the people that get the flu get a > different strain they were not vaccinated against! That seems like a complete non-sequitur. What does that have to do with Y2K? But I will tell you something: I've never had a flu vaccine in my life and I've never had the flu. And I will tell you something else: I have proof that worrying works - 99% of the things I worry about never happen, so it must work. And please stop top posting. > There are hundreds of strains out there and by protecting the herd against > just a few, others will flourish. So was it worth it? > > Your argument would be that your work found lots of things related to Y2000 > that could have been a problem and therefore never got a chance to show. I > wonder if anyone did a case study and found an organization that refused to > budge and changed nothing, not even other products that were changed like > the OS? If such organizations had zero problems, that would be interesting. > If they had problems and rapidly had their software changed or fixed, that > would be another and we could ask if the relative cost and consequence made > such an approach cheaper. > > But in reality, I suspect that many of the vendors supplying products made > the change for all their clients. I bet Oracle might have offered some > combination of new and improved products to replace old ones or tools that > could be used to say read in a database in one format and write it out again > with wider date fields. > > The vast difference some allude to is realistic. Y2K swept the globe in > about 24 hours. No easy way to avoid it for many applications. Someone > running python 2.X on their own machines may be able to continue living in > their bubble for quite a while. If you sell or share a product with python > frozen into an app, it makes no difference. 
But asking some clients to > maintain multiple copies of python set up so one app keeps running as all > others use the newer one, may not remain a great solution indefinitely. > > Has anyone considered something that may be at the edges. How well do > cooperating programs work together? I mean if program one processes and > saves some data structures using something like pickle, and program two is > supposed to read the pickle back in and continue processing, then you may > get anomalies of many kinds if they use different pythons. Similarly, > processes that start up other scripts and communicate with them, may need to > start newer programs that use the 3.X or beyond version as no back-ported > version exists. The bubble may enlarge and may eventually burst. > > -Original Message- > From: Python-list On > Behalf Of Larry Martell > Sent: Friday, January 18, 2019 10:47 AM > To: Python > Subject: Re: Pythonic Y2K > > On Fri, Jan 18, 2019 at 10:43 AM Michael Torrie wrote: > > > > On 01/16/2019 12:02 PM, Avi Gross wrote: > > > I recall the days before the year 2000 with the Y2K scare when > > > people worried that legacy software might stop working or do > > > horrible things once the clock turned. It may even have been scary > > > enough for some companies to rewrite key applications and even switch > from languages like COBOL. > > > > Of course it wasn't just a scare. The date rollover problem was very > > real. It's interesting that now we call it the Y2K "scare" and since > > most things came through that okay we often suppose that the people > > who were warning about this impending problem were simply being > > alarmist and prophets of doom. We often deride them. But the fact > > is, people did take these prophets of doom seriously and there was a > > massive, even heroic effort, to fix a lot of these critical backend > > systems so that disaster was avoided (just barely). I'm not talking > > about PCs rolling over to 00. 
I'm talking about banking software, > > mission critical control software. It certainly was scary enough for > > a lot of companies to spend a lot of money rewriting key software. > > The problem wasn't with COBOL necessarily. > > I had one client, a hedge fund, that I fixed literally 1000's of Y2K issues > for. When Y2K came and there were no problems, the owner said to me "You > made such a big deal about the Y2K thing, and nothing happened." -- https://mail.python.org/mailman/listinfo/python-list
RE: Pythonic Y2K
Larry, I keep hearing similar things about the Flu Vaccine. It only works 40% of the time or whatever. But most of the people that get the flu get a different strain they were not vaccinated against! There are hundreds of strains out there and by protecting the herd against just a few, others will flourish. So was it worth it? Your argument would be that your work found lots of things related to Y2000 that could have been a problem and therefore never got a chance to show. I wonder if anyone did a case study and found an organization that refused to budge and changed nothing, not even other products that were changed like the OS? If such organizations had zero problems, that would be interesting. If they had problems and rapidly had their software changed or fixed, that would be another and we could ask if the relative cost and consequence made such an approach cheaper. But in reality, I suspect that many of the vendors supplying products made the change for all their clients. I bet Oracle might have offered some combination of new and improved products to replace old ones or tools that could be used to, say, read in a database in one format and write it out again with wider date fields. The vast difference some allude to is realistic. Y2K swept the globe in about 24 hours. No easy way to avoid it for many applications. Someone running python 2.X on their own machines may be able to continue living in their bubble for quite a while. If you sell or share a product with python frozen into an app, it makes no difference. But asking some clients to maintain multiple copies of python set up so one app keeps running as all others use the newer one, may not remain a great solution indefinitely. Has anyone considered something that may be at the edges? How well do cooperating programs work together? 
I mean if program one processes and saves some data structures using something like pickle, and program two is supposed to read the pickle back in and continue processing, then you may get anomalies of many kinds if they use different pythons. Similarly, processes that start up other scripts and communicate with them, may need to start newer programs that use the 3.X or beyond version as no back-ported version exists. The bubble may enlarge and may eventually burst. -Original Message- From: Python-list On Behalf Of Larry Martell Sent: Friday, January 18, 2019 10:47 AM To: Python Subject: Re: Pythonic Y2K On Fri, Jan 18, 2019 at 10:43 AM Michael Torrie wrote: > > On 01/16/2019 12:02 PM, Avi Gross wrote: > > I recall the days before the year 2000 with the Y2K scare when > > people worried that legacy software might stop working or do > > horrible things once the clock turned. It may even have been scary > > enough for some companies to rewrite key applications and even switch from languages like COBOL. > > Of course it wasn't just a scare. The date rollover problem was very > real. It's interesting that now we call it the Y2K "scare" and since > most things came through that okay we often suppose that the people > who were warning about this impending problem were simply being > alarmist and prophets of doom. We often deride them. But the fact > is, people did take these prophets of doom seriously and there was a > massive, even heroic effort, to fix a lot of these critical backend > systems so that disaster was avoided (just barely). I'm not talking > about PCs rolling over to 00. I'm talking about banking software, > mission critical control software. It certainly was scary enough for > a lot of companies to spend a lot of money rewriting key software. > The problem wasn't with COBOL necessarily. I had one client, a hedge fund, that I fixed literally 1000's of Y2K issues for. 
When Y2K came and there were no problems, the owner said to me "You made such a big deal about the Y2K thing, and nothing happened." -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list
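[Editor's note: the cross-version pickle anomaly described above is easy to demonstrate from the Python 3 side. A minimal sketch, with a made-up data structure for illustration: Python 2 only understands pickle protocols 0-2, so anything written with protocol 3 or higher is unreadable to a 2.x reader, and even at a shared protocol, text round-trips as different types across the boundary.]

```python
import pickle

# Hypothetical data a "program one" might save for "program two" to read.
data = {"as_of": "1999-12-31", "positions": [1, 2, 3]}

# Python 2 understands only pickle protocols 0-2; anything pickled with
# protocol 3 or higher is unreadable on the 2.x side.
blob_for_py2 = pickle.dumps(data, protocol=2)   # readable by both lines
blob_py3_only = pickle.dumps(data, protocol=3)  # Python 3 only

# The protocol is recorded in the pickle header: the PROTO opcode
# (b'\x80') is followed by the protocol number.
assert blob_for_py2[:2] == b"\x80\x02"
assert blob_py3_only[:2] == b"\x80\x03"

# Even with a shared protocol, type mismatches lurk: a Python 2 str
# arrives in Python 3 as bytes or str depending on the 'encoding'
# argument to pickle.loads(), so round-tripping text is not transparent.
roundtrip = pickle.loads(blob_for_py2)
assert roundtrip == data
```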
RE: Pythonic Y2K
ically at run time. Blue Screen of Death type of fail. You had to save your work regularly. Languages like python allow you to catch exceptions and deal intelligently with them to at least close down gracefully or even recover. Heck, many programs depend on this and instead of cluttering their code with lots of tests, wait for the error to happen and adjust in the rare case it does. So how reasonable would it be to still have lots of legacy software using languages and programmers that have no easy ways to make their programs more robust? How would such software react if it received information say in UNICODE? I predict that many people and companies have ignored warnings that the 2.X train would someday be diverted to a dead-end track. It was always far enough off in the future. But when a LAST DATE for updates is announced, some may sit up and take notice. It may literally take something like Insurance Companies (or the VC types) refusing to continue supporting them if they do not change, to get them the hint. And, over time, many companies do go under or are bought by another and often that will cause old projects to go away or morph. But there is nothing fundamentally wrong with using 2.X. As I said jokingly, if anyone wanted to keep it and support it as a DIFFERENT language than the more modern python, fine. -Original Message- From: Python-list On Behalf Of Michael Torrie Sent: Friday, January 18, 2019 10:36 AM To: python-list@python.org Subject: Re: Pythonic Y2K On 01/16/2019 12:02 PM, Avi Gross wrote: > I recall the days before the year 2000 with the Y2K scare when people > worried that legacy software might stop working or do horrible things > once the clock turned. It may even have been scary enough for some > companies to rewrite key applications and even switch from languages like COBOL. Of course it wasn't just a scare. The date rollover problem was very real. 
It's interesting that now we call it the Y2K "scare" and since most things came through that okay we often suppose that the people who were warning about this impending problem were simply being alarmist and prophets of doom. We often deride them. But the fact is, people did take these prophets of doom seriously and there was a massive, even heroic effort, to fix a lot of these critical backend systems so that disaster was avoided (just barely). I'm not talking about PCs rolling over to 00. I'm talking about banking software, mission critical control software. It certainly was scary enough for a lot of companies to spend a lot of money rewriting key software. The problem wasn't with COBOL necessarily. In the end disaster was averted (rather narrowly) thanks to the hard work of a lot of people, and thanks to the few people who were vocal in warning of the impending issue. That said, I'm not sure Python 2.7's impending EOL is comparable to the Y2K crisis. -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list
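[Editor's note: the "wait for the error to happen and adjust" style mentioned above is Python's EAFP idiom (easier to ask forgiveness than permission). A minimal sketch; the field-parsing function is a made-up example, not anything from the thread.]

```python
def to_year(field):
    """EAFP: try the common case and handle the rare failure,
    instead of cluttering the code with up-front validity tests."""
    try:
        return int(field)
    except (TypeError, ValueError):
        return None  # degrade gracefully rather than crash outright

# Works for clean input, survives dirty input without a pre-check.
assert to_year("1999") == 1999
assert to_year("19xx") is None
assert to_year(None) is None
```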
Re: Pythonic Y2K
On 17/01/2019 02:34, Avi Gross wrote:
> but all it took was to set the clock forward on a test system and look for anomalies.
You're new to programming or you're not very old and certainly haven't run much pre-Y2K software. ;-) Issues that needed solving:
- 2 digits only for the date
- use of 99 or 00 in the year for "magic" purposes
- software that didn't know 2000 was a leap year
One machine I had to update (all in Z80 assembler) had the date in the format DD-MMM-199Y, i.e. only the units digit of the year could be changed. There was a comment in the date code saying:
; marketing say this machine will stop being sold in 1993 so there
; is no need to support anything other than 1990-1999 in the year
It was still being sold in 2002! The reason there were so few Y2K issues was that things were fixed in advance. And yes, there was also lots of stupidity and hysterics from people who didn't know. One major change: after people started fixing this issue in earnest, dates started to always include the century digits. Now get off my lawn! :-) -- https://mail.python.org/mailman/listinfo/python-list
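[Editor's note: both failure modes above are still visible from modern Python. A small sketch: strptime's %y applies the POSIX pivot (00-68 map to the 2000s, 69-99 to the 1900s) rather than naively prefixing "19", and the year 2000 is precisely where the divisible-by-100 and divisible-by-400 clauses of the leap-year rule collide.]

```python
import calendar
from datetime import datetime

# A two-digit year is ambiguous. strptime's %y uses the POSIX pivot,
# so "00" becomes 2000; pre-Y2K code that simply prefixed "19" got 1900.
d = datetime.strptime("01-01-00", "%d-%m-%y")
assert d.year == 2000

def naive_leap(y):
    # The buggy shortcut: stops before the divisible-by-400 clause,
    # so it wrongly classifies 2000 as a common year.
    return y % 4 == 0 and y % 100 != 0

assert naive_leap(2000) is False        # the wrong answer
assert calendar.isleap(2000) is True    # the full three-clause rule
```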
Re: Pythonic Y2K
On 2019-01-18, Dennis Lee Bieber wrote: > Hey... I'm still waiting for a novelization of the TRS-DOS date "bug". > TRS-DOS directory structure only allocated 3-bits for the year. Three bits for the year? They didn't expect those computers to last long, eh? [My current Thinkpad is over 10 years old.] -- Grant Edwards   grant.b.edwards at gmail.com   Yow! BARBARA STANWYCK makes me nervous!! -- https://mail.python.org/mailman/listinfo/python-list
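[Editor's note: three bits really don't go far. A toy sketch of why such a field wraps; the 1980 base year is a guess for illustration only, the real TRS-DOS layout may differ.]

```python
YEAR_BITS = 3
BASE_YEAR = 1980  # hypothetical epoch for illustration

span = 1 << YEAR_BITS                     # 3 bits encode 2**3 = 8 values
years = [BASE_YEAR + n for n in range(span)]
assert span == 8
assert years[0] == 1980 and years[-1] == 1987

# Storing the ninth year wraps silently once masked back into the field:
assert (1988 - BASE_YEAR) & (span - 1) == 0  # indistinguishable from 1980
```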
RE: Pythonic Y2K
Reminds me of a similar problem that didn't get noticed until it did actually hit: In 2007 the first time a group of F-22s crossed the international date line every computer system in the aircraft crashed, losing comms, navigation, avionics, and a host of other systems. Fortunately their engines, landing gear, and enough other systems still worked, so they were able to visually follow their refueling tankers back to Hawaii and land, where they had to sit for a couple of days before Lockheed could patch their software. If the circumstances had been a little different they could have lost a whole group of shiny new $150 million aircraft to a software bug and left a bunch of pilots floating in life rafts for a while in the middle of the Pacific. -Original Message- From: Python-list [mailto:python-list-bounces+david.raymond=tomtom@python.org] On Behalf Of Michael Torrie Sent: Friday, January 18, 2019 10:36 AM To: python-list@python.org Subject: Re: Pythonic Y2K On 01/16/2019 12:02 PM, Avi Gross wrote: > I recall the days before the year 2000 with the Y2K scare when people > worried that legacy software might stop working or do horrible things once > the clock turned. It may even have been scary enough for some companies to > rewrite key applications and even switch from languages like COBOL. Of course it wasn't just a scare. The date rollover problem was very real. It's interesting that now we call it the Y2K "scare" and since most things came through that okay we often suppose that the people who were warning about this impending problem were simply being alarmist and prophets of doom. We often deride them. But the fact is, people did take these prophets of doom seriously and there was a massive, even heroic effort, to fix a lot of these critical backend systems so that disaster was avoided (just barely). I'm not talking about PCs rolling over to 00. I'm talking about banking software, mission critical control software. 
It certainly was scary enough for a lot of companies to spend a lot of money rewriting key software. The problem wasn't with COBOL necessarily. In the end disaster was averted (rather narrowly) thanks to the hard work of a lot of people, and thanks to the few people who were vocal in warning of the impending issue. That said, I'm not sure Python 2.7's impending EOL is comparable to the Y2K crisis. -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On Fri, Jan 18, 2019 at 10:43 AM Michael Torrie wrote: > > On 01/16/2019 12:02 PM, Avi Gross wrote: > > I recall the days before the year 2000 with the Y2K scare when people > > worried that legacy software might stop working or do horrible things once > > the clock turned. It may even have been scary enough for some companies to > > rewrite key applications and even switch from languages like COBOL. > > Of course it wasn't just a scare. The date rollover problem was very > real. It's interesting that now we call it the Y2K "scare" and since > most things came through that okay we often suppose that the people who > were warning about this impending problem were simply being alarmist and > prophets of doom. We often deride them. But the fact is, people did > take these prophets of doom seriously and there was a massive, even > heroic effort, to fix a lot of these critical backend systems so that > disaster was avoided (just barely). I'm not talking about PCs rolling > over to 00. I'm talking about banking software, mission critical > control software. It certainly was scary enough for a lot of companies > to spend a lot of money rewriting key software. The problem wasn't with > COBOL necessarily. I had one client, a hedge fund, that I fixed literally 1000's of Y2K issues for. When Y2K came and there were no problems, the owner said to me "You made such a big deal about the Y2K thing, and nothing happened." -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On 01/16/2019 12:02 PM, Avi Gross wrote: > I recall the days before the year 2000 with the Y2K scare when people > worried that legacy software might stop working or do horrible things once > the clock turned. It may even have been scary enough for some companies to > rewrite key applications and even switch from languages like COBOL. Of course it wasn't just a scare. The date rollover problem was very real. It's interesting that now we call it the Y2K "scare" and since most things came through that okay we often suppose that the people who were warning about this impending problem were simply being alarmist and prophets of doom. We often deride them. But the fact is, people did take these prophets of doom seriously and there was a massive, even heroic effort, to fix a lot of these critical backend systems so that disaster was avoided (just barely). I'm not talking about PCs rolling over to 00. I'm talking about banking software, mission critical control software. It certainly was scary enough for a lot of companies to spend a lot of money rewriting key software. The problem wasn't with COBOL necessarily. In the end disaster was averted (rather narrowly) thanks to the hard work of a lot of people, and thanks to the few people who were vocal in warning of the impending issue. That said, I'm not sure Python 2.7's impending EOL is comparable to the Y2K crisis. -- https://mail.python.org/mailman/listinfo/python-list
RE: Pythonic Y2K
Which brings up the assumption that this whole A.D. thing is gonna stick around for more than a few millennia and isn't just a fad. Sloppy to just use positive for A.D. and negative for B.C. without a discrete unit for Age. What happens when Sauron is defeated and the Third Age is declared? Or if we go back to "in the 65th year of the reign of Elizabeth, second of her name"? Or if someone declares a new epoch and that this is really the year 49 A.U. (Anno Unixy)? -Original Message- From: Python-list [mailto:python-list-bounces+david.raymond=tomtom@python.org] On Behalf Of Avi Gross Sent: Thursday, January 17, 2019 5:46 PM To: python-list@python.org Subject: RE: Pythonic Y2K Ian, You just scared me. It is 2019, which has four digits. In less than 8,000 years we will need to take the fifth to make numbers from 10,000 to 10,999. 90,000 years later we will need a sixth digit and so on. Do you know how many potential Y2K-like anomalies we may have between now and year 292,277,026,596 when it may all be over? Will people ever learn and just set aside lots of room that dates can grow into, or allow varying lengths? Makes me wonder though why anyone in the distant future would want numbers that long to represent that date. I suspect that long before then, some surviving members of whatever the human race becomes will do a reset to a new calendar such as the date the first settlers arrived in the Gamma Quadrant. So whatever unit they store time in, though, may still need a way to reach back to historic times just as we do by talking about what may have happened in 2000 B.C. -Original Message- From: Python-list On Behalf Of Ian Kelly Sent: Thursday, January 17, 2019 2:14 PM To: Python Subject: Re: Pythonic Y2K On Wed, Jan 16, 2019 at 9:57 PM Avi Gross wrote: > > The forthcoming UNIX 2038 problem will, paradoxically happen on > January 19. I wonder what they will do long before then. Will they just add a byte or four or 256 and then make a date measurable in picoseconds? 
Or will they start using a number format that can easily deal with 1 Million B.C. and 5 Billion A.D. just in case we escape earth before it co-locates with the expanding sun. The obvious solution is to stop using 32-bit Unix timestamps and start using 64-bit Unix timestamps. This change has already been made in some OSes, and the problem will not repeat until the year 292,277,026,596, by which time it is highly unlikely that either Unix timestamps or humankind itself will still exist. Even if they will, that moment in time is so far out from the present that I can't really be bothered by the possibility. We have 19 years to take care of the problem before it happens. Hopefully this time around we won't be trying to fix it right up until the last minute. -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list
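[Editor's note: the figures in the exchange above check out, and the rollover is easy to verify from Python. A quick sketch of where a signed 32-bit time_t stops and what 64 bits buys:]

```python
import datetime

UTC = datetime.timezone.utc
INT32_MAX = 2**31 - 1  # largest value a signed 32-bit time_t can hold

# The last second a signed 32-bit counter can represent:
rollover = datetime.datetime.fromtimestamp(INT32_MAX, tz=UTC)
assert rollover == datetime.datetime(2038, 1, 19, 3, 14, 7, tzinfo=UTC)

# One tick later, two's-complement arithmetic wraps to the most
# negative value, which lands back in 1901:
wrapped = (INT32_MAX + 1) - 2**32
underflow = datetime.datetime.fromtimestamp(wrapped, tz=UTC)
assert underflow.year == 1901

# A 64-bit time_t postpones the wrap by roughly 292 billion years.
years_of_headroom = (2**63 - 1) / (365.25 * 24 * 3600)
assert years_of_headroom > 2.9e11
```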
Re: Pythonic Y2K
Back in the computer world, Y2K gave such managers some cover. There was a FIRM deadline. I wonder how many used the impending arrival of the year 2000 as an excuse to perhaps clean up other parts of their act and charge it to prevention. I mean they might suggest they rewrite some legacy COBOL or even machine language programs into something more modern or other improvements like getting a new database including new hardware. Of course we did! However, as pointed-out elsewhere, sometimes the costs of re-writing seemed less than those required to ameliorate any number of unknown issues in the legacy code. Remembering that we would also remove unneeded cruft, and (usually) add features needed for 'today's use'. What were you saying about politicians 'playing' with retirement funds and public money? The other advantage to a re-write decision was even more under-hand: once agreed, that became a dev project (with a 31Dec1999 drop-dead deadline) and was NO LONGER part of the Y2K project, ie no longer 'my problem'! I recall at least one project where the users over-egged their case (IMHO), taking the dev option even against my advice. They failed to make the deadline. Let's just say, on their part, there was a lot of fancy-footwork during the first days of 2000... I also wonder if jobs for some programmers declined sharply in the years after when not only were they not desperately needed, but perhaps not needed at all unless they developed new talents. No, quite the opposite. What happened was that many other projects were put-off pending Y2K amelioration. Once we could release staff, they were greeted with open arms, and often far, far, greater appreciation than normally meets a new dev upon arrival. Just FYI, the name Y2K always struck me as similar nonsense. They abbreviated Year and 2000 from at least 8 characters to 3 and did it wrong as 2K is 2048. As far as I know, nothing special will happen in 2048 and I also have no special vision for 2020. 
You don't seem to understand journalism: Never let the truth (facts) interfere with a 'good story'! I was just talking with a (tech) librarian, who had asked me about "the Unix Millennium bug" a few weeks ago, and mentioned this thread. He groaned, wondering how long it will be before some hack writes a sensationalist book with which to greet the end of the (binary) world... My play-time this afternoon will involve using Python to keep (time-code) track of when/where to superimpose components into a video stream... (I should be back before 2038) -- Regards =dn -- https://mail.python.org/mailman/listinfo/python-list
RE: Pythonic Y2K
Ian, You just scared me. It is 2019, which has four digits. In less than 8,000 years we will need to take the fifth to make numbers from 10,000 to 10,999. 90,000 years later we will need a sixth digit and so on. Do you know how many potential Y2K-like anomalies we may have between now and year 292,277,026,596 when it may all be over? Will people ever learn and just set aside lots of room that dates can grow into, or allow varying lengths? Makes me wonder though why anyone in the distant future would want numbers that long to represent that date. I suspect that long before then, some surviving members of whatever the human race becomes will do a reset to a new calendar such as the date the first settlers arrived in the Gamma Quadrant. So whatever unit they store time in, though, may still need a way to reach back to historic times just as we do by talking about what may have happened in 2000 B.C. -Original Message- From: Python-list On Behalf Of Ian Kelly Sent: Thursday, January 17, 2019 2:14 PM To: Python Subject: Re: Pythonic Y2K On Wed, Jan 16, 2019 at 9:57 PM Avi Gross wrote: > > The forthcoming UNIX 2038 problem will, paradoxically happen on > January 19. I wonder what they will do long before then. Will they just add a byte or four or 256 and then make a date measurable in picoseconds? Or will they start using a number format that can easily deal with 1 Million B.C. and 5 Billion A.D. just in case we escape earth before it co-locates with the expanding sun. The obvious solution is to stop using 32-bit Unix timestamps and start using 64-bit Unix timestamps. This change has already been made in some OSes, and the problem will not repeat until the year 292,277,026,596, by which time it is highly unlikely that either Unix timestamps or humankind itself will still exist. Even if they will, that moment in time is so far out from the present that I can't really be bothered by the possibility. We have 19 years to take care of the problem before it happens. 
Hopefully this time around we won't be trying to fix it right up until the last minute. -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list
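For anyone curious where these two dates come from, the arithmetic is easy to check. A minimal sketch (the year figure is a rough approximation that ignores leap years):

```python
from datetime import datetime, timezone

# A signed 32-bit time_t overflows at 2**31 - 1 seconds after the Unix
# epoch (1970-01-01 UTC) -- which is where "January 19, 2038" comes from.
overflow = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
print(overflow)  # 2038-01-19 03:14:07+00:00

# A signed 64-bit counter pushes the rollover out by hundreds of billions
# of years, which is why the year 292,277,026,596 shows up in this thread.
years = (2**63 - 1) // (365 * 24 * 60 * 60)
print(f"roughly {years:,} years of range")
```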
RE: Pythonic Y2K
Well said, Joseph. Unfortunately, many companies are run these days with a view toward the IMMEDIATE bottom line. I mean numbers like revenues or expenses are seen short-term. If a project stops development on new things and spends time redoing old things, there are expenses recorded with no revenues. Another organization might go ahead and get up to date and a few years down the road their projects are sailing along while the first employer keeps running into obstacles and is not able to get new developments and so on. But often, the manager making the decision will have taken their bonus or promotion and perhaps moved on or retired.

Not to be political, I note many government entities, especially including the state I live in, have gigantic pension obligations they have no easy way to meet. Over the years in negotiations with unions they have often traded rich promises for the future instead of immediate pay hikes. The costs may largely be invisible and did not impact their current budgets so they could waste money on other things such as giveaways that help them get re-elected to ever higher office. But the costs are getting very visible now, and especially when the stocks they invest in decline.

Back in the computer world, Y2K gave such managers some cover. There was a FIRM deadline. I wonder how many used the impending arrival of the year 2000 as an excuse to perhaps clean up other parts of their act and charge it to prevention. I mean they might suggest they rewrite some legacy COBOL or even machine language programs into something more modern or other improvements like getting a new database including new hardware. I also wonder if jobs for some programmers declined sharply in the years after when not only were they not desperately needed, but perhaps not needed at all unless they developed new talents.

Just FYI, the name Y2K always struck me as similar nonsense. They abbreviated Year and 2000 from at least 8 characters to 3 and did it wrong as 2K is 2048. 
As far as I know, nothing special will happen in 2048 and I also have no special vision for 2020. -Original Message- From: Python-list On Behalf Of Schachner, Joseph Sent: Thursday, January 17, 2019 1:46 PM To: Python Subject: RE: Pythonic Y2K I'd like to add one more thing to your list of what companies will have to consider: 6) The ability to hire and retain employees who will be happy to program in an obsolete version of Python. A version about which new books will probably not be written. A version which new packages will not support. A version which most other companies will no longer be using, so programming only in Python 2 will place the employee at a disadvantage compared to others who have gained experience with Python 3 if they ever have to change employers. --- Joseph S. -Original Message- From: Chris Angelico Sent: Wednesday, January 16, 2019 2:15 PM To: Python Subject: Re: Pythonic Y2K On Thu, Jan 17, 2019 at 6:04 AM Avi Gross wrote: > > I see messages like the following where someone is still asking how to > do something in some version of python 2.X. > > I recall the days before the year 2000 with the Y2K scare when people > worried that legacy software might stop working or do horrible things > once the clock turned. It may even have been scary enough for some > companies to rewrite key applications and even switch from languages like COBOL. > > What is happening in the python community, and especially in places > where broken software may be a serious problem? > > I assume versions of python 2.X will continue to be available for some > time but without any further support and without features being back-ported. Commercial support for Python 2 will probably continue for a while, in the same way that support for versions older than 2.7 is still available to Red Hat customers today (if I'm not mistaken). Otherwise, well, the software will continue without updates or security patches until it breaks. 
Companies will have to weigh up five costs against each other: 1) The cost of the status quo: the risk of critical failures or external attacks against unsupported and unpatched software 2) The cost of migrating to Python 3 3) The cost of migrating to a completely different language 4) The cost of maintaining their own local fork of Python 2 5) The cost of using a supported commercial platform such as RHEL. For most small to medium projects, it's probably going to come down to #1 or #2, where #1 has the laziness bonus. For many larger companies, #1 is an unpayable cost. Everyone has to make that choice, and remember that "cost" doesn't just mean money (for instance, the cost of moving to Linux might be quite considerable for a Windows shop, and even within a Linux ecosystem, switching to Red Hat may have consequences to other programs you might need). ChrisA -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On Fri, Jan 18, 2019 at 8:47 AM DL Neil wrote: > > On 17/01/19 6:53 PM, Chris Angelico wrote: > > On Thu, Jan 17, 2019 at 3:55 PM Avi Gross wrote: > >> The forthcoming UNIX 2038 problem will, paradoxically happen on January 19. > >> > > > > Paradoxically? What do you mean by that? > > > First we had to duck the Y2K problem. > By moving everything to 64-bits, we duck the Unix Millennium problem. > > There you go: two ducks - a pair-o-ducks!

Well, I'm sorry Neil, but these things are well documented. In fact, you can find information on the web, OR you can examine the man pages in your local installation. Wait, we're right back where we started... a pair-o-docs.

> I assume the paradox involves noting that the end of the (32-bit) Unix > epoch, does not coincide with the end of a (Gregorian) calendar year.

Ahh yes, as paradoxical as when the Mayan Y2K happened in December of 2012. Gotcha.

> Actually, aren't there three date-time problems to be ducked? Wasn't > Python's move to having wider timestamps (+= fractions of seconds) in > part connected with the need to modify NTP - which hits the wall a few > years before 2038...

Oh, probably. But one of the reasons I use high level languages is so I don't have to worry about the sizes of integers. In fact, some day, we won't use floats to store seconds, we'll just use bignum integers to store some number of Planck times. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On 17/01/19 6:53 PM, Chris Angelico wrote: On Thu, Jan 17, 2019 at 3:55 PM Avi Gross wrote: The forthcoming UNIX 2038 problem will, paradoxically happen on January 19. Paradoxically? What do you mean by that? First we had to duck the Y2K problem. By moving everything to 64-bits, we duck the Unix Millennium problem. There you go: two ducks - a pair-o-ducks! I assume the paradox involves noting that the end of the (32-bit) Unix epoch, does not coincide with the end of a (Gregorian) calendar year. Actually, aren't there three date-time problems to be ducked? Wasn't Python's move to having wider timestamps (+= fractions of seconds) in part connected with the need to modify NTP - which hits the wall a few years before 2038... -- Regards =dn -- https://mail.python.org/mailman/listinfo/python-list
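The "wider timestamps" change alluded to above is, if I'm not mistaken, PEP 564, which added nanosecond-resolution clocks in Python 3.7 precisely because float seconds run out of precision. A quick sketch:

```python
import time

# time.time() returns float seconds; at current epoch values a C double
# has only about microsecond resolution left, which PEP 564 set out to fix.
print(time.time())      # float seconds since the epoch

# time.time_ns() returns an int count of nanoseconds since the epoch, so
# it never loses precision no matter how large the timestamp grows.
print(time.time_ns())   # integer nanoseconds since the epoch
```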
Re: Pythonic Y2K
On 2019-01-17, Schachner, Joseph wrote: > I'd like to add one more thing to your list of what companies will have to > consider: > > 6) The ability to hire and retain employees who will be happy to > program in an obsolete version of Python. A version about which > new books will probably not be written. A version which new > packages will not support. A version which most other companies > will no longer be using, so programming only in Python 2 will > place the employee at a disadvantage compared to others who have > gained experience with Python 3 if they ever have to change > employers.

IMO, that's a non-issue. AFAICT, a pretty large percentage of SW developers are using obsolete or proprietary tools that aren't cool and fashionable, don't have outside support, about which books aren't being written, and for which third-party packages and libraries don't exist. I also don't think an experienced Python2 developer would be turned down for a position on a project that's using Python3.

-- Grant Edwards grant.b.edwards at gmail.com Yow! Do you guys know we just passed thru a BLACK HOLE in space? -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On Wed, Jan 16, 2019 at 9:57 PM Avi Gross wrote: > > The forthcoming UNIX 2038 problem will, paradoxically happen on January 19. I wonder what they will do long before then. Will they just add a byte or four or 256 and then make a date measurable in picoseconds? Or will they start using a number format that can easily deal with 1 Million B.C. and 5 Billion A.D. just in case we escape earth before it co-locates with the expanding sun. The obvious solution is to stop using 32-bit Unix timestamps and start using 64-bit Unix timestamps. This change has already been made in some OSes, and the problem will not repeat until the year 292,277,026,596, by which time it is highly unlikely that either Unix timestamps or humankind itself will still exist. Even if they will, that moment in time is so far out from the present that I can't really be bothered by the possibility. We have 19 years to take care of the problem before it happens. Hopefully this time around we won't be trying to fix it right up until the last minute. -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On Fri, Jan 18, 2019 at 5:48 AM Schachner, Joseph wrote: > > I'd like to add one more thing to your list of what companies will have to > consider: > > 6) The ability to hire and retain employees who will be happy to program in > an obsolete version of Python. A version about which new books will probably > not be written. A version which new packages will not support. A version > which most other companies will no longer be using, so programming only in > Python 2 will place the employee at a disadvantage compared to others who > have gained experience with Python 3 if they ever have to change employers. > The costs I described were _alternatives_, so what you've really described here is one component of the costs of status quo, maintaining a local fork, or using commercial Py2 support (options 1, 4, and 5). It is definitely a cost, though, and may at some point become the impetus for a company to migrate to Py3... but frankly, I doubt it. "Legacy code" is a pretty big thing in a lot of places. Just search thedailywtf.com for "legacy" and you'll find plenty of stories of terrible codebases held together by duct tape and blue tack, run through a hodge-podge of different language interpreters, or even just one interpreter but the wrong one... https://thedailywtf.com/articles/VB_0x2b__0x2b_ By comparison to THAT, a "legacy system" that runs a clean Python 2.7 is going to be pretty tame. Until someone says, oh hey, we should use function annotations, and then starts running all their code (in production) via 2to3... naw, that'll never happen, right? Oh. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
RE: Pythonic Y2K
I'd like to add one more thing to your list of what companies will have to consider: 6) The ability to hire and retain employees who will be happy to program in an obsolete version of Python. A version about which new books will probably not be written. A version which new packages will not support. A version which most other companies will no longer be using, so programming only in Python 2 will place the employee at a disadvantage compared to others who have gained experience with Python 3 if they ever have to change employers. --- Joseph S. -Original Message- From: Chris Angelico Sent: Wednesday, January 16, 2019 2:15 PM To: Python Subject: Re: Pythonic Y2K On Thu, Jan 17, 2019 at 6:04 AM Avi Gross wrote: > > I see messages like the following where someone is still asking how to > do something in some version of python 2.X. > > I recall the days before the year 2000 with the Y2K scare when people > worried that legacy software might stop working or do horrible things > once the clock turned. It may even have been scary enough for some > companies to rewrite key applications and even switch from languages like > COBOL. > > What is happening in the python community, and especially in places > where broken software may be a serious problem? > > I assume versions of python 2.X will continue to be available for some > time but without any further support and without features being back-ported. Commercial support for Python 2 will probably continue for a while, in the same way that support for versions older than 2.7 is still available to Red Hat customers today (if I'm not mistaken). Otherwise, well, the software will continue without updates or security patches until it breaks. 
Companies will have to weigh up five costs against each other: 1) The cost of the status quo: the risk of critical failures or external attacks against unsupported and unpatched software 2) The cost of migrating to Python 3 3) The cost of migrating to a completely different language 4) The cost of maintaining their own local fork of Python 2 5) The cost of using a supported commercial platform such as RHEL. For most small to medium projects, it's probably going to come down to #1 or #2, where #1 has the laziness bonus. For many larger companies, #1 is an unpayable cost. Everyone has to make that choice, and remember that "cost" doesn't just mean money (for instance, the cost of moving to Linux might be quite considerable for a Windows shop, and even within a Linux ecosystem, switching to Red Hat may have consequences to other programs you might need). ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On Thu, Jan 17, 2019 at 3:55 PM Avi Gross wrote: > The forthcoming UNIX 2038 problem will, paradoxically happen on January 19. > Paradoxically? What do you mean by that? ChrisA -- https://mail.python.org/mailman/listinfo/python-list
RE: Pythonic Y2K
Dave, You have me worried now. Yes, years from now you may need experts who can handle not just 2.X but specific versions like 2.4. Assuming python keeps making incompatible changes and is up to version 9.02, you may also have 3.X experts and 4.X experts and so on. Of course, by then, some of the experts may be an AI specializing ...

All kidding aside, I have to wonder if future developments may result in new categories of computer languages that are designed anew in radically different ways, to the point where it may get more people to switch and more languages to halt further development. I see Unicode as a potential driver. The number of symbols in languages like python is fixed by what can be seen on a normal keyboard. That keyboard needs to change a bit, or some virtual version, to support lots more. When that happens, we won't be forced to do as much sharing and overloading as we do now. How many ways is "%" used, including within one line of code?

>>> print("%d %s " % (9 % 5, "ways to use %"))
4 ways to use %

Similarly, {} can be used for dictionaries or sets, but empty braces create a dictionary, not a set. [] can be used to index lists by number or dictionaries by key. There are not many such sets of paired characters available: <> is reserved for other uses, and parentheses also have an odd role with tuples as well as keeping things in some order of operations. Imagine adding a few more matched symbols, including some you can define for your own newly created kinds of data, like matrices. Similarly, you could have an abbreviated way of defining additional operations if you could just use some common mathematical symbols that are not in ASCII, not to mention some dingbats.

If a programming language leaped across the ASCII divide (and I am sure some have, including the languages that used backspace to make overwritten multi-character operators) I can see ways to make more compact but less confusing languages. 
I admit that might confuse some people, especially some that only really know one language. I am used to multiple languages including some with rather unique character sets and perhaps may be the only one willing to use such a language. OK, sort of kidding. I have seen many forums like this one (and not just about computer languages) where I encounter true believers that do not welcome any suggestion that there may be other things out there with some merit or that their own may change. I welcome change and am interested in different ways of thinking. This makes it harder for me to quite see the viewpoint that I associate with stasis. But, to each their own. Perhaps literally. -Original Message- From: Python-list On Behalf Of DL Neil Sent: Wednesday, January 16, 2019 11:04 PM To: Python Subject: Re: Pythonic Y2K On 17/01/19 4:45 PM, Larry Martell wrote: > On Wed, Jan 16, 2019 at 9:35 PM Avi Gross wrote: >> >> Chris, >> >> The comparison to Y2K was not a great one. I am not sure what people >> did in advance, but all it took was to set the clock forward on a >> test system and look for anomalies. Not everything would be found but it gave some hints. > > Clearly you did not live through that. I did and I got over 2 years of > real work from it. Companies hired me to check their code and find > their Y2K exposures. Things like a hard coded '19' being added to a 2 > digit year. Or code that only allocated 2 bytes for the year. I could > go on and on. At one client I had I found over 4,000 places in their > code that needed to be modified. And there was no widespread use of > VMs that you could easily and quickly spin up for testing. It was a > real problem but because of many people like me, it was dealt with. > Now the next thing to deal with is the Jan. 19, 2038 problem. I'll be > 80 then, but probably still writing code. Call me if you need me. Same. The easy part was finding the hardware, configuring identical systems, and changing the date-era. 
Remember that we pre-dated TDD, so we pretty much re-designed entire testing suites! The most difficult work was with the oldest systems - for which there was no/little/worthless documentation, and usually no dev staff with 'memory'. Then there were the faults in OpSys and systems programs on which we could supposedly rely - I still have a couple of certificates somewhere, for diagnosing faults which MSFT had not found... The difficulty of multi-layer fault-finding is an order of magnitude more difficult than Python debugging alone! I'm told there are fewer and fewer COBOL programmers around, and those that survive can command higher rates as a consequence. Would going 'back' to that be regarded as "up" skilling? Does this imply that there might one day be a premium chargeable by Py2.n coders? -- Regards =dn -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list
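Avi's point above about Python reusing a handful of ASCII symbols is easy to check at the interpreter; a small sketch:

```python
# '%' is both the modulo operator and the old string-formatting operator:
print("%d %s" % (9 % 5, "ways to use %"))    # -> 4 ways to use %

# '{}' builds dicts and sets, but the EMPTY braces are always a dict;
# an empty set has to be spelled set():
print(type({}).__name__)        # -> dict
print(type({1, 2}).__name__)    # -> set
print(type(set()).__name__)     # -> set (the only spelling of an empty set)

# '[]' indexes lists by position and dicts by key, and also builds lists:
items = ["a", "b"]
table = {"key": "value"}
print(items[1], table["key"])   # -> b value
```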
RE: Pythonic Y2K
that it fills multiple pages. I would not show most beginners too much of python at first, or just ask them to take some things on faith for a while. A language that initially claimed to be designed to do things pretty much ONE way has miserably failed as I can do almost anything a dozen ways. That is NOT a bad thing. As long as the goal is for someone to learn how to do something in any one way, it is great. If you want them to be able to read existing code and modify it, it can be a headache especially when people abuse language features. And yes, I am an abuser in that sense. -Original Message- From: Python-list On Behalf Of Larry Martell Sent: Wednesday, January 16, 2019 10:46 PM To: Python Subject: Re: Pythonic Y2K On Wed, Jan 16, 2019 at 9:35 PM Avi Gross wrote: > > Chris, > > The comparison to Y2K was not a great one. I am not sure what people > did in advance, but all it took was to set the clock forward on a test > system and look for anomalies. Not everything would be found but it gave some > hints. Clearly you did not live through that. I did and I got over 2 years of real work from it. Companies hired me to check their code and find their Y2K exposures. Things like a hard coded '19' being added to a 2 digit year. Or code that only allocated 2 bytes for the year. I could go on and on. At one client I had I found over 4,000 places in their code that needed to be modified. And there was no widespread use of VMs that you could easily and quickly spin up for testing. It was a real problem but because of many people like me, it was dealt with. Now the next thing to deal with is the Jan. 19, 2038 problem. I'll be 80 then, but probably still writing code. Call me if you need me. -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On 17/01/19 4:45 PM, Larry Martell wrote: On Wed, Jan 16, 2019 at 9:35 PM Avi Gross wrote: Chris, The comparison to Y2K was not a great one. I am not sure what people did in advance, but all it took was to set the clock forward on a test system and look for anomalies. Not everything would be found but it gave some hints. Clearly you did not live through that. I did and I got over 2 years of real work from it. Companies hired me to check their code and find their Y2K exposures. Things like a hard coded '19' being added to a 2 digit year. Or code that only allocated 2 bytes for the year. I could go on and on. At one client I had I found over 4,000 places in their code that needed to be modified. And there was no widespread use of VMs that you could easily and quickly spin up for testing. It was a real problem but because of many people like me, it was dealt with. Now the next thing to deal with is the Jan. 19, 2038 problem. I'll be 80 then, but probably still writing code. Call me if you need me. Same. The easy part was finding the hardware, configuring identical systems, and changing the date-era. Remember that we pre-dated TDD, so we pretty much re-designed entire testing suites! The most difficult work was with the oldest systems - for which there was no/little/worthless documentation, and usually no dev staff with 'memory'. Then there were the faults in OpSys and systems programs on which we could supposedly rely - I still have a couple of certificates somewhere, for diagnosing faults which MSFT had not found... The difficulty of multi-layer fault-finding is an order of magnitude more difficult than Python debugging alone! I'm told there are fewer and fewer COBOL programmers around, and those that survive can command higher rates as a consequence. Would going 'back' to that be regarded as "up" skilling? Does this imply that there might one day be a premium chargeable by Py2.n coders? -- Regards =dn -- https://mail.python.org/mailman/listinfo/python-list
Re: Pythonic Y2K
On Wed, Jan 16, 2019 at 9:35 PM Avi Gross wrote: > > Chris, > > The comparison to Y2K was not a great one. I am not sure what people did in > advance, but all it took was to set the clock forward on a test system and > look for anomalies. Not everything would be found but it gave some hints. Clearly you did not live through that. I did and I got over 2 years of real work from it. Companies hired me to check their code and find their Y2K exposures. Things like a hard coded '19' being added to a 2 digit year. Or code that only allocated 2 bytes for the year. I could go on and on. At one client I had I found over 4,000 places in their code that needed to be modified. And there was no widespread use of VMs that you could easily and quickly spin up for testing. It was a real problem but because of many people like me, it was dealt with. Now the next thing to deal with is the Jan. 19, 2038 problem. I'll be 80 then, but probably still writing code. Call me if you need me. -- https://mail.python.org/mailman/listinfo/python-list
RE: Pythonic Y2K
Chris, The comparison to Y2K was not a great one. I am not sure what people did in advance, but all it took was to set the clock forward on a test system and look for anomalies. Not everything would be found but it gave some hints. Similarly, it is trivial today to take a machine and install only the new python version and try it, albeit many programs may have effects far away across the internet in some apps and harder to isolate. But Y2K was going to happen guaranteed. The split between 2.X and 3.X has been fairly slow in some ways and with crutches designed to make a transition smoother.

As long as there are people willing to remain behind in what they consider safe territory, there will be those happy to sell them some things and probably raise their prices. I recall what happened with TELEX some years ago. It was awfully slow compared to other methods such as email or FAX but it had legal standing and clients who had to be reached in places that had less access and so on. I recall how, despite huge drops in message volume year after year, it remained a very profitable item for my AT&T unit and continued to be supported as we just kept raising the price for those still wanting to use it. I was working on newer stuff and just watched several such parts in amazement, including others using what I considered obsolete protocols. Not obsolete to those customers though.

I empathize with people using software that already works. You know: if it isn't broken, don't fix it. But to continue making it work may require supplying complete solutions, as customers cannot be expected to have older python interpreters sitting around on their machine. If your application resides within your servers and only communicates with others of the same ilk, you probably can continue using it indefinitely. There may be someone still using an ancient PC from the 80's running Windows 1.0 with a similarly old copy of WordPerfect and a serial printer attached. 
But if they want to load a recent version of Microsoft Office, forget it. Heck, it would not fit on their 20MEG hard disk.

I assume there must be tools out there that can look over your code and point out places where it may not be compatible. But the more I learn, the more I realize how subtle a problem may be. Some of the back-ported features for example are not exactly identical in effect. Close, maybe. Things like changes in Unicode functionality often are papered over, and then a valid statement in one may print a character while in the other it may print a short int. If you ran such a program and it showed minimal issues like just needing to change print statements into print functions, it might be easier to convince your boss to upgrade. If the effort looks massive, they may even opt for a complete rewrite or move to another language that is less likely to keep changing, or maybe just buy a solution that is not quite tailored to their needs.

I did have an experience along these lines years ago when a new variation of C came along. Years ago I worked on a project at Bell Labs that was mostly in C. As C++ was coming out, I volunteered to do my part using it, as I could show why my application would be object-oriented. I was shocked when they approved, and I set up make files that recognized what code was in C or C++ and compiled and linked them properly and so on. It worked beautifully and guess what? We tossed the entire project!

This was back in the days when AT&T cared less about wasting money. Our project had been canceled at higher levels and the management spent half a year deciding what our next project would be but felt no need to tell us! So when I offered to do something experimental, they figured why not! And, yes, eventually we did our new development in C++. 
But had they considered it, it would have made more sense to stop developing something and take the time to retrain the staff with courses that were available and have us use up some vacation days and be ready for a new project. Bottom line here is I should not be surprised if some people want an answer on how to keep projects in 2.X going. But, unless there is a reason, I see little reason in going along or teaching new programmers on what is in a sense the less useful version going forward. The main reason to study 2.X, FOR ME, is to be able to understand it if I encounter it and perhaps be able to rewrite it. -Original Message- From: Python-list On Behalf Of Chris Angelico Sent: Wednesday, January 16, 2019 2:15 PM To: Python Subject: Re: Pythonic Y2K On Thu, Jan 17, 2019 at 6:04 AM Avi Gross wrote: > > I see messages like the following where someone is still asking how to > do something in some version of python 2.X. > > I recall the days before the year 2000 with the Y2K scare when people > worried that legacy software might stop working or do horrible things > once the clock turned. It may even have been scary enough for some > companies to rewrite key applications and ev
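One concrete instance of the subtle 2-vs-3 differences Avi describes (a valid statement printing a character in one version and a small integer in the other) is indexing byte strings, and integer division changed alongside it. A sketch, run under Python 3:

```python
# In Python 2, b"ABC"[0] was the one-character string "A"; in Python 3,
# indexing a bytes object yields an int, while slicing still yields bytes.
data = b"ABC"
print(data[0])    # 65 (an int) under Python 3
print(data[0:1])  # b'A'

# True division also changed, and a mechanical conversion cannot always
# tell which behavior the original code relied on:
print(7 / 2)      # 3.5 under Python 3 (Python 2 gave 3)
print(7 // 2)     # 3 under both versions
```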
Re: Pythonic Y2K
On Thu, Jan 17, 2019 at 6:04 AM Avi Gross wrote: > > I see messages like the following where someone is still asking how to do > something in some version of python 2.X. > > I recall the days before the year 2000 with the Y2K scare when people > worried that legacy software might stop working or do horrible things once > the clock turned. It may even have been scary enough for some companies to > rewrite key applications and even switch from languages like COBOL. > > What is happening in the python community, and especially in places where > broken software may be a serious problem? > > I assume versions of python 2.X will continue to be available for some time > but without any further support and without features being back-ported. Commercial support for Python 2 will probably continue for a while, in the same way that support for versions older than 2.7 is still available to Red Hat customers today (if I'm not mistaken). Otherwise, well, the software will continue without updates or security patches until it breaks. Companies will have to weigh up five costs against each other: 1) The cost of the status quo: the risk of critical failures or external attacks against unsupported and unpatched software 2) The cost of migrating to Python 3 3) The cost of migrating to a completely different language 4) The cost of maintaining their own local fork of Python 2 5) The cost of using a supported commercial platform such as RHEL. For most small to medium projects, it's probably going to come down to #1 or #2, where #1 has the laziness bonus. For many larger companies, #1 is an unpayable cost. Everyone has to make that choice, and remember that "cost" doesn't just mean money (for instance, the cost of moving to Linux might be quite considerable for a Windows shop, and even within a Linux ecosystem, switching to Red Hat may have consequences to other programs you might need). ChrisA -- https://mail.python.org/mailman/listinfo/python-list