On Thu, Feb 20, 2014 at 2:32 PM, Ken Dibble <[email protected]> wrote:
> What is software, what is a problem? That has to be defined first.

Oh, no, that leads us down the path to "What does 'is' mean?" Start your
own thread! :)

> Some kinds of software (and hardware) are as ephemeral
> Other kinds of software (and hardware) are as basic, and enduring

Like order processing, inventory control, line-of-business applications.
Agreed.

> The overriding problem is that it is in the profitable interest of
> computer hardware and software producers to try to convince everybody
> that these fundamental differences do not exist, and that everybody has
> to buy new stuff every 12-to-18 months for everything.

Which vendors are those? Most Apple users I know keep their machines for
five years or more. Android is a hot mess: I picked up a Lenovo K1 tablet,
and no sooner had I plunked my money down than the vendor announced there
would be no more upgrades. And third-party "jailbroken" ROM firmware is
catch-as-catch-can, vaguely supported, and possibly sketchy. Once burned,
twice shy.

Perhaps you need to find new vendors. Ubuntu and Red Hat make long-term
releases with stable support for three and five years, IIRC. Perl, Python,
PHP, and Ruby have much longer version ramp-up times. Linux on the desktop
is not a pretty story, but you can get by. Perhaps you need to find a
vendor whose values are more aligned with yours.

And here, with all due respect, we veer off bad vendor practices and jump
into computer architecture:

> Why isn't a 4-bit int a 4-bit int no matter how much bandwidth the OS
> offers?

Yeah, it is. Zero to fifteen, or -8 to +7, depending on whether it's
unsigned or signed (two's complement).

> Why shouldn't software that uses 4-bit ints work just as well on an
> 8-bit, 16-bit, 32-bit, or 64-bit OS, without any messing around with
> "adapters" or "VMs" or anything else?

Apps written to interact with the underlying OS via a 16-bit interface
need to have that interface supported.
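To make those two ranges concrete, here's a minimal sketch (the helper
names `unsigned4` and `signed4` are mine, just for illustration) showing
how the same sixteen bit patterns read as 0..15 unsigned or -8..+7 in
two's complement:

```python
def unsigned4(bits: int) -> int:
    """Interpret the low 4 bits as an unsigned integer: 0 .. 15."""
    return bits & 0xF

def signed4(bits: int) -> int:
    """Interpret the low 4 bits as two's complement: -8 .. +7."""
    v = bits & 0xF
    return v - 16 if v & 0x8 else v  # high bit set means negative

# Same sixteen patterns, two readings:
assert [unsigned4(b) for b in range(16)] == list(range(16))
assert min(signed4(b) for b in range(16)) == -8
assert max(signed4(b) for b in range(16)) == 7
```

Note the asymmetry: two's complement gives one more negative value than
positive, because the all-zeroes pattern is spent on zero.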
YOUR choice of vendor made Win32s a work-around for a short period, but it
is not in THEIR business interest to have you run old apps. Other vendors
have compilers that support a lot of older software. However, when the
vendor changes platforms (as Apple did in the PowerPC-to-Intel migration),
all bets are off.

> Why shouldn't OS designers assume that 20 years from now everybody will
> want a 512-bit-wide pipe, and build it into the OS RIGHT NOW, so people
> don't have to "upgrade" when the 512-bit software arrives?

There are several good reasons. First, there are lots of visions of the
future, but few of them come true, or we'd have flying cars and food
replicators by now (I was really counting on that one: "Earl Grey, hot.").

Second, the whole 8-, 16-, 32-, 64-, 128-bit progression is based on real
physical limitations of the current crops of chips. Increasing those
widths means a lot of cost in hardware manufacture as well as software.
It's easier to amortize that cost over time than to make the leap all at
once. Making current hardware work with 512-bitness would mean, roughly,
making current applications eight times slower, larger, and less
efficient, for no apparent gain. It just doesn't make business sense.

Third, it's not clear that "bitness" is going to continue to climb, as
there are some strong pressures to go in other directions. The Internet of
Things makes it likely that we are going to be integrating more
ultra-low-power sensors, like FitBits and Bluetooth devices and Google
Glass and Nest thermostats, in small-device orchestration. Databases can
already manage petabytes with 64 bits, and that's too much information for
me, thankyouverymuch.

Finally, there is always going to be change, and change inevitably leads
to friction and incompatibilities. There will always be a need for us
techies who can glom together disparate parts to make a whole. Recently, I
had to update MY choice of OS/distro (Fedora 19) to run Ruby 2.0 for a
project.
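For context, the build-a-newer-interpreter-from-source dance usually looks
something like the sketch below. The version number, patch filename, and
install prefix here are illustrative assumptions, not the exact values
from my incident:

```shell
# Fetch and unpack the source (version chosen for illustration only):
wget https://cache.ruby-lang.org/pub/ruby/2.0/ruby-2.0.0-p451.tar.gz
tar xzf ruby-2.0.0-p451.tar.gz
cd ruby-2.0.0-p451

# Apply any local fixes before configuring (hypothetical patch file):
patch -p1 < ../local-fix.patch

# Install under $HOME so the distro's packaged Ruby is left untouched:
./configure --prefix="$HOME/opt/ruby-2.0"
make -j"$(nproc)"
make install
```

Installing to a private prefix keeps the custom build out of the package
manager's way, at the cost of maintaining it yourself.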
The distro's supplied Ruby is older than that, so I needed to build a copy
from source. (Yay, Open Source!) Well, the source would not compile on my
machine. Ruby has some dependencies on crypto libraries, and Fedora had
disabled one of the elliptic curve algorithms due to a potential weakness.
The Ruby code made the naive assumption that that specific algorithm would
always be available, even though it didn't need to hard-code it. A patch
is in the works, though not yet released, that verifies at least one
algorithm is available rather than searching for one specific algorithm.
So I had to research the issue, find the patch, apply it manually, and
rebuild the software to create my very own custom Ruby.

So, to summarize a much-TL;DR response: some vendors are just in it for
the bucks and don't care about customer satisfaction; choose the vendors
who support your values; Open Source wins, at a cost.

-- 
Ted Roche
Ted Roche & Associates, LLC
http://www.tedroche.com

--- StripMime Report -- processed MIME parts ---
multipart/alternative
  text/plain (text body -- kept)
  text/html
---
_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://mail.leafe.com/mailman/listinfo/profox
OT-free version of this list: http://mail.leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message: http://leafe.com/archives/byMID/profox/cacw6n4vkgovjjyvm4fp6v-tertyjqhjkd2bxg+lpdck1zbb...@mail.gmail.com
** All postings, unless explicitly stated otherwise, are the opinions of
the author, and do not constitute legal or medical advice. This statement
is added to the messages for those lawyers who are too stupid to see the
obvious.

