> -----Original Message-----
> From: S. Isaac Dealey [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, April 11, 2006 1:11 PM
> To: CF-Talk
> Subject: Re: VHS and Betamax
>> No, not entirely. The issue is that we're still in
>> transition. The hardware progress is not as fast
>> as many of us would like and sometimes we jump the
>> gun with regard to wanting to be able to have the
>> Star Trek computer that we just tell what to do and
>> it does it. So if I build an application today and
>> I fail to optimize it, then my application is going
>> to be slow in comparison to another application
>> which accomplishes the same task. (Incidentally I
>> spend quite a bit of my programming time thinking
>> about the optimization of my software -- I may not
>> always get it right, but I do have a reasonable
>> handle on the concepts.)

> With genuine respect:

Thanks, although I always assume respect is given. :) Having said that, I realize that's an opportunity for someone to poke at me for being disrespectful. :)

> I'm not sure that being in transition is relevant
> - we're always going to be. More horsepower
> generally seems to engender more complex
> applications rather than faster ones.

While that's true, feature complexity doesn't inherently solve human need. Particularly in software development (as compared to physical product manufacturing) there's an illusion that software is cheaper to make than it really is, because the business people involved get swept away by the lack of material costs. They don't see the development lifecycle and the planning costs the way we do down here in the trenches. That same lack of understanding at the top frequently engenders a lack of research into the usefulness of the end product -- i.e., if it costs "nothing" to make, then why not just make it? :) The end result is often a lot of guess-work rather than spending the time and money up-front to research what people need. Thus we end up with all those thousands of nifty little features in MS Office products that less than 1% of the product's target market actually use. :P

Now couple that with the evolution of hardware.
As hardware becomes more efficient in today's market, software manufacturers continue to add features because the efficiency of the hardware gives them more room to add them. This is true, and I don't debate that it's still important to add features; I have plenty of examples of applications that even at their most efficient still perform more slowly than we'd like, or that still lack certain features because those features are simply too inefficient to be practical for daily use with today's hardware. Set up ColdFusion developer edition at home alongside any anti-virus application with auto-file scanning enabled, and watch how long it takes the CF Server service to start up. :) Webservices, as another example, are only viable now because the hardware will support them -- we'd never have even tried webservices if we still had to serve everything over the 14.4k modems we had in the early 90's.

So what I'm getting at here is that although we can't see it currently, I believe (and I certainly could be wrong) that the demand for features will eventually taper off (though I'm certain it will never disappear entirely) as the hardware becomes more efficient, in the same way that the demand for efficiency in our existing feature sets is tapering off. There is only a certain amount of complexity that will be useful to the average person (avoiding the word "user" here), beyond which any given application begins to delve into a niche market with a much smaller user base -- but that niche, precisely because it is a niche, will also want a specific sub-set of features which will vary from the sub-set desired by another niche using a similar application. Sure, we could just cram both niches into a single application, but it's better for the person using it if they can get something that's not so cavalier about the needs of their niche.
There's a good example of this in The Inmates Are Running the Asylum by Alan Cooper, where he describes the car that's designed for everyone: a convertible mini-van with a big bed for hauling lumber. :)

I submit the common pocket calculator as a real-world example of that sort of tapering of feature complexity. :) I know of three essential versions of this thing currently: the basic calculator (performs arithmetic) for people doing their grocery shopping, the scientific calculator (includes a few extra features like the log button) for people in mathematically complex professions such as architecture, and the programmable compu-calculator (for mathematicians and trig and calculus students). Note that each of these devices fills a specific niche, and with the possible exception of the most complex of the three, the features of these calculators have long since tapered to the point of zero growth. New features aren't added to the basic or scientific calculator anymore because for some years now nobody's discovered any new features that would be useful in their niche.

> Isaac, I wish it were otherwise, but with multi-core,
> multi-threaded processors/processes, I'm not sure
> that people CAN optimize software any longer. At
> least, at the code level. It just isn't
> cost-effective to spend days trying to tweak a
> block of code down to 200 cycles from 220 cycles.
> (unless you're John Carmack!)

That's a good point. :) And in all honesty it hadn't occurred to me that the increased complexity of the hardware is also making it increasingly less likely that someone working in a high-level language could draw useful parallels between their code and the low-level behavior underneath -- parallels that might otherwise help them get more out of optimization.
Beyond increasing the amount of time it would take to discover and understand such a parallel, there's really not much a person can do about things like malloc or low-level string management from a high-level language, and adding multi-threading to the equation just makes our ability to control the low-level environment more tenuous. Compound that with the fact that as programmers we want our applications to run on an array of potential platforms -- or at least I do, between the *nix and Windows versions of ColdFusion -- and on increasingly variable hardware configurations (as opposed to merely increasingly efficient versions of a more or less singular configuration). It's difficult enough to deal with multiple databases (and multiple drivers per database) and multiple versions of ColdFusion; I shudder to imagine trying to programmatically address the variances between my development server and someone else's 64-processor IBM behemoth. I'm much happier simply assuming that the variance is small enough not to matter when I run my application on it. :)

> I can remember the days when it did matter (I wrote
> Assembler on a 360/40, circa 1967), but I just don't
> believe that that's the case any longer.

You may be right. :) At least about low-level optimization. There's still a lot to be said for higher-level optimization -- things like multiple queries vs. join statements, you know, the bigger pieces. :)

s. isaac dealey
434.293.6201
new epoch : isn't it time for a change?
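[A quick illustration of the "multiple queries vs. join statements" point above. This is a minimal sketch, not from the original message: it uses Python's standard-library sqlite3 module rather than ColdFusion, and the table and column names are invented for the example. The slow pattern issues one query per parent row (the classic N+1 shape); the fast pattern gets the same rows from a single JOIN in one round trip.]

```python
import sqlite3

# In-memory database purely for illustration; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO posts VALUES (1, 1, 'First'), (2, 1, 'Second'), (3, 2, 'Third');
""")

# Slower pattern: one extra query per author (N+1 round trips total).
titles_n_plus_1 = []
for author_id, name in conn.execute("SELECT id, name FROM authors ORDER BY id"):
    for (title,) in conn.execute(
            "SELECT title FROM posts WHERE author_id = ? ORDER BY id",
            (author_id,)):
        titles_n_plus_1.append((name, title))

# Faster pattern: a single JOIN produces the same rows in one statement,
# letting the database engine do the matching instead of the application.
titles_join = list(conn.execute("""
    SELECT a.name, p.title
    FROM authors a JOIN posts p ON p.author_id = a.id
    ORDER BY p.id
"""))

# Both approaches yield identical results; only the number of round trips differs.
assert titles_n_plus_1 == titles_join
```

The difference is invisible with two authors in an in-memory database, but with thousands of rows over a network connection the per-query overhead of the first pattern dominates -- which is exactly the "bigger pieces" kind of optimization that survives the move to high-level languages.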
add features without fixtures with the onTap open source framework
http://www.fusiontap.com
http://coldfusion.sys-con.com/author/4806Dealey.htm

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
Message: http://www.houseoffusion.com/lists.cfm/link=i:4:237448
Archives: http://www.houseoffusion.com/cf_lists/threads.cfm/4

