More on Nicholas Carr from my favorite pundit.
Carr-ied away: http://www.issurvivor.com/ArticlesDetail.asp?ID=651
Carr-toonish engineering:
http://www.issurvivor.com/ArticlesDetail.asp?ID=652
There is also some truth in these articles, and they should be carefully
considered. One problem is that everyone seems to apply whatever
interpretation they like to the term "IT" and then uses it to support their
particular point of view.
The author criticizes Carr with an absurdity: "Every business has access to
the same everything as every other business -- the same technology, ideas,
people, processes, capital, real estate ..."
This is patently false, since none of the ideas, people, capital, or real
estate are commodities in any sense implying equality. Processes may or may
not be identical, which doesn't really say much, but to trivialize the
argument by suggesting that all these elements are on a par with technology
for comparison is seriously disingenuous. By the author's own admission,
"Most internal IT organizations long ago changed their focus. They seldom
develop. Mostly they configure and integrate purchased applications." What
is this if it isn't turning applications into commodities?
Don't get me wrong. I'm not here to defend or support Nicholas Carr. What I
am saying is that dismissing some of these points out of hand is also wrong;
they bear some scrutiny in assessing what is occurring within IT.
(Disclaimer: I am not a supporter of Nicholas Carr, nor am I familiar with
his writings beyond those cited in these posts.)
Consider this:
In the early years (decades) of computing, there was a strong incentive for
companies to develop in-house applications because of the competitive
advantage this could provide. An idea could be developed and implemented
that might completely blind-side a competitor and provide a significant
business advantage.
Increasingly, this is no longer the case, and we have seen a decline in the
need for large development staffs, with a significant portion of software
being purchased from outside providers. In other words, many applications
have become commodities that no longer convey advantage.
Therefore, to determine which direction Applications Development is taking
within "IT", compare how many new systems and/or applications are being
created in-house versus those purchased "off the shelf".
We could also consider what's happening in IT Operations, where it should be
abundantly clear that a higher degree of automation and better system tools
have been brought to bear, so this area has also shrunk to "commodity"
levels. In other words, in today's environment the operator also requires
less expertise.
In systems programming, we have also seen greater consolidation of hardware
resources and more software tools made available to gain economies of scale.
Because of these changes, fewer people can support larger configurations.
Responsibilities have also become more specialized, with vendors playing a
greater role in supporting systems than in previous decades. Systems
programmers (in many organizations) have largely been supplanted by systems
administrators.
In all these cases, the argument can be made that IT has evolved to function
with fewer individuals and less expertise (on hand, on a daily basis). This
doesn't mean the expertise isn't required, but rather that it doesn't have
to be a permanent staff position. This is one reason for the rise of outside
service providers.
Similarly, even though many application components are commodities, many
other elements are simply assumed, so resources must still be expended to
live up to the expectation. For example, in the past, good response time was
a way to improve productivity; in today's "commodity" environment it is an
expectation the customer holds. Because it is both a commodity and an
expectation, failure to provide expected services becomes a competitive
disadvantage in today's IT world. In the past, a database might have
provided advantage by allowing a corporation to access customer data more
quickly than a competitor. In today's environment, the database is assumed,
and the inability to access customer data is a liability.
Without reading too much into it, I would suggest that Nicholas Carr has a
legitimate point when he says that IT can no longer be assumed to carry a
business advantage. In addition, it would appear that it really doesn't
"matter" from a purely technical perspective. However, like all the other
technologies that business relies on, the advantage comes from providing a
high quality of service for "expected services" and deploying these
"commodities" effectively to enhance the business environment.
It seems like everyone wants a black-or-white argument: either IT goes away
completely, or it remains exactly the same. Neither position is realistic.
Adam
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html