Three Ways Enterprise Software is Changing

Today's IT shops must grapple with data analysis, cloud computing and DevOps

By Joab Jackson (IDG News Service) on 20 December, 2014
http://www.arnnet.com.au/article/562900/three-ways-enterprise-software-changing/?fp=2&fpid=1


Once upon a time, life in the enterprise IT shop was fairly simple, at least 
conceptually speaking.

IT issued computers and laptops to employees, and maintained the enterprise 
software, databases and servers that supported the company, most of which ran 
in-house.

These days, IT's familiar terrain is giving way to a broader landscape that the 
IT pro must traverse, one built on pay-as-you-go cloud computing, application 
development and deep data analysis. Perhaps more fundamentally, IT operations 
are moving from merely supporting the business to driving the business itself, 
which requires agility and making the most of resources.

Here are three of the largest forces at work that will change enterprise 
software in 2015 and beyond:


The Platform

The idea of cloud computing has been around for a while, so it may be hard to 
think of it as a new force. Yet, after a few years of testing the cloud for 
running development projects and tangential applications, enterprises are now 
moving their more critical operations to the cloud.

IDC expects that by 2017 organizations will spend 53.7 percent of their budgets 
on cloud computing, and the market for cloud computing software will be over 
$75 billion.

Concerns about security and overall cost continue to fade as businesses face 
the upgrade costs of replacing data centers full of servers, or stare down the 
large up-front costs of implementing a complex in-house enterprise software 
system.

Travel information provider Lonely Planet is one Web-facing company that has 
made the jump into cloud services, migrating all of its operations to 
Amazon-hosted services when its data-center lease came to an end.

"With Amazon, we could treat infrastructure as code," said Darragh Kennedy, 
head of cloud operations for Lonely Planet. Instead of worrying about how many 
servers to lease, the company could concentrate on perfecting its service, with 
Amazon quickly and easily providing however many servers are needed for 
seamless service.

"Our product owners can stand up a new environment in under 10 minutes, and 
that really speeds up how quickly we can build new products," Kennedy said.
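
Treating "infrastructure as code" means servers are requested and configured by 
programs rather than by purchase orders. The article does not say which tools 
Lonely Planet uses, so the following is only a rough sketch, in Python with the 
AWS SDK (boto3), of standing up a new server on demand; the machine image and 
instance type are placeholders.

    # infrastructure_as_code_sketch.py
    # A minimal sketch of "infrastructure as code": requesting a server
    # programmatically instead of provisioning hardware by hand.
    # Assumes AWS credentials are already configured; the image ID and
    # instance type below are placeholders, not values from the article.
    import boto3

    ec2 = boto3.resource("ec2")

    instances = ec2.create_instances(
        ImageId="ami-xxxxxxxx",      # placeholder machine image
        InstanceType="t2.micro",     # smallest general-purpose size
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "environment", "Value": "new-product-test"}],
        }],
    )

    print("Launched:", instances[0].id)

Because a script like this can be checked into version control and run again, an 
identical environment can be recreated, or torn down, in minutes rather than 
weeks.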

Amazon Web Services got a head start in the cloud services space, but Microsoft 
is quickly catching up with its Azure service, according to Gartner.

Other enterprise-focused IT companies quickly ramped up their cloud-computing 
operations this year. IBM and Hewlett-Packard have each earmarked $1 billion 
for building out their cloud-computing services.

Complicating the best-laid cloud migration plans has been the sudden emergence 
of Docker, a new, lighter-weight form of virtualization that promises greater 
portability and faster performance.

Launched in 2013, Docker has been downloaded over 70 million times. The major 
cloud service providers, including Google, IBM and Microsoft, have all spun up 
their own, sometimes proprietary, Docker-based services.

CIOs who have already started down the path of cloud computing, perhaps by 
virtualizing some of their operations, may feel frustration at the prospect of 
re-gearing around Docker, but it provides one key element they will need: 
swiftness. It has been said that Docker is the first virtualization technology 
ready for the DevOps age.
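
For a sense of how lightweight containers feel in practice, here is a small 
sketch using the Docker SDK for Python (an illustration of the general idea, 
not a tool named in the article): it runs a command inside an isolated 
container that starts in seconds, rather than the minutes a full virtual 
machine might take to boot.

    # docker_sketch.py
    # Illustrative only: start a throwaway container and capture its output.
    # Requires a running Docker daemon and the "docker" Python package.
    import docker

    client = docker.from_env()

    # Runs "echo" inside a minimal Alpine Linux container and removes the
    # container as soon as the command finishes.
    output = client.containers.run(
        "alpine",
        ["echo", "hello from a container"],
        remove=True,
    )

    print(output.decode().strip())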

What is DevOps? That is something you should know about as well.


The Software

A decade ago, COTS (commercial-off-the-shelf) software was the way to go. Why 
go through the trouble of building your own software from scratch when Oracle, 
Microsoft and SAP could provide you with all (or at least most) of the 
capabilities?

If employees grumbled about such software being sometimes difficult to use, 
well, they were getting paid to use it, right?

These days, however, businesses are finding that enterprise software is no 
longer in a supporting role, but is central to maintaining a competitive edge. 
In many cases, this means the organization must build its own software, at 
least for those parts of the operation that give the company its crucial edge.

Remaining competitive is a moving target, of course, as competitors are also 
busy sharpening their own products and services. Nowhere is this more 
pronounced than with large Internet-scale services such as Yelp, Facebook or 
Airbnb, which live or die by beating their competition with more helpful, 
easier-to-use features. The days of asking users, or employees, to put up with 
fussy software are coming to an end.

Such pressure has brought about a new operating paradigm called DevOps, which, 
in name and in spirit, combines software development and IT operations into one 
cohesive workflow. Tightly integrating the development cycle of an application 
with the subsequent operation of that application can cut the length of time 
required to update a customer-facing or internal application. About 60 percent 
of CIOs plan to use DevOps to manage their software, IDC has estimated.
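
What that integration looks like varies from shop to shop, but the core pattern 
is a pipeline in which the same automated steps take code from a commit to a 
running system. The sketch below is a simplification with assumed tool names 
(pytest for tests, Docker for packaging, a hypothetical registry address); real 
DevOps teams typically run the equivalent inside a continuous-integration 
server rather than as a plain script.

    # pipeline_sketch.py
    # A bare-bones continuous-delivery step: run the tests and, only if
    # they pass, build and publish a versioned artifact for operations
    # to deploy. Commands and the registry name are illustrative.
    import subprocess
    import sys

    def run(cmd):
        print("$ " + " ".join(cmd))
        return subprocess.call(cmd)

    # 1. Automated tests gate every release.
    if run(["pytest", "-q"]) != 0:
        sys.exit("Tests failed; stopping the release here.")

    # 2. Package the application as a container image.
    image = "registry.example.com/app:release-candidate"
    if run(["docker", "build", "-t", image, "."]) != 0:
        sys.exit("Build failed.")

    # 3. Hand the artifact to the operations side via the registry.
    if run(["docker", "push", image]) != 0:
        sys.exit("Push failed.")

    print("Release candidate published:", image)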

Microsoft has been filling out its portfolio of development software to support 
DevOps operations. IBM has set up a special consulting practice just for 
helping organizations move to a DevOps-style workflow.

One user of Microsoft's DevOps tools has been the business services division of 
French telecommunications company Orange, which develops systems and software 
for other organizations.

"A few years ago, it was the norm to deliver good functionality on time and on 
budget," said Philippe Ensarguet, CTO at Orange Business Services. "Now, we 
have to deliver sooner and faster and better."

One question that dogs the modern business is how to offer something unique in 
this global, hyper-competitive market. This is where new forms of data analysis 
could help.


The Data

Data analysis, once chiefly the provider of numbers for PowerPoint 
presentations and executive dashboards, is increasingly shaping the strategies 
and operations of many organizations.

Of course, data-guided business decisions are nothing new. What is new is the 
depth of insight that analysis can provide, as well as the greater range of 
data that can be put under computerized scrutiny.

IBM, among other companies, has been ambitiously pursuing additional ways data 
can be parsed through cognitive computing, which harnesses machine learning, 
neural networks and other techniques to better mimic the way humans intuit 
insight from data.

And thanks to the open-source Hadoop data-processing platform, the use of which 
is growing in the enterprise, additional types of data can be mined for 
potential knowledge.

Hadoop excels at churning through vast reams of unstructured data: data not 
stored in a relational database but captured in text files or log files, all 
the stuff IT staff used to largely ignore, then routinely delete once it filled 
their storage. But e-mail, the Web-surfing habits of customers or server log 
files can provide insight into long-term trends, daily operations or heretofore 
undiscovered customer preferences.
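
As a concrete, hypothetical example of what mining log files means in Hadoop 
terms, the two small scripts below use Hadoop Streaming, which lets mappers and 
reducers be written in any language that reads standard input, to count 
failed-login events per host across an arbitrarily large pile of server logs. 
The log format shown is assumed purely for illustration.

    # mapper.py -- emit "host<TAB>1" for every failed-login line.
    # Assumed log format: "<timestamp> <host> <event> ..."
    import sys

    for line in sys.stdin:
        parts = line.split()
        if len(parts) >= 3 and parts[2] == "FAILED_LOGIN":
            print("%s\t1" % parts[1])

    # reducer.py -- Hadoop delivers mapper output sorted by key, so the
    # counts for each host arrive together and can be summed in one pass.
    import sys

    current_host, count = None, 0
    for line in sys.stdin:
        host, value = line.rstrip("\n").split("\t", 1)
        if host != current_host:
            if current_host is not None:
                print("%s\t%d" % (current_host, count))
            current_host, count = host, 0
        count += int(value)
    if current_host is not None:
        print("%s\t%d" % (current_host, count))

The job would be submitted with the distribution's Hadoop Streaming jar, 
pointing -mapper and -reducer at these scripts and -input at the log 
directory; the same pair of scripts works whether the input is gigabytes or 
terabytes, which is the point.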

One company that has found a competitive edge with such big data, as it is 
often called, is enterprise security services firm Solutionary, which used a 
MapR-based Hadoop distribution to enlarge the set of services it offers its 
customers.

Solutionary uses Hadoop to store and analyze the security and event logs of 
its corporate customers, so they can be alerted when suspicious activity may be 
taking place on their systems. Hadoop allows the company to store more data, at 
a cost considerably lower than storing it in a data warehouse.

Using this additional data allows the company to offer its customers a 
longer-view analysis of what is happening on their networks. It also allows the 
company to perform predictive modeling on the data, potentially giving 
customers earlier warning of security issues.
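
The article does not describe Solutionary's models, but as a generic sketch of 
predictive modeling on this kind of data, the snippet below (assuming 
scikit-learn) trains an anomaly detector on historical per-host event counts 
and flags new observations that look unusual.

    # anomaly_sketch.py
    # Illustrative anomaly detection on per-host daily event counts.
    # The numbers are made up; real features would come from the log store.
    from sklearn.ensemble import IsolationForest

    # Historical counts: [failed_logins, outbound_connections] per host-day.
    history = [[2, 120], [1, 130], [3, 110], [0, 125], [2, 118], [1, 122]]

    model = IsolationForest(contamination=0.1, random_state=0)
    model.fit(history)

    # A new observation with an unusual spike in failed logins.
    todays_counts = [[40, 300]]
    flag = model.predict(todays_counts)  # 1 = normal, -1 = anomalous

    print("anomalous" if flag[0] == -1 else "normal")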

Hadoop allowed Solutionary "to get off an architecture where you had to be 
careful about what to put into it, and to a model where you could store 
everything," said Scott Russmann, Solutionary's director of software 
engineering.
