A while ago, I asked for some ideas on a paper for the future of GIS. A
number of people sent some suggestions, and I promised to give it back to
the list. It's fairly long, so read it only if you are in the mood.
The views expressed were a result of a small amount of thought, some
alcohol and a rather blissful contemplative period spent on an island in
the middle of the Great Barrier Reef. I fully accept its limitations, and
please remember that it was a paper prepared for a talk presented first up
after the conference dinner, and was not to be published. Disagree by all
means, but don't get angry. It was not meant to be rocket science and it
has warts.
A paper given to a FUNGIS (Far North Queensland GIS Users Group) on the
future of GIS.
FUTURE OF GIS
Introduction
For a start, let's stop talking about GIS. What we perceive to be GIS will
only be a small part of the technology we use. The systems we have been
using have much broader application than the systems we sit in front of
each day. A better term would be spatially enabled software, or spatial
software for short.
So, how does one predict the future? If I knew I wouldn't be here, I'd be
betting on horses. I have, however, been shown techniques that are supposed
to enhance your predictive capabilities. About 5 years ago, in another
life, I worked for a
multinational corporation. We spent one day of our own time doing a
visioning exercise to see where the company was going. In one exercise we
drew pictures of where we thought we would be in 10 years' time. Quite a
few of us drew barely recognisable pictures of us working away on laptops
under palm trees, connected to the office by mobile communications. About
a year ago, I tried it, and frankly it's quite uncomfortable. Sand in the
keyboard and being surrounded by semi-naked backpackers were quite
distracting.
I wrote the first draft of these notes sitting on a rock ledge metres from
a coral reef. I used a notepad (the old fashioned type) and a pen. I then
typed it into my computer when I got back to the office. Why? During the
previous weekend my mobile phone got wet when out fishing and it no longer
works. I wasn't going to risk my laptop in a sea kayak.
At this point you may well ask what has this got to do with GIS products,
but I just wanted to point out that having the technology will not
automatically change the way that people do things. Visions of the brave
new electronic world sometimes forget this, and whatever the technologists
envisage may not happen because of people factors.
Attitudes
It is only recently that I have heard industry leaders talking about making
software and hardware that assist people to do their work, rather than
making people think like computers. That is fortunate, because this was
the reason I got into computers 14 years ago. I was involved with using
expert systems for providing land management advice. Then about 12 years
ago, I wanted to use spatial information as part of the background data, so
I looked at GIS. I started using it because it was frustrating to have to
ask someone else to do something that they did not understand, just because
they had spent the last year of their life learning the obscure commands to
do so.
Things have changed somewhat since then, and they will continue to change.
To predict future changes, we need to understand the factors that have
influenced the changes to date.
I'm sure I have read a calendar quote along the lines of "Today is
yesterday's future". So what are the factors that have allowed GIS to
become mainstream? What has changed to allow the systems we use today to
evolve from those old ones? I don't miss the weeks of digitising using
cryptic commands written in a very large book so that we could display
poor-quality maps on the screen, print even worse maps, and perform
incredibly complex algorithms that had no bearing on the real-world
decision process.
You could expect that any trends that have driven the technology over the
last few years would continue for at least a short while. The things that
have driven change over the past few years could be broadly categorised
into the following list:
1. The Software Bank
2. Computing power
3. Peripherals (including the internet)
4. The data bank
5. Standards
6. People's attitudes
7. Government attitudes
Software Bank
Without doubt, the largest impact on the software we see has been due to
the accumulation of software. Almost no piece of software today is ever
built in isolation. We use bits of other programs or entire other programs
written by others to make our software work. Often applications simply
connect 3 or 4 pieces, each of which does what it was designed to do far
better than if we were to sit down and try to repeat the process ourselves.
Even Microsoft use this approach, and it is no fluke that the spell checker
inside Word looks like the one inside other packages.
This ease has allowed a niche to develop where small firms can develop
software for specific purposes - applications that are specific to one
industry or even one client's job.
On the spatial front, one product we are using is MapInfo's MapX object,
the heart of the mapping functionality in MapInfo, which can be embedded
into any application. This product has new features enabled in it before
they are released into the main flagship product, MapInfo Professional.
ESRI has an equivalent product, and some third-party vendors have
equivalents that work with both formats. These same types of
objects are used to create the web server technology used for publishing
maps to the web.
These modules or objects allow software to be developed more quickly, and
therefore software can be customised more readily to do specific tasks. It
is only recently that we have started to see these types of products appear
on the market. The directional software fitted to many cars now that uses
GPS and GIS technology to tell a driver how to get to a particular location
are derivatives of the spatial industry, but the users have no idea that
they are using a GIS.
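The "software bank" idea behind these embeddable objects can be illustrated with a toy sketch. The class and methods below are entirely hypothetical, written in the spirit of a MapX-style component; the point is simply that one small, reusable map object can be dropped into many different applications.

```python
# A toy illustration of a reusable, embeddable map component. The class,
# its methods and the sample data are hypothetical -- a sketch of the
# idea, not any vendor's actual API.
class MapComponent:
    """A minimal embeddable map object: holds layers and answers queries."""

    def __init__(self):
        self.layers = {}

    def add_layer(self, name, features):
        # Each feature is a (label, lon, lat) tuple.
        self.layers[name] = features

    def find_at(self, lon, lat, tolerance=0.05):
        """Return labels of features near a clicked point."""
        hits = []
        for features in self.layers.values():
            for label, f_lon, f_lat in features:
                if abs(f_lon - lon) <= tolerance and abs(f_lat - lat) <= tolerance:
                    hits.append(label)
        return hits

# Two different "applications" could embed this same component; here a
# hypothetical field application loads one layer and queries a click.
field_app = MapComponent()
field_app.add_layer("sites", [("Bore 7", 145.74, -16.90)])
print(field_app.find_at(145.75, -16.91))
```

The application that embeds the component supplies the data and the user interface; the component supplies the spatial behaviour, which is exactly the division of labour that lets small firms assemble niche products quickly.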
I think that these types of applications will become more the norm. These
sorts of applications are certainly a large part of our business, and I
believe will continue to be.
The increased number of users and the internet have meant that more and
more people now have access to information, so centralised server database
technologies are becoming increasingly popular again. Spatial technology
has gone right along with this, with mainstream databases like Oracle
embedding spatial capability as part of the system. Oracle version 8 now
has a field type that stores spatial information about the database record,
similar to a GIS, and allows spatial enquiries directly from within
Oracle. Delivery of
this information over the web is now commonplace. The speed of delivery
of maps is currently a hindrance, as is the availability of the base data
needed to make these systems operate. The entry into the market of these
large database companies is a
signal that spatial technology has finally got out of the cartography
section and into the IT section.
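The essence of a "spatial enquiry" inside a database is that ordinary records carry a coordinate alongside their other attributes, and queries can filter on location. The sketch below illustrates the idea in plain Python; the records and the 400 km threshold are made-up assumptions, and this is not Oracle's actual syntax or API, just the concept behind it.

```python
import math

# Hypothetical "table" of records, each carrying a coordinate field
# alongside its ordinary attributes -- the idea behind a spatially
# enabled database field type. Names and positions are illustrative.
records = [
    {"name": "Cairns office",    "lon": 145.77, "lat": -16.92},
    {"name": "Townsville depot", "lon": 146.82, "lat": -19.26},
    {"name": "Brisbane office",  "lon": 153.03, "lat": -27.47},
]

def within_distance(lon1, lat1, lon2, lat2, km):
    """Great-circle (haversine) distance test: true if within `km`."""
    r = 6371.0  # mean Earth radius in kilometres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a)) <= km

# A spatial enquiry: which records lie within 400 km of Cairns?
nearby = [rec["name"] for rec in records
          if within_distance(145.77, -16.92, rec["lon"], rec["lat"], 400)]
print(nearby)
```

In a spatially enabled database the same filter would be expressed in the query language itself, alongside any attribute conditions, which is what takes this capability out of the cartography section and into mainstream IT.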
Raw Computing Power
I do not need to tell anyone how quickly computers are increasing in speed.
You only have to go shopping six months after you bought your last
computer to see how quickly it went out of date. Most of the applications
that previously were only released on mainframes or workstations are now
released on PCs. I have as much hard disk space (and
probably memory and chip speed) on my laptop as the mainframe that was
running the centralised system for the State when I started in computing.
But marvelling at the miniaturisation and speed of today's computing is
like imagining, back in the early 1900s, what today's cars would be like. It is
really an extrapolation of already existing technology. If you really want
a head-spin, read something about nanotechnology, where they talk about
building computers by manipulating atoms. This technology could deliver
unheard-of computing power and storage space in an object the size of a
speck of dust. Reproduction of this technology could cost nothing,
because it replicates itself. The latest pc@uthority has an article on it.
It is very scary stuff, and makes genetic engineering seem clumsy because
they could essentially create a new life form from atoms and not even be
restricted to the constraints of gene structures. They have already
achieved some aspects of this technology, and suggest that a computer based
on it may be achieved in our lifetime.
The impact of this has simply been that individuals now have the resources
to do what only institutions or large corporations could do previously.
More users mean a faster rate of change and more demand for that change.
It has also meant that the development cycle for software has shortened
and that more people can take part in development.
Peripherals (including the internet)
When I talk about the peripherals, I mean the technology that allows us to
communicate with our computer and others. Until we can communicate
telepathically, we need to be able to communicate with the computer.
How many people remember when you had to type in commands to make a program
work (some systems still require you to do this)? How about punch cards?
Voice recognition software is now so commonplace that it is easy to
imagine that we will be using voice commands more. It is touted as being
the next Operating System, but it is hard to imagine how that would work in
an office situation. It would be like being in a library where everyone
read aloud. I think that is when I'll try out my palm tree option again.
GPS is the peripheral likely to have the largest impact on spatial
systems, because it is the gadget that tells you where it is on the earth.
There are
applications around that allow users to take notes in the field and the
location is automatically associated with that note. Harvesters, mining
equipment and fertiliser applicators are all currently being tracked using
GPS, and spatial technology is used to exert some form of control over
their operation. Cars now know where they are, and this allows them to work out
how to get somewhere else.
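The location-tagged field note is simple to sketch: most GPS receivers emit plain-text NMEA sentences, and an application only needs to parse one and attach the position to the note. The sentence below is fabricated for illustration (the checksum is not real), and the functions are a hypothetical sketch, not any product's actual interface.

```python
import datetime

def parse_gga(sentence):
    """Pull latitude/longitude out of a NMEA GGA sentence (the standard
    plain-text position report from a GPS receiver) as decimal degrees.
    Latitude is encoded ddmm.mm and longitude dddmm.mm."""
    fields = sentence.split(",")
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return lat, lon

def take_note(text, nmea_sentence):
    """A field note that tags itself with where (and when) it was written."""
    lat, lon = parse_gga(nmea_sentence)
    return {"text": text, "lat": lat, "lon": lon,
            "time": datetime.datetime.now().isoformat()}

# A fabricated fix roughly off Cairns, for illustration only.
gga = "$GPGGA,023042,1655.20,S,14546.30,E,1,08,0.9,5.0,M,,,,*47"
note = take_note("Seagrass bed looks healthy", gga)
print(round(note["lat"], 3), round(note["lon"], 3))
```

Everything after the parse is ordinary record-keeping, which is why this capability slips so easily into harvesters, mine equipment and note-taking applications without the user ever thinking of it as GIS.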
I heard recently that the US will be putting GPS chips in mobile phones.
With the resulting price drop of the chips, the applications of this
technology will again shift to a new level. Assets will no longer need to
be managed, but will simply tell a central computer where they are and what
needs to be done. Animal and people management will be simpler when they
have embedded tracking devices. Anyone who remembers a movie called
Logan's Run should at this point be squirming in their seat.
Probably the peripheral to have had the largest influence on computing in
recent times is the Internet. While most people do not usually consider it
to be a peripheral, it is really no more than a large network that we can
all join easily. It also has a standard set of protocols, so that no
matter what software you use, you can use the data I choose to share with
you. The number of users is potentially much greater. Delivery over the
web is really the new frontier.
Its impact has been great because it has widened out the potential users of
any software we develop, and allows us to get help more quickly than ever
before. When I started writing these notes, I put out a request to the
MapInfo list for any suggestions. This list was read by more than 1000
people, the last time I checked. I had several good suggestions within a
couple of hours from around the world, and quite a few requests for copies
of the paper. The same has happened whenever I have asked for help with a
particular problem.
Software sales and distribution over the web allow small firms to build
specialist software. This change in the way that software can be marketed
has precipitated changes in the structure of businesses. The ability to
market small, specific applications has simply meant that these niche
applications can be developed commercially, because it is possible to
market to a far wider pool of users. In today's market, someone can
broadcast that
they have a problem in the morning, a person somewhere else in the world
can write a program to fix the problem in the afternoon, and it can be
marketed worldwide that night. It does not have to be the world's best
software to be successful, provided it fixes the problem very quickly.
Companies large enough to have marketed such ideas in the pre-internet
world would still be in the phase of a technician trying to convince his
boss that something should be done, and it would be months of business
plans before any software was likely to be produced.
Along a similar vein is a project by the Massachusetts Institute of
Technology called Oxygen. This describes a system of small voice-activated
"computers" that are also our Internet connection, mobile phones, radio
and television (and no doubt GPS equivalent, but they didn't mention it).
It would be supplemented by a larger system that exists in the environment
around the users. The technology sounds great, but will it ultimately work
in a workplace, given issues like everyone needing a cone of silence to
stop annoying their neighbours? More likely, the technology behind it will
be spotted by an outsider and used for something completely different.
This large network called the Internet has already had large impacts on
users of spatial information. Organisations are now looking at internet
technology to serve their users, as the same system can be used to deliver
information to their own members via an intranet and to the public via the
internet. Using the centralised server approach reduces the cost of
interface software and the operational issues.
The Data Bank
Eight years ago, I demonstrated an expert system that could predict the
amount of available residential land for an area based on up to 14
different factors. I had the system operating for an area the size of
England. When I was asked about using it commercially, I had to reply that
the information it was based on was only available to governmental users.
The commercial price of that data would have prevented its use.
We could not implement any system at anything less than continental scale
because the data was either not available or restricted.
The adage then was that any GIS project was split roughly 95% data
collation, 5% analysis.
This has changed significantly, with massive datasets becoming more
available as public custodians relent to pressure to release the data or
commercial vendors being able to sell quantities to reduce prices.
Recently in my state of Queensland, the price of cadastral data dropped by
95%. You now can buy every parcel in the State for a mere $87,000. Much
less if you just want to produce printed maps. North Queensland could be
bought for about $8000. Even less if you talk to Wal today.
This will put the data within reach of many users for the first time.
Even this low price does not match that in Victoria, where you can buy a
per-seat licence for as low as $2000.
Many datasets are being collated by public organisations and are being made
available freely.
The result? We will see more products in the future where the data is part
of the product. The car and boat navigation systems and the business
address finder are a couple of examples. I have some others, but I'm not
telling you yet.
Standards.
We have seen the impact that open standards can have on the speed of
development, even standards as loose as those used for the internet, where
different groups want different flavours.
Standards for data exchange, or even publishing the formats in which the
data is stored, can greatly speed the rate at which products can be
developed, because they open up a greater market to any one product. I
don't think that any product will survive if it doesn't support
import/export functions to and from its major competitors.
There are already products available that can read MapInfo, ArcView and
AutoCAD formats simultaneously. It doesn't look like SDTS will become the
standard base for all systems. I believe there will come a time when this
sort of functionality is incorporated into the mainstream GIS as well. If
Microsoft continue to be involved, there will no doubt be a new standard.
The biggest potential impact from standards will be the new vector-based
standard, VML (Vector Markup Language). The largest impediment to
delivering mapping over the
Internet is that the data transfer to the user is via a graphical image of
the map. This is fine when the speed of transfer is simply from the
computer to its screen, as this happens so fast that you don't really even
think that you are just interacting with a picture. However, when the
image is being sent down the wire, the transfer rate is slow. The answer
has been to make map interfaces on the web about the size of a matchbox.
VML allows data to be transferred over the Internet (or even over an
intranet) as vectors. These vectors are then converted to an image by the
local machine at a much faster rate. The impact of this standard will only
be felt when internet browsers are capable of using this language and web
server technology makes it its standard transfer format, but you can
almost hear the programmers scurrying around doing this as we speak. A web
site called VML Source will be launched later this (northern) summer.
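A back-of-envelope calculation shows why sending vectors beats sending a rendered image down the wire. All the numbers below are illustrative assumptions, not measurements of any real map server.

```python
# Rough comparison of a map delivered as a raster image versus as
# vectors. Every figure here is an illustrative assumption.

# A modest 600 x 600 pixel map image at 8 bits per pixel, uncompressed:
raster_bytes = 600 * 600 * 1

# The same map as vectors: say 200 road segments averaging 25 points
# each, with two 8-byte coordinates per point:
vector_bytes = 200 * 25 * 2 * 8

print(f"raster: {raster_bytes} bytes, vector: {vector_bytes} bytes")
print(f"vector payload is {raster_bytes / vector_bytes:.1f}x smaller")
```

Compression narrows the gap, but the vector form has a second advantage the arithmetic doesn't show: the local machine can pan, zoom and restyle the map without fetching anything else, whereas a raster interface needs a new image for every interaction.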
Attitudes
One of the biggest impacts on the products we see now has been the change
in attitude of software developers.
Software developers and GIS professionals ignore users at their peril.
Users will not tolerate being treated like fools for long. Hands up, how
many people remember having a word processing group that you had to hand
your typing to, have it filed into a jobs list, and then wait for it to be
done? Now hands up those who still have a word processing group in their
organisation yet do their own typing, or who don't have a word processing
group at all? Perhaps you have a good word processing operator who checks
the integrity of the document, tidies it up a bit and so on. The parallels
with GIS groups are obvious.
In the GIS industry, MapInfo was a product that recognised this a long time
ago. When I started in GIS, I deliberately chose a product that worked on
affordable platforms, and even the DOS version was simpler to use than its
competitors. At the time, it was scoffed at by "serious" GIS users, who
said it couldn't do everything that their system could. But the difference
was that I could do what I wanted to do at an affordable price. Now
everyone talks of empowering the users as though it was always the case,
but it is a relatively recent phenomenon.
The people factor is important in the adoption of new technology, and thus
in determining which products become mainstream. I wonder what happened to all
those personal organisers that were around a few years ago? The batteries
went dead on mine a long time ago.
Conclusions
There will be a place for the traditional GIS and GIS professionals. They
will be needed to bring the spatial data together, produce maps (yes,
there will still be a need for printed material for a long time yet), run
enquiries that aren't quite standard, and do quality assurance.
Spatial software will be operated by fewer GIS specialists, and more by
people who want to use the system for their work rather than the GIS being
the focus of their work.
This isn't the future, this is a current trend. GIS specialists that have
no other skills will need to evolve into either cartographers or IT
specialists who have an understanding of how the spatial data fits into the
overall information management. I have been saying for some years now
that an organisation of GIS users, which is what FUNGIS started out as,
will make as much sense as an organisation of word processor operators.
Now I am hearing the same thing from other people as well.
Fortunately FUNGIS has been evolving as well and its role as a lobby group
on issues such as data availability and exchange will keep its relevance.
More common than GIS will be spatially enabled products. These products
will use spatial technology with databases created and maintained by using
the more traditional GIS. We already see spatial data in our phone books
and car systems, but it will increasingly be incorporated into other
products.
For example, I can imagine a small credit card that has a GPS, spatial
software and the street network in its 2 terabytes of flash RAM, and its
sole purpose in life is to give you a continuous readout of where the
closest McDonald's is, worldwide. And you get one free with every purchase
of 2 McHappy meals.
Who am I to say which course our future will take? If you melded concepts
from Frank Herbert's Chapterhouse Dune and Tolstoy's War and Peace, you
would end up with a view that there is an infinite number of paths our
future could take from this point in time, and certain powerful figures
will be trying to direct the future along a path they see or desire.
However, unless there is a large worm using some mind altering substance
to manipulate the future along one of those paths, the future could be more
influenced by small unpredictable events that we have no way of foreseeing.
We now have a lot more control over what a program does and how it does it.
The future of spatial products may be largely determined by the large
corporations that are currently involved, but then maybe some 14 year old
kid may just connect his dad's GPS to some data he got at school and the
latest Doom or Tomb Raider development kit, and come up with something
that redefines the whole spatial industry.
Robert Crossley
Trinity Software
10 Trinity Street
Parramatta Park
CAIRNS 4870
AUSTRALIA
Phone: 61-7-40314877
Fax: 61-7-40314810
email: [EMAIL PROTECTED]
web: www.trinitysoftware.com.au