On projects, the Google apps especially are great at solving the
dilemma "you cannot manage what you do not measure, and you cannot
measure what you do not see" .. but what happens when the system sees
something the client does not like?
We are using several of the toolsets from the Google world. In this
realm, project information can now be gathered from multiple sources
via cloud APIs, and the biggest impediment we have run across is not
the technology, which is very effective, but the clients' desire to
make the models, or the attached SysBIM SCADA info sets, say something
that the cloud data disproves.
i.e.: We have contracts involving construction due diligence, so we
have detailed terrain models with structures and suitable issue
attributes attached. These showed very clearly that the data
repositories of the City of Santa Fe and the State CID have huge error
problems which, taken together with (to be polite) "ancient work
practices," have cost $5M on one project alone; statewide that would
rate as a multi-million, even billion-dollar waste issue. Both CID and
the City of Santa Fe have very large investments in problematic
flat-world technology, do not take kindly to having their problems
spotlighted, and will immediately hide behind lawyers and try to pass
the buck, or worse, imitate the ostrich.
Like so many practices, it is a business-logic and people problem that
will take time to resolve, but sadly we have created a GIGO world in
which the O in the bureaucratic system will be defended to the death.
It is commonly thought that SCADA-BIM is some kind of gigantic,
complex parametric model (think of the boasting about the
supercomputer), but it is not: it is a multiple of SIMPLE systems,
such as wikis, Basecamp, and Google Earth map apps, feeding a
multidimensional model as the display piece, all verified in real time.
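To make the "multiple simple systems feeding one model" idea concrete, here is a minimal sketch in Python. The feed payloads and field names below are hypothetical stand-ins (a wiki export, a Basecamp-style milestone list, a placemark dump), not any real project's API; the point is only that the composite model is just the re-assembled sum of simple, independently verifiable feeds.

```python
import json

# Hypothetical exports from three simple systems -- placeholders, not real APIs:
# a wiki page list, a schedule/milestone feed, and a Google Earth placemark dump.
wiki_export = '{"pages": ["site-survey", "permits"]}'
schedule_export = '{"milestones": [{"name": "grading", "due": "2008-11-01"}]}'
placemark_export = '{"placemarks": [{"name": "Lot 12", "lat": 35.687, "lng": -105.938}]}'

def assemble(sources):
    """Merge each simple feed into one model dict, tagged by source.

    Each feed stays simple on its own; the 'multidimensional model' is just
    this composite view, rebuilt whenever fresh data arrives, so any one
    feed can be field-verified independently of the others.
    """
    model = {}
    for name, raw in sources.items():
        model[name] = json.loads(raw)
    return model

model = assemble({
    "wiki": wiki_export,
    "schedule": schedule_export,
    "geodata": placemark_export,
})
print(sorted(model))
```

In practice each string would come from a live HTTP fetch of the system's export API, but the assembly step stays exactly this dumb, which is the point.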
Google is doing a great job, as you have noted, and most of their
connectors work fairly seamlessly, though some require tweaking to
scale up. The issue of the GIS interface to the real world is a bit
more complex, as flat GIS data is still really only accurate to about
+/- 60 ft, which becomes problematic when related to a sphere. When the
most recent major satellite maps of NM were incorporated with the
aerial photos and maps, error rates were found to be as high as 42%,
which requires mathematical tweaking that has its own verification
issues, none of which would pass QA, either ISO or Six Sigma. When
ESRI moves its flat data to multidimensional form it uses a
proprietary system to go from flat to multidimensional, while in
today's world we are diving straight into the higher dimensions. The
big kicker is that all of this needs fast web access for every
participant at every level, plus very simple field accuracy
verification, and that is a major, major problem for GIS currently.
(Just overlay some of the City of Santa Fe GIS output data for
building blobs onto real Google Earth or NASA satellite shots and you
will see what I mean.)
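A quick way to see why "flat" data becomes problematic on a sphere: a degree of longitude shrinks with latitude, so treating lat/lng degrees as equal-sized flat units overstates east-west distances. The sketch below (my illustration, not anything from the tools mentioned above; spherical Earth, Santa Fe's latitude assumed) compares the naive flat-world distance with the great-circle (haversine) distance over one degree of longitude.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres on a spherical Earth (R = 6371 km)."""
    R = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def flat_m(lat1, lon1, lat2, lon2):
    """Naive 'flat-world' distance: treat every degree as the same length."""
    deg_m = 111_320.0  # metres per degree of latitude (approximate)
    return deg_m * math.hypot(lat2 - lat1, lon2 - lon1)

# One degree of longitude at roughly Santa Fe's latitude (~35.7 N):
lat = 35.687
true = haversine_m(lat, -105.9, lat, -104.9)
flat = flat_m(lat, -105.9, lat, -104.9)
# The flat assumption overstates this east-west span by roughly a quarter.
print(f"great-circle: {true:.0f} m, flat: {flat:.0f} m, error: {(flat - true) / true:.1%}")
```

This systematic distortion is exactly what a projection is supposed to correct; when source datasets use different (or undocumented) projections, the errors compound rather than cancel, which is how overlay mismatches like the ones described above arise.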
I think the best analogy for all these issues is to understand that
APIs used to live in a box (small or big, networked or not); now they
live in the cloud, and it is up to you to sort out the hows and the
whys. It is that last bit, I think, that has Stallman upset.
If you are interested, here is what we use in the overall construction
/ systems engineering industry for the API and protocol requirements:
http://www.ideapete.com/workinglogic.html and
http://www.ideapete.com/introparametricmodeling.htm. None require the
sole use of Revit, Pro/E, or Catia, but all the systems can integrate
at that level if needed, and all are web usable.
( : ( : pete
Peter Baston
*IDEAS*
http://www.ideapete.com/
Owen Densmore wrote:
Interesting discussion on /.
http://tech.slashdot.org/article.pl?sid=08/09/30/2146250
When we started thinking about the sys admin issues for sfx
(http://sfcomplex.org/), we had to decide on how to address free web
services. One example we had used earlier was PBWiki, a very nice
wiki engine. Another was http://www.airset.com/ .. a very nice web
community site.
The principle we evolved was simply this: we can use any SaaS
(Software as a Service) system as long as any data we put into the
system was easily available in a standard format.
This means, for example, use of Google Calendar was fine: it is an
iCal system with easily downloadable calendar data in a standard
format. Note that GMail similarly passes the test: the mail data can
easily be captured via the IMAP or POP protocols.
PBWiki failed: there was no easy way to capture the content in a
standard format .. i.e. in a wiki markup language. Ditto for airset:
the data was too difficult to extract and place into any of the
usual CMS systems (Joomla, Drupal, etc.).
On the other hand, one of our projects is looking hard at deploying
web applications on Google Maps and Google App Engine. This looks
fairly safe: the main features we are using on Google Maps are lat/lng
which transfer to other GIS systems nicely. Similarly, the App Engine
uses python and Django templates, and all of the system is developed
"off-line" then uploaded. Its only oddness is its datastore, which
maps fairly nicely onto other systems. We are also cautious about
having a Plan B .. i.e. how to extricate.
So our approach of standards and data extractability is one approach.
How do you handle this? Any horror stories? .. successes?
-- Owen
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org