[OSGeo-Discuss] REGARD 2008: Call for papers & Tyler Mitchell (OSGeo) to be keynote speaker !!!

2008-05-25 Thread Thierry Badard

Hello,

Just some updates about the 2nd International Workshop on Mobile
Geospatial Augmented Reality (REGARD 2008), which will be held in Quebec
City on August 28-29, 2008...


1) Do not hesitate to propose a paper; submissions are accepted until
June 9, 2008!


2) Tyler (Mitchell) will be one of the three keynote speakers (thanks
again, Tyler! ;-)). So, open source geospatial technologies will be well
represented at this event at the crossroads of mobile geospatial
technologies, augmented reality, location-based games and mobile
education. It will help improve the visibility of OSGeo in the academic
and (non-geospatial) industrial domains.


Come and join us at Laval University in Quebec City, which celebrates
its 400th anniversary this year!


Th.

-

2nd International Workshop on Mobile Geospatial Augmented Reality
(REGARD 2008)

http://regard.crg.ulaval.ca
Laval University, Quebec City (Quebec), Canada
August 28-29, 2008



*Keynote speakers*

We are proud to announce that Tyler Mitchell, executive director of the
Open Source Geospatial Foundation (OSGeo, http://www.osgeo.org), will be
one of the three keynote speakers at the event! To learn more about his
presentation, go to http://regard.crg.ulaval.ca/2008/index.php?id=15.


*Aims and scope*

Augmented Reality (AR) is a means of blending computer-generated objects
or labels with reality so that both appear to be part of your natural
environment. AR is beginning to mature as a subject field, with
applications moving from pure academic research into industrial and
potential consumer areas. In recent years, geographic data representing
real-world features has increasingly been used in AR applications. In
addition, geospatial technologies have established many new services and
applications, including navigation, decision support and modelling of
the surrounding environment, from which mobile AR and location-aware
computing can now benefit to generate compelling spatiotemporal
applications.

The International Workshop on Mobile Geospatial Augmented Reality aims
to bring together researchers, developers, users and practitioners
carrying out research and development in this field. The workshop will
provide a forum for original research contributions and practical
experiences in mobile AR, geospatial technologies and geoinformatics,
and mobile games, fostering interdisciplinary discussion across these
three fields, and will highlight future trends in this area. The
workshop will be organized to promote networking among participants, to
initiate and encourage discussion of cutting-edge technologies in the
field, to exchange research ideas and to promote international
collaboration.


*Topics of interest*

We invite submissions that address theoretical, technical and practical
topics related to mobile geospatial augmented reality. Suggested topics
include, but are not limited to, the following:

1. Geospatial information and geoinformatics
  - 3D spatial modeling
  - Geovisualization
  - Geospatial Service Oriented Architectures and systems for mobile
distributed computing
  - Context-aware mobility and LBS
  - Geo-sensors and Sensor Web

2. Mobile augmented reality
  - Acquisition of 3D scene descriptions
  - Real-time and photorealistic rendering
  - Vision-based registration, object overlay and spatial layout
techniques
  - Display and view management
  - Interaction techniques

3. Mobile games
  - Location based games
  - Spatial data integration and 3D game engines
  - Mobile learning and mobile edutainment
  - Augmented and mixed reality in mobile games
  - Mobile gaming experience and gaming activities


*Important dates*

The workshop will be held on the 28th and 29th of August 2008. Here are
the important dates for the workshop:

1. Paper abstracts due:        June 02, 2008
2. Full papers due:            June 09, 2008
3. Notification of acceptance: July 01, 2008
4. Registration:               July 15, 2008
5. Final paper version due:    July 15, 2008
6. Workshop:                   August 28-29, 2008


*Instructions for authors*

The proceedings are expected to be published by Springer in the Lecture
Notes in Geoinformation and Cartography (LNG&C) series (see
http://www.springer.com/series/7418). The decision is currently pending.
Authors must submit full papers in English according to the Springer
formatting guidelines
(http://regard.crg.ulaval.ca/2008/UserFiles/File/instruct-authors-e.pdf).

The templates (LaTeX or Word) for preparing full papers can be
downloaded here:

 - Download the Word template
   (http://regard.crg.ulaval.ca/2008/UserFiles/File/T1-book.zip)
 - Download the LaTeX template
   (http://regard.crg.ulaval.ca/2008/UserFiles/File/svmult.zip)

Nevertheless, *full papers must be submitted in PDF format*!

The page limit for full papers is 12 pages. Manuscripts not submitted in
the provided style or exceeding the page limit will not be reviewed.
Re: [OSGeo-Discuss] Should OSGeo get involved in the Information Architecture realm and nurture the development of definitive spatial ontologies?

2008-05-25 Thread Miles Fidelman

Bruce Bannerman wrote:

> We need robust debate on these types of issues if we are to progress
> them.

Ok.. let's try this :-)

> I see that there are two main ways of utilising spatial information:
>
> - producing a pretty picture that helps people understand an issue. We
> have a number of types of products that fall in this realm, including
> Google Maps, Google Earth, Virtual Earth, Slippy Maps etc.
>
> - as an input into structured analysis that is used as an aid to
> answering a particular question and also as an aid to exploring
> inter-relationships between spatial, business, scientific data etc. The
> output from this analysis could be a 'map', but of equal relevance it
> could be in tabular, graphical or textual form. This is the realm of
> traditional spatial analysis, image analysis or a range of spatial
> products that I like to term 'Spatial Intelligence Frameworks' e.g.
> Cohga's Weave, NGIS' GeoSamba, ESRI Australia's Eview.
I don't think this dichotomy holds up under close scrutiny. I don't see
much difference in the cognitive processes or computing tools involved
in "producing a pretty picture that helps people understand an issue"
and "structured analysis as an aid to answering a particular question"
or "exploring inter-relationships between ... data." Isn't "structured
analysis" part of producing any form of useful presentation? In general,
it's HARDER to organize and present an issue to an audience that is not
already familiar with its intricacies.


This implies that one needs more powerful tools, and more flexible data 
representations, to produce pretty pictures than to simply perform a 
specialized analysis.  A specialized analysis is amenable to a 
specialized tool.  The broader the range of analyses one wants to 
perform, and the broader the range of presentations that one might want 
to use to illustrate an issue, the MORE powerful and flexible the tools 
one needs - even more so if one wants to provide interactive 
capabilities to the audience of the "pretty picture."


Tools that support breadth, depth, and flexibility, coupled with 
ease-of-use and a touch of elegance, are far harder to build than those 
that support more narrowly scoped problems.  As a simple example: yes 
you can produce pretty pie charts using a drawing program, and you can 
perform incredibly powerful statistical analyses using SPSS or 
Mathematica, but you can address a far larger set of problems using a 
spreadsheet with graphics capabilities, particularly if the spreadsheet 
can tap into SQL databases, and you have a library of specialized macros 
available.
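
By way of a rough sketch (the data, table and column names below are
invented for illustration), even a few lines of Python against a SQL
database show the kind of general-purpose reach I mean:

    import sqlite3

    # Hypothetical example: a small table of rainfall observations.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE rainfall (station TEXT, year INTEGER, mm REAL)")
    conn.executemany("INSERT INTO rainfall VALUES (?, ?, ?)",
                     [("Brisbane", 2006, 1020.0),
                      ("Brisbane", 2007, 885.5),
                      ("Melbourne", 2006, 540.2),
                      ("Melbourne", 2007, 610.8)])

    # The same generic query engine answers many ad hoc questions;
    # here, average annual rainfall per station.
    for station, avg_mm in conn.execute(
            "SELECT station, AVG(mm) FROM rainfall GROUP BY station"):
        print(station, round(avg_mm, 1))

The point is not this particular query but that one flexible tool,
coupled to a general data store, covers an open-ended range of questions
and presentations.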



> Throw into this the big picture issues that we are facing, e.g. Climate
> Change, Water Shortage (in Australia) etc that require analysis at a
> continental or global scale and we have a big problem.
>
> How can we as an industry help this work to progress quickly with
> minimal impact on the analysis, minimal double handling of data and in
> many cases the use of dynamic data from multiple sources?



> In the end, I suspect that we will need community driven involvement to
> get it right. Communities of practice (like the geoscience community)
> will need to work together to develop *their* profiles describing
> *their* data.
>
> Is it an OSGeo responsibility? Probably not. I take the point of your
> earlier email that OSGeo is predominantly about OS software.



> When you consider the analysis requirement for spatial data, I suspect
> that we as an industry may be heading in the wrong direction.
>
> Some of the issues that are attracting a lot of effort are about
> simplifying spatial data (GeoRSS, GeoJSON, BXFS etc). These appear to be
> about catering to the 'pretty picture' use of spatial information.
I'm sort of driven to the opposite conclusion. The more that data
profiles are developed by specialized communities, the less likely it is
that different data sets will be amenable to combination and correlation
in support of complex, cross-discipline issues such as climate change.


In one direction lies the need for anyone working on a complicated
problem to understand in great detail all the overlapping disciplines
that might be involved. In the other direction lies the framing of
higher levels of abstraction that allow examination of different types
of ordering and interactions.


The example that comes to mind is systems engineering (my own 
discipline, as it turns out).  Yes, a systems engineer has to understand 
quite a bit about all the disciplines involved in building a system (or 
these days, a system-of-systems).  If you're building an aircraft, you'd 
better understand a lot about aeronautics, avionics (including hardware, 
real-time software environments, specific algorithms), and so forth.  
But the discipline involves understanding interactions and tradeoffs, at 
a higher level.  It's been a long time since I've written a large 
program, or designed hardware - and I haven't kept up with the 
intricacies of today's development tools - but makin

Re: [OSGeo-Discuss] Should OSGeo get involved in the Information Architecture realm and nurture the development of definitive spatial ontologies?

2008-05-25 Thread Bruce Bannerman
Hi Jo,

Thank you for your considered reply (...and no, I don't consider it
trollish   ;-)   )


We need robust debate on these types of issues if we are to progress
them.


OK, I'll try and put some more context on the original query.


I see that there are two main ways of utilising spatial information:

- producing a pretty picture that helps people understand an issue. We
have a number of types of products that fall in this realm, including
Google Maps, Google Earth, Virtual Earth, Slippy Maps etc.

- as an input into structured analysis that is used as an aid to
answering a particular question and also as an aid to exploring
inter-relationships between spatial, business, scientific data etc. The
output from this analysis could be a 'map', but of equal relevance it
could be in tabular, graphical or textual form. This is the realm of
traditional spatial analysis, image analysis or a range of spatial
products that I like to term 'Spatial Intelligence Frameworks' e.g.
Cohga's Weave, NGIS' GeoSamba, ESRI Australia's Eview. 


I fall into the second camp and try to implement systems that help end
users to explore and better utilise their data.


For effective analysis to be undertaken, you need to understand your
data and ensure that there are appropriate aspatial attributes to query
and analyse to find an answer to your problem.

While this is relatively straightforward for project work where you
control the data capture and QA processes, it becomes very messy as soon
as you try to take advantage of data captured by other people and
organisations.

Typically we find that another organisation has captured data describing
the same geographic phenomena for a different purpose and has modelled
the data differently, with different fields and data types. This costs
time and effort in massaging the data into a format that we can use, and
forces compromises in what can be considered an acceptable outcome.
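
To make that concrete, the sketch below shows the kind of translation
layer this tends to require before any analysis can start (field names,
code lists and units are purely illustrative):

    # Illustrative only: remap another agency's vegetation records onto
    # the schema our analysis expects. Field names and code lists are
    # hypothetical.
    THEIR_TO_OUR_CLASS = {"EUC_OPEN": "open_forest",
                          "EUC_CLOSED": "closed_forest",
                          "GRASS": "grassland"}

    def harmonise(record):
        """Translate one of their records into our schema."""
        return {
            "feature_id": record["OBJECTID"],
            "veg_class": THEIR_TO_OUR_CLASS.get(record["VEGCODE"], "unknown"),
            # They store canopy height in feet; we work in metres.
            "canopy_height_m": round(record["CAN_HT_FT"] * 0.3048, 2),
            # Differing date formats are a further headache (not handled here).
            "survey_date": record["SURVEYED"],
        }

    theirs = {"OBJECTID": 42, "VEGCODE": "EUC_OPEN",
              "CAN_HT_FT": 65.0, "SURVEYED": "12/03/2005"}
    print(harmonise(theirs))

Every new source means another mapping like this, and every mapping
embeds assumptions and compromises.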


Throw into this the big picture issues that we are facing, e.g. Climate
Change, Water Shortage (in Australia) etc that require analysis at a
continental or global scale and we have a big problem.

How can we as an industry help this work to progress quickly with
minimal impact on the analysis, minimal double handling of data and in
many cases the use of dynamic data from multiple sources?


This is the context in which I made my original post.



As I discussed, I think that the geoscience community is showing us a
potential way forward with their community work developing the GeoSciML
profile. Anyone who has worked with geological data will appreciate the
magnitude of their accomplishments to date. This includes a way of
describing one of the most abstract types of spatial data in a
consistent way that can be understood by people of different cultures
and different languages.


This has taken the community four to five years of considerable effort
to develop to its current state.

How do we get consistent schemas / ontologies / profiles for other
spatial phenomena?


You are right in that it could be a GSDI responsibility. It could also
be an Enterprise Architecture responsibility (e.g. FEA Data Reference
Model).

In the end, I suspect that we will need community driven involvement to
get it right. Communities of practice (like the geoscience community)
will need to work together to develop *their* profiles describing
*their* data. 

Is it an OSGeo responsibility? Probably not. I take the point of your
earlier email that OSGeo is predominantly about OS software.


Is this an issue that OSGeo can help with?  Possibly.



When you consider the analysis requirement for spatial data, I suspect
that we as an industry may be heading in the wrong direction. 

Some of the issues that are attracting a lot of effort are about
simplifying spatial data (GeoRSS, GeoJSON, BXFS etc). These appear to be
about catering to the 'pretty picture' use of spatial information.


I'm regularly seeing serious efforts to address the analysis use of
spatial data (e.g. GML 3 and complex features) ridiculed.
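
To make the contrast concrete, here is a rough sketch in Python
(attribute names are invented; this is neither real GeoJSON tooling nor
an actual GML/GeoSciML schema). A 'pretty picture' feature is
essentially a geometry plus a handful of flat properties, while an
analysis-oriented complex feature carries nested, typed structure and
references to controlled vocabularies:

    # Illustrative contrast only; attribute names are invented.

    # 'Pretty picture' style: a geometry plus a few flat display
    # properties (this dict follows the GeoJSON Feature structure).
    simple_feature = {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [153.03, -27.47]},
        "properties": {"name": "Borehole 7", "depth_m": 120},
    }

    # Analysis style: nested, typed structure of the kind GML 3 complex
    # features (or a community profile such as GeoSciML) are meant to carry.
    complex_feature = {
        "gml_id": "bh.7",
        "samplingLocation": {"srsName": "EPSG:4326",
                             "pos": [-27.47, 153.03]},  # lat/lon axis order
        "loggedIntervals": [
            {"from_m": 0.0, "to_m": 35.0,
             "lithology": {"code": "sandstone",
                           "vocabulary": "urn:example:lithology"}},
            {"from_m": 35.0, "to_m": 120.0,
             "lithology": {"code": "shale",
                           "vocabulary": "urn:example:lithology"}},
        ],
        "observedProperties": {"waterTable_m": 18.5,
                               "measurementProtocol": "urn:example:protocol:42"},
    }

The first is ideal for putting a dot on a slippy map; it is the second
kind of structure that serious cross-organisation analysis depends on.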


I'm not saying that there is no use for the pretty pictures. There
certainly is, and Google in particular is catering to this very well,
increasing the awareness of spatial information amongst decision makers
and the public alike.

Meanwhile 2050 is fast approaching, if we are to believe the climate
change predictions.



Bruce Bannerman