Hi,

I'm developing an augmented-reality app in which I wish to overlay
real-world data, specifically hiking trails from the OpenStreetMap
project, on the phone's camera feed.

I have this more or less working by stacking the camera's SurfaceView
on top of the GLSurfaceView (yes, that way round, as documented by
several people on the web). However, the two layers aren't always both
visible, and in any case I've seen posts here suggesting that
overlaying two SurfaceViews on each other isn't supported.
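For reference, the flavour of setup I mean is roughly the sketch below
(heavily trimmed; CameraPreviewView and TrailRenderer are just
placeholder names for my own classes, and the z-order/translucency
calls are the ones the various posts suggest):

    // Inside the Activity's onCreate(): camera preview SurfaceView,
    // with a translucent GLSurfaceView composited above it.
    FrameLayout layout = new FrameLayout(this);

    SurfaceView preview = new CameraPreviewView(this); // placeholder: my preview wrapper
    layout.addView(preview);

    GLSurfaceView glView = new GLSurfaceView(this);
    glView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);  // ask for an alpha channel
    glView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
    glView.setZOrderMediaOverlay(true);             // GL surface above the preview surface
    glView.setRenderer(new TrailRenderer());        // renderer clears with glClearColor(0,0,0,0)
    layout.addView(glView);

    setContentView(layout);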

What I'm struggling to find is an alternative approach. The method
that springs to mind is to draw the OpenGL layer without a
GLSurfaceView (e.g. from an ordinary View), but can that be done?
Another option is to feed the camera preview into the GLSurfaceView as
a texture, but that seems rather long-winded; if there is a simpler
alternative I'd prefer to take it.
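For completeness, the texture route I have in mind is something like
the fragment below (a sketch only, which is partly why it feels
long-winded: it assumes fields glView, textureId, previewWidth and
previewHeight, plus a hypothetical convertYuvToRgb565() helper, and
the renderer must already have allocated a power-of-two texture of at
least the preview size):

    // Push each camera preview frame into a GL texture on a 2.1-era API.
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        public void onPreviewFrame(byte[] data, Camera camera) {
            // Preview frames arrive as NV21 (YCbCr 4:2:0) byte arrays.
            final short[] rgb = convertYuvToRgb565(data, previewWidth, previewHeight);
            glView.queueEvent(new Runnable() {
                public void run() {
                    GLES10.glBindTexture(GLES10.GL_TEXTURE_2D, textureId);
                    GLES10.glTexSubImage2D(GLES10.GL_TEXTURE_2D, 0, 0, 0,
                            previewWidth, previewHeight, GLES10.GL_RGB,
                            GLES10.GL_UNSIGNED_SHORT_5_6_5, ShortBuffer.wrap(rgb));
                }
            });
            glView.requestRender(); // assumes RENDERMODE_WHEN_DIRTY
        }
    });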

On a related issue with the same app: on Android 2.1 and below there
is no API for querying the camera's field-of-view parameters, focal
length, and so on. The specification for the HTC Hero doesn't list
them either, though from field measurements I estimate its horizontal
field of view at just under 40 degrees. It seems that for 2.1 and
below one would have to use some sort of table mapping phone models to
these parameters, but googling for such a table is drawing blanks. Is
anyone aware of a list of models and their camera FOV parameters?
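In case it helps frame the question, what I have in mind is nothing
more sophisticated than the sketch below (the numbers and model
strings are illustrative apart from my rough Hero estimate); on 2.2
and later I believe Camera.Parameters.getHorizontalViewAngle() and
getFocalLength() could be used instead of the table:

    import java.util.HashMap;
    import java.util.Map;
    import android.os.Build;

    /** Rough sketch: per-model horizontal FOV lookup for pre-2.2 devices. */
    public class FovTable {
        private static final float DEFAULT_FOV_DEG = 50f; // guess for unknown models
        private static final Map<String, Float> FOV_BY_MODEL = new HashMap<String, Float>();
        static {
            // Illustrative entries; check the exact Build.MODEL string per device.
            FOV_BY_MODEL.put("HTC Hero", 38f); // my own field estimate
            // FOV_BY_MODEL.put(..., ...);     // more entries as data turns up
        }

        public static float horizontalFovDegrees() {
            Float fov = FOV_BY_MODEL.get(Build.MODEL);
            return fov != null ? fov.floatValue() : DEFAULT_FOV_DEG;
        }
    }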

Thanks,
Nick
