Hi,

I'm looking for a way to visualize an image (dimensions somewhere between
640x480 and 1280x960) from a calibrated camera together with a dense depth map
(estimated from a stereo pair). With this information I can compute the 3D
coordinates of the scene region that produced each pixel, and I'd like to
visualize that. Ideally, when the virtual camera is positioned where the
physical camera was, the visualization should just look like the original
image, and the 3D structure should become apparent when the camera is moved.
Does OpenSceneGraph provide a convenient way to do that? (I'm unfamiliar with
it; I'm currently looking for the right tools for this task.) If so, are there
examples close to what I want to do? I've looked at the osgpointsprite
example, which seems roughly appropriate once I take the blending out and
re-enable the depth test; however, the sprites don't change size when the
camera is moved. Is that changeable?
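
For concreteness, here is roughly what I have in mind, as a minimal, untested
sketch: back-project each pixel through the pinhole intrinsics (fx, fy, cx, cy
stand in for my calibration), put the points into an osg::Geometry, and rely
on osg::Point's distance attenuation in the hope that it makes the rendered
point size vary with camera distance (I'm not sure that's the intended
mechanism, which is part of my question):

#include <osg/Geode>
#include <osg/Geometry>
#include <osg/Point>
#include <osg/PointSprite>
#include <vector>

// Build a coloured point cloud from a dense depth map and a calibrated
// pinhole camera. depth and color are row-major, width*height entries.
osg::ref_ptr<osg::Node> makePointCloud(const std::vector<float>& depth,
                                       const std::vector<osg::Vec4ub>& color,
                                       int width, int height,
                                       double fx, double fy,
                                       double cx, double cy)
{
    osg::ref_ptr<osg::Vec3Array>   vertices = new osg::Vec3Array;
    osg::ref_ptr<osg::Vec4ubArray> colors   = new osg::Vec4ubArray;

    for (int v = 0; v < height; ++v)
    {
        for (int u = 0; u < width; ++u)
        {
            const float z = depth[v * width + u];
            if (z <= 0.0f) continue;            // skip invalid depths

            // Pinhole back-projection: pixel (u, v) at depth z gives a
            // 3D point in the camera frame.
            vertices->push_back(osg::Vec3((u - cx) * z / fx,
                                          (v - cy) * z / fy,
                                          z));
            colors->push_back(color[v * width + u]);
        }
    }

    osg::ref_ptr<osg::Geometry> geom = new osg::Geometry;
    geom->setVertexArray(vertices.get());
    geom->setColorArray(colors.get());
    geom->setColorBinding(osg::Geometry::BIND_PER_VERTEX);
    geom->addPrimitiveSet(new osg::DrawArrays(osg::PrimitiveSet::POINTS,
                                              0, vertices->size()));

    osg::ref_ptr<osg::Geode> geode = new osg::Geode;
    geode->addDrawable(geom.get());

    // Let the rendered point size depend on the distance to the eye, so
    // the points grow as the camera approaches; the attenuation
    // coefficients here are guesses.
    osg::StateSet* ss = geode->getOrCreateStateSet();
    osg::ref_ptr<osg::Point> point = new osg::Point;
    point->setSize(2.0f);
    point->setDistanceAttenuation(osg::Vec3(0.0f, 0.0f, 0.05f));
    ss->setAttributeAndModes(point.get(), osg::StateAttribute::ON);

    // Optional: render the points as textured sprites, as in the
    // osgpointsprite example.
    ss->setTextureAttributeAndModes(0, new osg::PointSprite,
                                    osg::StateAttribute::ON);

    return geode;
}

The idea would then be to hand the returned node to an osgViewer::Viewer via
setSceneData() and place the viewer's camera at the physical camera's pose.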

Best regards,
Stefan
