Hi Andy,
I'm a bit cold on the topic of HMDs, partly because I'm still sleepy
this morning, and partly because I haven't yet had the pleasure of
personally working with an HMD.
I have done lots of stereo work, but all with an external display
surface, such as monitors, powerwalls, reality centres, immersive
workbenches etc. HMDs differ from all these systems in how stereo
needs to be set up. External display surfaces require you to
calibrate to the distance and position of the viewer from the display
surface, as the virtual parallax distance (the fusion distance in the
OSG setup, which is in model coords) must be calibrated to the
distance of the display surface from the viewer.
However, HMDs are very different: there isn't a single display on
which the images appear, but two separate displays. This means that
instead of having to shear the projection matrix to account for the
external display, you actually have two fixed projection matrices.
The amount of shearing of the projection matrices depends on the eye
separation and the screen distance, so in the case of HMDs the screen
distance should be irrelevant.
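To make the shear concrete, here is a minimal sketch of the standard
off-axis stereo idea (not OSG's internal code - the function name and
parameters are illustrative). For an external display the frustum is
offset horizontally at the near plane in proportion to the eye offset
and inversely with the screen distance; for an HMD each eye's display
is centred on that eye, so this shear term is simply zero:

```cpp
#include <cassert>
#include <cmath>

// Horizontal frustum offset at the near plane for one eye when
// projecting onto a single shared external display.
// eyeOffset: +half the eye separation for the right eye, -half for
// the left, in the same physical units as screenDistance.
// The shear shrinks as the screen moves further away; for an HMD,
// with a display per eye, each eye keeps a fixed unsheared frustum.
double frustumShear(double eyeOffset, double screenDistance, double zNear)
{
    return -eyeOffset * zNear / screenDistance;
}
```

For example, with a 0.06m eye separation (0.03m per eye), a screen
1m away and a 0.1 near plane, each frustum is offset by 0.003 units
at the near plane.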
This leaves what should happen to the view matrix. For both HMDs and
external display surface systems you still need to translate the view
matrix for each eye. The amount is half the inter-ocular distance,
scaled by 1.0 for a 1:1 mapping between the physical world and the
virtual world, or by some other scale for god's-eye or ant/molecular
eye views. This is where the fusion distance comes in: it's the
distance in the virtual world that is mapped to the screen distance
to get the right eye separation. For the drive and flight
manipulators this distance is kept constant, but the trackball and
terrain manipulators use the distance between the eye point and the
point of rotation as the fusion distance - this keeps the point of
interest glued to the display when you have an external display
surface. For the case of an HMD this won't map identically, but you
should probably set the screen distance to help scale the eye offset
to a natural distance - though this will be beyond the actual
displays.
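The scaling described above can be sketched as follows (again
illustrative, with assumed names, rather than OSG's exact code): the
ratio fusionDistance/screenDistance converts the physical
half-separation into model coordinates, so a large fusion distance
gives a god's-eye separation and a small one an ant's-eye separation:

```cpp
#include <cassert>
#include <cmath>

// Translation applied to the view matrix for one eye, in model
// coordinates. eyeSeparation and screenDistance are physical
// quantities (e.g. metres); fusionDistance is in model units, so
// the ratio fusionDistance/screenDistance rescales the physical
// half-separation to the virtual world's scale.
double eyeTranslation(double eyeSeparation, double fusionDistance,
                      double screenDistance)
{
    return 0.5 * eyeSeparation * fusionDistance / screenDistance;
}
```

With eyeSeparation = 0.06 and screenDistance = 1.0, a fusion distance
of 1.0 gives a 0.03 offset (1:1 mapping), while a fusion distance of
100.0 gives a 3.0 offset - the god's-eye case.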
Robert.
On 12/15/05, Andy Lundell <[EMAIL PROTECTED]> wrote:
I'm trying to set up stereo in my application. Primarily this would be used
with HMD's, but anaglyph mode is handy for testing at my desk.
My problem is that the eye separation is not working at all the way I expect
it to. The two view points are much farther apart than I expect.
Here is the code I'm using to set up my stereo
---
if (!pDS) pDS = osg::DisplaySettings::instance();
if (Shared->stereo)
{
    pDS->setStereo( true );
    pDS->setStereoMode( (osg::DisplaySettings::StereoMode)STEREO_MODE );
    pDS->setDisplayType( osg::DisplaySettings::HEAD_MOUNTED_DISPLAY );
    pDS->setScreenDistance( 1.0f );          // unit value
    pDS->setEyeSeparation( Iod * 0.0254f );  // inches -> metres (2.54 cm/in)

    float fusion = Converge * (farclip - nearclip);
    sceneView->setFusionDistance( osgUtil::SceneView::USE_FUSION_DISTANCE_VALUE,
                                  fusion * 0.0254f ); // inches -> metres
}
sceneView->setDisplaySettings( pDS );
sceneView->setDisplaySettings( pDS );
---
I'm assuming that if I set fusion to the farclip value for testing purposes,
the two viewpoints should be roughly parallel (Converge = 1.0). My simulation
units are inches, so I've got Iod set to 2.0. The documentation mentions
meters, so I threw in that conversion (but I've tried it without it, too).
I wind up with an image that looks like the two eyes are placed several feet
apart.
Any hints as to where I'm messing this up would be appreciated. Thank you.
-Andy
_______________________________________________
osg-users mailing list
[email protected]
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/