On Thursday 11 Aug 2005 15:38, Dave Martin wrote:
> On Thursday 11 August 2005 14:23, Jon Berndt wrote:
> > The red and green/blue images could be registered better in
> > the post-processing phase. However, had I done that, I would
> > have had to crop the images more horizontally. I didn't feel
> > like doing that at the time. It was sort of a
> > quick-and-dirty post-processing effort. I agree, though,
> > that it would be cool to have a stereo flight simulator.
> > I've got no idea on the mechanics of the visuals though -
> > how that could be implemented.
> >
> > Jon
>
> I was looking into driving a stereo HMD thru FlightGear a
> while back (all theory - nothing practical yet).
>
> One idea I had was to produce the offset visuals using a
> dual-cpu system with 2 instances of FlightGear's
> 'out-the-window' engine running as if the cpus were 2
> networked systems doing the same. The advantage (I presumed)
> would be that the video frames would be closer to being synced
> than if 2 separate machines were used. Either 2 video cards or
> one very-high-performance one running 1 display per
> framebuffer could be used.
>
> The FDM and everything else could be run on a networked machine,
> or you could maybe step up your main system to quad cpus and
> use the same 'locally-networked' trick.
>
> Dave Martin.
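Just to illustrate the camera-offset part of that idea: each instance
would only need to shift its eyepoint half the eye separation along the
pilot head's right axis before building its view. A rough sketch in
plain C++ below - all the names are made up, this isn't FlightGear's
actual view code:

// Hypothetical sketch: each 'out-the-window' instance renders from a
// camera shifted half the interocular distance to one side of the
// pilot's head position, along the head's local right axis.
#include <cstdio>

struct Vec3 { double x, y, z; };

// Offset the head position sideways by +/- half the eye separation.
// 'right' is the unit vector pointing out of the pilot's right ear.
Vec3 eyePosition(const Vec3& head, const Vec3& right,
                 double eyeSeparation, bool leftEye)
{
    const double s = (leftEye ? -0.5 : 0.5) * eyeSeparation;
    return { head.x + s * right.x,
             head.y + s * right.y,
             head.z + s * right.z };
}

int main()
{
    const Vec3 head  = { 10.0, 2.0, 1.5 };   // pilot head position, metres (made up)
    const Vec3 right = { 0.0, 1.0, 0.0 };    // head's right axis, unit length (made up)
    const double sep = 0.065;                // roughly 65 mm interocular distance

    Vec3 l = eyePosition(head, right, sep, true);
    Vec3 r = eyePosition(head, right, sep, false);
    std::printf("left eye:  %.4f %.4f %.4f\n", l.x, l.y, l.z);
    std::printf("right eye: %.4f %.4f %.4f\n", r.x, r.y, r.z);
    return 0;
}

The hard part, as you say, is keeping the two instances' frames in
sync rather than computing the offsets themselves.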
If you have a high enough frame rate it might be easier to simply
multiplex the views from a single instance of FG, switching the
viewpoint between the two eyes on alternate frames. You'd then need to
feed the alternate frames to the left and right eyepieces in the 3D
glasses, and I dunno how they work - if they need two simultaneous
signals you might have to buffer one frame from each pair within FG so
both can be output at the same time.

LeeE
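P.S. A minimal sketch of that alternate-frame multiplexing, assuming
the glasses want both signals at once so the left frame has to be held
until the matching right one is rendered. renderView(), presentPair()
and Frame are stand-ins, not FG code:

// Hypothetical sketch of the alternate-frame idea: one FG instance
// flips the viewpoint between eyes each frame; the left frame is
// buffered until the matching right frame is ready, then both are
// handed to the output stage together.
#include <string>
#include <cstdio>

struct Frame { std::string label; };

Frame renderView(bool leftEye, int frameNo)        // stand-in renderer
{
    return { (leftEye ? "L" : "R") + std::to_string(frameNo) };
}

void presentPair(const Frame& l, const Frame& r)   // stand-in output stage
{
    std::printf("present %s + %s\n", l.label.c_str(), r.label.c_str());
}

int main()
{
    Frame pendingLeft;                  // buffered frame from the left eye
    bool haveLeft = false;

    for (int frame = 0; frame < 8; ++frame) {
        const bool leftEye = (frame % 2 == 0);     // alternate eye each frame
        Frame f = renderView(leftEye, frame);

        if (leftEye) {                  // hold the left frame for one cycle
            pendingLeft = f;
            haveLeft = true;
        } else if (haveLeft) {          // right frame completes the pair
            presentPair(pendingLeft, f);
            haveLeft = false;
        }
    }
    return 0;
}

If the glasses take a single frame-sequential signal instead, the
buffering step goes away and you just flip the eye every frame.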
