Torben,

Thanks for addressing my questions and providing that information.

We've relied on the projectors and their associated software (Mersive) to do
the distortion correction and edge blending, but a cheaper approach (in
monetary terms) is always worth investigating...

Thanks,
-Shayne

-----Original Message-----
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Torben
Dannhauer
Sent: Tuesday, August 21, 2012 4:15 AM
To: osg-users@lists.openscenegraph.org
Subject: Re: [osg-users] post-rendering warping and off-axis projection

Hi Shayne,



> On the distortion correction, is there really any need for pixel-level
> distortion? That would seem very expensive. I would think that
> textureCoordinate distortion is more than sufficient with a reasonable
> mesh resolution. I haven't seen the need for pixel-level distortion in a
> spherical dome display.


distortionNG supports vertex distortion and texCoord distortion; pixel-wise
distortion is planned but not yet realized because we do not have a use case
for it. Theoretically it is the most flexible distortion, but it defeats many
of OpenGL's image-quality enhancements such as anti-aliasing and makes the
image noisy, so you need to apply a filter in a post-rendering stage. It could
be realized in shaders, as is done in the current (old) osgVisual distortion
module; its performance is acceptable.
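
For illustration, here is a minimal sketch of how such a pixel-wise lookup
could be set up with the OSG shader classes. The texture unit assignments and
the layout of the lookup texture are assumptions for this example, not the
actual osgVisual implementation:

#include <osg/Program>
#include <osg/Shader>
#include <osg/StateSet>
#include <osg/Uniform>

// Fragment shader: for every output pixel, fetch the warped source
// coordinate from a lookup texture and sample the rendered scene there.
static const char* distortFragSource =
    "uniform sampler2D sceneTex;\n"
    "uniform sampler2D lookupTex;\n"
    "void main()\n"
    "{\n"
    "    vec2 src = texture2D(lookupTex, gl_TexCoord[0].st).rg;\n"
    "    gl_FragColor = texture2D(sceneTex, src);\n"
    "}\n";

osg::StateSet* createPixelDistortionStateSet()
{
    osg::Program* program = new osg::Program;
    program->addShader(new osg::Shader(osg::Shader::FRAGMENT, distortFragSource));

    osg::StateSet* stateSet = new osg::StateSet;
    stateSet->setAttributeAndModes(program, osg::StateAttribute::ON);
    stateSet->addUniform(new osg::Uniform("sceneTex", 0));   // unit 0: rendered scene
    stateSet->addUniform(new osg::Uniform("lookupTex", 1));  // unit 1: per-pixel warp coords
    return stateSet;
}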

With vertex (mesh) distortion and texCoord distortion you can achieve
identical results, but I personally prefer vertex distortion for the
following reason: if you configure distortionNG to display the distortion
mesh, you can use that mesh to set up and check each channel's correct
alignment, whereas with texCoord distortion you cannot do that.
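
To make the two variants concrete, here is a minimal sketch of a distortion
mesh built as an osg::Geometry. The warp() callback and the grid resolution
are placeholders for whatever the calibration delivers; this is not the
distortionNG code itself:

#include <osg/Geometry>
#include <osg/Texture2D>

// warp() is a hypothetical callback returning the distorted 2D position of a
// regular grid point (u, v in [0,1]); in a real setup it comes from calibration.
osg::Geometry* createDistortionMesh(osg::Texture2D* sceneTexture,
                                    int cols, int rows,
                                    osg::Vec2 (*warp)(float u, float v),
                                    bool distortVertices)
{
    osg::Vec3Array* vertices = new osg::Vec3Array;
    osg::Vec2Array* texcoords = new osg::Vec2Array;

    for (int r = 0; r < rows; ++r)
    {
        for (int c = 0; c < cols; ++c)
        {
            float u = c / float(cols - 1);
            float v = r / float(rows - 1);
            osg::Vec2 w = warp(u, v);
            if (distortVertices)
            {
                // Vertex (mesh) distortion: move the grid points, keep texcoords regular.
                vertices->push_back(osg::Vec3(w.x(), w.y(), 0.0f));
                texcoords->push_back(osg::Vec2(u, v));
            }
            else
            {
                // TexCoord distortion: keep the grid regular, warp the texture lookup.
                vertices->push_back(osg::Vec3(u, v, 0.0f));
                texcoords->push_back(w);
            }
        }
    }

    osg::Geometry* geom = new osg::Geometry;
    geom->setVertexArray(vertices);
    geom->setTexCoordArray(0, texcoords);
    geom->getOrCreateStateSet()->setTextureAttributeAndModes(0, sceneTexture);

    // One triangle strip per grid row.
    for (int r = 0; r < rows - 1; ++r)
    {
        osg::DrawElementsUInt* strip = new osg::DrawElementsUInt(GL_TRIANGLE_STRIP);
        for (int c = 0; c < cols; ++c)
        {
            strip->push_back((r + 1) * cols + c);
            strip->push_back(r * cols + c);
        }
        geom->addPrimitiveSet(strip);
    }
    return geom;
}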


> Does osgVisual support an interface for providing asymmetric viewing
> frustums for each channel? What I mean by this is can I specify the
> clipping bounds (i.e. rotation, widening or shrinking) for each
> channel frustum so that I can effectively tile the channels in a dome
> display optimally?


You can apply any projection matrix OSG/OpenGL allows; it does not introduce
any restrictions.
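
For reference, a minimal sketch of what that looks like with the plain
osg::Camera interface; the bounds below are illustrative values, not from
osgVisual:

#include <osg/Camera>
#include <osg/Matrixd>

void applyChannelProjection(osg::Camera* camera)
{
    // Asymmetric (off-axis) clipping bounds at the near plane:
    // left, right, bottom, top, zNear, zFar.
    camera->setProjectionMatrixAsFrustum(-0.15, 0.05, -0.08, 0.12, 0.1, 10000.0);

    // Or set any explicit 4x4 projection matrix directly:
    // camera->setProjectionMatrix(
    //     osg::Matrixd::frustum(-0.15, 0.05, -0.08, 0.12, 0.1, 10000.0));
}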


> How does osgVisual deal with edge blending of the channels or do you
> rely on the projectors to do this?


The blending is performed in the distortion modules; the projectors are COTS
devices. Only better color management could be useful, to ensure an identical
color temperature on each channel.
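
As an illustration of what such a blend map can look like, here is a minimal
sketch that fills a single-channel osg::Image with a smooth, gamma-compensated
ramp across one overlap edge. The falloff function, overlap width and gamma
value are assumptions for this example, not what distortionNG actually uses:

#include <osg/Image>
#include <cmath>

osg::Image* createEdgeBlendMap(int width, int height, int overlapPixels, float gamma)
{
    const float kPi = 3.14159265f;
    osg::Image* image = new osg::Image;
    image->allocateImage(width, height, 1, GL_LUMINANCE, GL_UNSIGNED_BYTE);

    for (int y = 0; y < height; ++y)
    {
        unsigned char* row = image->data(0, y);
        for (int x = 0; x < width; ++x)
        {
            // Full brightness outside the overlap, cosine falloff inside it.
            float w = 1.0f;
            if (x < overlapPixels)
                w = 0.5f * (1.0f - std::cos(kPi * x / float(overlapPixels)));

            // Pre-compensate for the projector gamma so the summed light of two
            // overlapping channels stays roughly constant across the blend zone.
            row[x] = (unsigned char)(255.0f * std::pow(w, 1.0f / gamma));
        }
    }
    return image;
}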


-- So why is distortionNG still marked as experimental if it works like
a charm?!
Maybe I should explain the current situation:

The above-mentioned 6-channel helicopter simulator uses a proprietary
commercial setup system: it is camera-based, and its output is the blend map,
the projection matrix and the distortion matrix used by distortionNG. It
works like a charm, but it is very expensive.

For the academic usage osgVisual aims at, a cheaper solution should be
available. Therefore I'm also working on a mouse-based setup method. This is
very easy for vertex/texCoord distortion setup, but it is not so easy for the
projection matrix or the blend map. The manual setup mode is still under
construction.

The manual setup should be done in the following order (not yet finished, it
may contain errors):

* Position the projectors to cover every desired square inch of the screen,
and ensure that there is a reasonable overlap between the projectors to use
for blending purposes.
* In each channel, display the distortion mesh, align it with the mouse and
set up a regular mesh structure across all channels on the whole screen.
Ensure that each distorted channel covers a regular, ideally rectangular area
from the simulator eyepoint's perspective; this simplifies the design of the
projection matrix.
* Not yet solved because problematic: the blend maps must be "designed"
interactively with the mouse. The blend function between black and white does
not necessarily have to be linear, but maybe it can be simplified by treating
it as linear. The black blending edges are simple: they are the edges of the
channel's projector (blend maps are not distorted!). The inner (white)
blending edges have to be defined by mouse. If required, a non-linear blend
function must be definable.
* Not yet solved because problematic: the projection matrix has to be
defined. To set it up correctly, the position of each channel relative to the
simulator's eyepoint must be determined; a cheap theodolite may be required.
Finally, the projection matrix has to be defined from these measured angles
relative to the eyepoint (see the sketch after this list). How to do this
interactively I have no clue (yet) :)
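
As a rough sketch of that last step (assuming the edge angles have been
measured relative to the channel's viewing axis; the struct and function
names are just placeholders for this example):

#include <osg/Camera>
#include <osg/Math>
#include <cmath>

// Hypothetical container for the edge angles of one channel, measured from the
// simulator eyepoint relative to the channel's viewing axis (e.g. with a theodolite).
struct ChannelAngles
{
    double leftDeg, rightDeg, bottomDeg, topDeg;
};

void setFrustumFromAngles(osg::Camera* camera, const ChannelAngles& a,
                          double zNear, double zFar)
{
    // Each frustum bound is the tangent of the measured angle, scaled to the near plane.
    camera->setProjectionMatrixAsFrustum(
        zNear * std::tan(osg::DegreesToRadians(a.leftDeg)),
        zNear * std::tan(osg::DegreesToRadians(a.rightDeg)),
        zNear * std::tan(osg::DegreesToRadians(a.bottomDeg)),
        zNear * std::tan(osg::DegreesToRadians(a.topDeg)),
        zNear, zFar);
}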

The distortionSet itself is a container which can be saved and restored
(like a path file of the animation manipulator) once it is configured by one
of the above methods.

Finally, one thing to mention: lots of these tasks may be easy; I simply do
not know, because my time is limited due to my PhD and my daily work (both
not in this area), and it is hard to find free hours and enough energy to
work on distortionNG. I hope this changes dramatically after my PhD is
finished. Once distortionNG is functional I'll try to submit it to core OSG.


Thank you!

Cheers,
Torben

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=49382#49382





_______________________________________________
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
