Re: [osg-users] Using OpenXR in OSG

2021-07-17 Thread James Hogan
Hi Robert,

On Sun, 11 Jul 2021 at 09:31, Robert Osfield  wrote:
> On Sun, 11 Jul 2021 at 07:43, James Hogan  wrote:
>> I'm not sure I follow this. Doesn't OSG's own stereo method use slave 
>> cameras too, or does it somehow avoid multiple cull traversals?
>
> The built-in stereo is one of the early parts of the OSG, so 20+ years old, and 
> can be found in osgUtil::SceneView.  SceneView internally manages two cull and draw 
> traversals, but isn't thread aware itself, so just calls them in series.

Thanks, the SceneView code is what I was looking for, and it all kind
of makes sense in my head now I think. I've added a mode to hook into
SceneView stereo matrices callbacks and got something rough going,
which automatically looks for existing slave cameras using
FRAME_BUFFER, and even works(ish) with flightgear :-).
-- 
James Hogan



Re: [osg-users] Using OpenXR in OSG

2021-07-11 Thread Robert Osfield
Hi James,

On Sun, 11 Jul 2021 at 07:43, James Hogan  wrote:

> I'm not sure I follow this. Doesn't OSG's own stereo method use slave
> cameras too, or does it somehow avoid multiple cull traversals?
>

The built-in stereo is one of the early parts of the OSG, so 20+ years old,
and can be found in osgUtil::SceneView.  SceneView internally manages two cull
and draw traversals, but isn't thread aware itself, so just calls them in
series.

Most modern OSG applications will use osgViewer, which was introduced in
OSG-2.x. This has the capability of doing stereo at the viewer level, but
it's up to the application to configure the master/slave cameras to create
the stereo.  The osgViewer still uses osgUtil::SceneView under the hood so
inherits its stereo capabilities.  Personally I'd prefer to just implement
high level stereo using master/slave Cameras in osgViewer and had a plan to
steadily replace SceneView usage, but never got there before starting the
VSG project.
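
To make that concrete, the application-level setup is along these lines (only
a sketch - here the two slaves just render side-by-side into the window,
whereas a VR integration would attach render-to-texture targets and use the
per-eye offsets supplied by the runtime):

#include <osgViewer/Viewer>

void addStereoSlaves(osgViewer::Viewer& viewer, osg::GraphicsContext* gc)
{
    const double halfSeparation = 0.032; // half an assumed interocular distance
    for (int eye = 0; eye < 2; ++eye)
    {
        osg::ref_ptr<osg::Camera> camera = new osg::Camera;
        camera->setGraphicsContext(gc);
        camera->setViewport(new osg::Viewport(eye * 960, 0, 960, 1080));

        double offset = (eye == 0) ? -halfSeparation : halfSeparation;
        viewer.addSlave(camera.get(),
                        osg::Matrixd(),                              // projection offset
                        osg::Matrixd::translate(offset, 0.0, 0.0));  // per-eye view offset
    }
}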

The OVR_multiview functionality in the MultiView branch uses osgViewer
level setup of stereo, but at the application level it's one cull and
one draw traversal; the stereo happens entirely on the GPU.

Cheers,
Robert.



Re: [osg-users] Using OpenXR in OSG

2021-07-11 Thread James Hogan



Hi Mads

On 15 June 2021 18:42:29 BST, Mads Sandvei  wrote:
> > OSG already has a concept of stereo (which currently this code doesn't 
> > interact with)
> OSG's multithreaded rendering works better with its own stereo method than 
> the slave camera method, so i would recommend integrating with this instead.
> For example, if a user uses DrawThreadPerContext, the main thread can 
> continue to the update phase of the next frame immediately when the last of 
> the slave cameras has begun its draw traversal.
> With two cameras you get two separate traversals and the main thread may be 
> held up until the first camera is done with its draw, costing performance.

I'm not sure I follow this. Doesn't OSG's own stereo method use slave cameras 
too, or does it somehow avoid multiple cull traversals?

Cheers
James



Re: [osg-users] Using OpenXR in OSG

2021-07-09 Thread James Hogan



On 22 June 2021 23:10:32 BST, James Hogan  wrote:
> > > Performance is currently terrible. CPU usage and frame times don't 
> > > seem high, so it's blocking excessively somewhere

Fortunately the main performance issue turned out to be my misinterpretation of 
XrSwapchainSubImage::imageArrayIndex as referring to the index of the swapchain 
images. With that fixed the xrEndFrame validation error is gone & performance 
is at least in the right ballpark :)
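
For the record, the corrected usage is roughly as below - imageArrayIndex
selects a texture array layer within the image, not which acquired image to
use (the runtime tracks that itself via xrAcquireSwapchainImage). Names like
swapchains[eye], views[eye], width and height stand in for whatever the code
actually holds:

XrCompositionLayerProjectionView projView{XR_TYPE_COMPOSITION_LAYER_PROJECTION_VIEW};
projView.pose = views[eye].pose;                      // from xrLocateViews()
projView.fov  = views[eye].fov;
projView.subImage.swapchain       = swapchains[eye];  // one swapchain per view here
projView.subImage.imageRect       = {{0, 0}, {(int32_t)width, (int32_t)height}};
projView.subImage.imageArrayIndex = 0;                // array layer, NOT the acquired image index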

Cheers
James



Re: [osg-users] Using OpenXR in OSG

2021-07-04 Thread James Hogan
On Tue, 22 Jun 2021 at 22:15, James Hogan  wrote:
>
> On Tue, 15 Jun 2021 at 17:30, Robert Osfield  wrote:
> > My recommendation would be to move your code into its own osgXR library, 
> > but stick with the osgViewer::ViewConfig approach, as it should make 
> > it easier for developers to switch between desktop and VR configurations - 
> > a strength of your current implementation.
> >
> > Creating a separate osgXR library will allow developers to use it against a 
> > wide range of OSG versions, so won't need to do any updates, just link to 
> > osgXR, set up the viewer configuration and away they go.  This also 
> > decouples the XR functionality from needing to be integrated within mainline 
> > OSG and being released as part of an official release.  I'm spending most 
> > of my time on the VSG project these days so have put the OSG primarily in 
> > maintenance mode, so new stable releases are off the table till I get the 
> > VSG to 1.0 (hopefully later this year).
>
> Okay, that makes sense. I'll work in that direction and see how it
> goes. Thanks for the feedback!

FYI, I've separated it out into a separate library, which I'll work on here:
https://github.com/amalon/osgXR

The _visualInfo->visualid and _fbConfig of GraphicsWindowX11 used for
X11 graphics bindings aren't externally accessible, but don't seem to
be needed in practice.
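
For reference, the binding ends up filled in something like this (display,
systemId, instance and session stand for what osgXR already has to hand;
runtimes tested so far seem happy with the zeroed fields):

XrGraphicsBindingOpenGLXlibKHR binding{XR_TYPE_GRAPHICS_BINDING_OPENGL_XLIB_KHR};
binding.xDisplay    = display;                  // from the GraphicsWindowX11
binding.visualid    = 0;                        // not externally accessible
binding.glxFBConfig = nullptr;                  // not externally accessible
binding.glxDrawable = glXGetCurrentDrawable();
binding.glxContext  = glXGetCurrentContext();

XrSessionCreateInfo sessionInfo{XR_TYPE_SESSION_CREATE_INFO};
sessionInfo.next     = &binding;
sessionInfo.systemId = systemId;
XrResult res = xrCreateSession(instance, &sessionInfo, &session);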

It also requires explicit integration into an application (since I
don't seem to be able to hook into Viewer creation), which in practice
is going to be needed anyway for proper VR support, so I'll go with
that.

Cheers
-- 
James Hogan



Re: [osg-users] Using OpenXR in OSG

2021-06-30 Thread Mads Sandvei
> Won't that result in fixed velocity objects not moving smoothly though, 
since the objects won't necessarily be in the positions they should be at the 
time of display? (I'm not getting high enough frame rates yet for this to 
be apparent, so nvidia on linux not supporting async reprojection is the 
much larger cause of nausea! *sigh*). 

> I suppose in practice if you do the common sampling of time delta each 
frame it'd work out pretty steady, it's only with flightgear's "perform so 
many 120Hz timesteps until we've caught up to the current time" that a 90Hz 
HMD refresh would result in jittery motion. I'll worry about it when I 
observe it! 
I read your note about flightgear too quickly and did not understand 
the problem properly, my apologies! My understanding is that the 120Hz is a 
default value and can be changed to a multiple of the display refresh rate, 
which might be a better solution if jitter does become an issue, but i'm 
not very familiar with FG so i shouldn't comment too much. Either way, make 
sure you know it's a real issue before optimizing for it!

> Does that work out fairly straightforward in the end though? I suppose it 
depends on nvidia, which perhaps is why the person who did the 
Blender work talked about doing a final DirectX frame copy, which sounds 
more heavyweight than sharing swapchains between DX and GL. 
I've never had any big issues with  WGL_NV_DX_interop2. I am forced to do a 
gpu-gpu copy, which is a fairly negligible cost, as the swapchains returned 
by WMR have attributes that prevent directly sharing these with OpenGL. 
Instead i share a second set of DirectX textures and then copy those back 
to the swapchains. 
If the blender guy means doing a gpu-cpu-gpu copy then that is certainly a 
lot more heavyweight.
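
For illustration, the GL side of that interop looks roughly like this (a
sketch only - the wglDX* entry points come from WGL_NV_DX_interop2 and are
loaded via wglGetProcAddress; d3dDevice, sharedTex and glTex are placeholders
and error handling is omitted):

// One-time setup: expose the shared D3D texture as GL texture 'glTex'
// (a name previously created with glGenTextures).
HANDLE interopDevice = wglDXOpenDeviceNV(d3dDevice);
HANDLE interopTex    = wglDXRegisterObjectNV(interopDevice, sharedTex, glTex,
                                             GL_TEXTURE_2D, WGL_ACCESS_WRITE_DISCARD_NV);

// Per frame: GL may only touch the object while it is locked.
wglDXLockObjectsNV(interopDevice, 1, &interopTex);
//   ... render the eye image into glTex ...
wglDXUnlockObjectsNV(interopDevice, 1, &interopTex);

// Afterwards, on the D3D side, CopyResource(swapchainTexture, sharedTex) is
// the GPU-GPU copy mentioned above, done before releasing the swapchain image.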

Mads.
On Wednesday, June 23, 2021 at 12:10:45 AM UTC+2 James Hogan wrote:

> Hi,
>
> On Tue, 15 Jun 2021 at 18:42, Mads Sandvei  wrote:
> > I have some experience integrating OpenXR and OSG from my work on 
> OpenMW-VR.
> > I'll share some of what i've learned
>
> Ooh, thanks, I'll have a peek at how you've gone about it.
>
> > > OSG already has a concept of stereo (which currently this code doesn't 
> interact with)
> > OSG's multithreaded rendering works better with its own stereo method 
> than the slave camera method, so i would recommend integrating with this 
> instead.
> > For example, if a user uses DrawThreadPerContext, the main thread can 
> continue to the update phase of the next frame immediately when the last of 
> the slave cameras has begun its draw traversal.
> > With two cameras you get two separate traversals and the main thread may 
> be held up until the first camera is done with its draw, costing 
> performance.
>
> Ah okay, that's very useful to know. I can see that resulting in
> preferential treatment for the stereo view configuration (but that is
> the case that matters most to me anyway...).
>
> > In my work this meant using a single doublewide framebuffer instead of 
> one framebuffer per eye. This is not a problem for OpenXR as you can create 
> a doublewide swapchain and use the subimage structure
> > to control the regions rendered to each eye when composing layers. I 
> haven't looked too closely at whether OSG supports attaching different 
> framebuffers per eye so that might be a moot point.
>
> Makes sense.
>
> > It's worth noting that OSG is adding support for the GL_OVR_multiview2 
> extension: https://groups.google.com/g/osg-users/c/__WujmMK5KE
> > It would be worth integrating this in the future as this would easily be 
> the fastest stereo method, though I don't have any personal experience with 
> it.
>
> Thanks. Unfortunately it's still wholly in a separate branch of OSG AFAICT?
>
> > > Performance is currently terrible. CPU usage and frame times don't 
> seem high, so it's blocking excessively somewhere
> > Comparing your code to mine the only notable performance issues, that 
> are under your control, is forcing single-threaded and the choice of stereo 
> method.
> > The code that is blocking is the xrWaitFrame() method, which is by 
> design. See what i wrote below about nausea. It is okay to delay 
> xrWaitFrame until the first time you need the predictedDisplayTime, but not 
> any longer.
> >
> > Forcing single-threaded is undoubtedly the biggest issue for performance.
> > I see in your code a comment that the reason is so that no other thread 
> can use the GL context.
> > I have never touched openvr, so it's possible that openvrviewer has a good 
> reason for this concern. With OpenXR i don't think there is any good reason 
> for this.
>
> Agreed, it's mostly a hack to avoid having to understand how OSG uses 
> multithreading straight away.
>
> > 
> https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_KHR_opengl_enable
> > The OpenXR spec explicitly demands that the runtime does not use the 
> context you give it except for a specific subset of functions you would 
> only call from the rendering 

Re: [osg-users] Using OpenXR in OSG

2021-06-23 Thread Robert Osfield
Hi Guys,

I'm just lurking on this topic so can't provide guidance on low level stuff
at this point, but on the high level side I can provide a bit of background
that might be helpful.

What I'd like to chip in is that the OVR_multiview functionality integrated into
the MultiView branch will be rolled into the next stable release. The
MeshShaders branch was made off the MultiView branch so can also be used.

My thought is that the MeshShader branch would be a better basis for an
OpenSceneGraph-3.8 stable release than the present master, as
master contains a big block of experimental shader composition code that is
only 60% complete.  The VulkanSceneGraph project ended up kicking off
before I completed the work on the experimental shader composition side.
This project is now my primary focus, so finding safe paths to progress
OpenSceneGraph without requiring a major chunk of a year to complete is the
route to take.

For an osgXR library, if it can work against OpenSceneGraph-3.6, and then if
the MultiView/MeshShader branches are detected, OVR_multiview could be
used.  OVR_multiview does require custom shaders but pretty well doubles
the performance so can be well worth it.  In the tests I did when working on
OVR_multiview I found that you could essentially render stereo at the same cost
as mono - simply because the bottleneck for most OSG/OpenGL applications is
the CPU side, so even doubling the vertex load on the GPU doesn't result in a
performance hit.
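
To give a flavour of what that involves on the raw GL side (just a sketch,
independent of whatever the MultiView branch wraps for you -
glFramebufferTextureMultiviewOVR comes from GL_OVR_multiview and is loaded
like any other extension entry point, width/height are placeholders, and the
vertex shader needs layout(num_views = 2) plus gl_ViewID_OVR to pick the
per-eye matrices):

GLuint colourTex = 0, fbo = 0;
glGenTextures(1, &colourTex);
glBindTexture(GL_TEXTURE_2D_ARRAY, colourTex);
glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGBA8, width, height, 2); // one layer per eye

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo);
glFramebufferTextureMultiviewOVR(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                 colourTex, 0 /*level*/, 0 /*baseViewIndex*/, 2 /*numViews*/);
// A single cull + draw then writes both layers in one pass.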

OVR_multiview is also supported in Vulkan but I haven't implemented it yet
in the VulkanSceneGraph. This is less critical though, as the CPU overhead
of the VSG and Vulkan is so much lower that the CPU is far less of a
bottleneck - the VSG without multiview will likely still be much faster
than the OSG with OVR_multiview.

Cheers,
Robert.



Re: [osg-users] Using OpenXR in OSG

2021-06-22 Thread James Hogan
On Tue, 15 Jun 2021 at 21:05, Gareth Francis  wrote:
> Not much to add specifically for openXR yet, but happy to test or debug if 
> blockers/quirks pop up (windows or linux, htc vive, nothing fancy)

Thanks Gareth!

Cheers
-- 
James Hogan



Re: [osg-users] Using OpenXR in OSG

2021-06-22 Thread James Hogan
Hi,

On Tue, 15 Jun 2021 at 18:42, Mads Sandvei  wrote:
> I have some experience integrating OpenXR and OSG from my work on OpenMW-VR.
> I'll share some of what i've learned

Ooh, thanks, I'll have a peek at how you've gone about it.

>  > OSG already has a concept of stereo (which currently this code doesn't 
> interact with)
> OSG's multithreaded rendering works better with its own stereo method than 
> the slave camera method, so i would recommend integrating with this instead.
> For example, if a user uses DrawThreadPerContext, the main thread can 
> continue to the update phase of the next frame immediately when the last of 
> the slave cameras has begun its draw traversal.
> With two cameras you get two separate traversals and the main thread may be 
> held up until the first camera is done with its draw, costing performance.

Ah okay, that's very useful to know. I can see that resulting in
preferential treatment for the stereo view configuration (but that is
the case that matters most to me anyway...).

> In my work this meant using a single doublewide framebuffer instead of one 
> framebuffer per eye. This is not a problem for OpenXR as you can create a 
> doublewide swapchain and use the subimage structure
> to control the regions rendered to each eye when composing layers. I haven't 
> looked too closely at whether OSG supports attaching different framebuffers 
> per eye so that might be a moot point.

Makes sense.

> It's worth noting that OSG is adding support for the GL_OVR_multiview2 
> extension: https://groups.google.com/g/osg-users/c/__WujmMK5KE
> It would be worth integrating this in the future as this would easily be the 
> fastest stereo method, though I don't have any personal experience with it.

Thanks. Unfortunately it's still wholly in a separate branch of OSG AFAICT?

>  > Performance is currently terrible. CPU usage and frame times don't seem 
> high, so it's blocking excessively somewhere
> Comparing your code to mine the only notable performance issues, that are 
> under your control, is forcing single-threaded and the choice of stereo 
> method.
> The code that is blocking is the xrWaitFrame() method, which is by design. 
> See what i wrote below about nausea. It is okay to delay xrWaitFrame until 
> the first time you need the predictedDisplayTime, but not any longer.
>
> Forcing single-threaded is undoubtedly the biggest issue for performance.
> I see in your code a comment that the reason is so that no other thread can 
> use the GL context.
> I have never touched openvr, so it's possible that openvrviewer has a good 
> reason for this concern. With OpenXR i don't think there is any good reason 
> for this.

Agreed, it's mostly a hack to avoid having to understand how OSG uses
multithreading straight away.

> https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_KHR_opengl_enable
> The OpenXR spec explicitly demands that the runtime does not use the context 
> you give it except for a specific subset of functions you would only call 
> from the rendering thread, which will have the context every time.

That's probably where my multithreaded OSG was going wrong :-)

> Inspecting your code, the flow of openxr calls is very similar to my own and 
> i have no issues running the threading mode DrawThreadPerContext. But i 
> cannot speak for the other threading modes.
>
> > due to SteamVR changing GL context somewhere (a known bug, worked around in 
> > the swapchain abstraction
> My understanding is that the openxr spec doesn't actually forbid this 
> behaviour. It only limits when the runtime is allowed to use the context 
> you gave it, not whether it binds/unbinds that or other contexts.
> This doesn't sound like behaviour anyone would want, though. Maybe an 
> oversight in the openxr standard?

Certainly annoying behaviour, yes. It should be specified whether that
is permitted either way.
ftr: https://github.com/ValveSoftware/SteamVR-for-Linux/issues/421

> The runtime cost of verifying the OpenGL context after each of the relevant 
> functions is low since you're only doing it a handful of times per frame,
> so it might be a good idea to just wrap all of the mentioned methods in code 
> that checks and restores opengl context.
> Of course, the best would be if all vendors adopted reasonable behaviour.

Yes, that sounds like the best we can do right now until it becomes
clearer whether the behaviour will be fixed.

>  > Advancement is ideally driven by the expected display times of individual 
> frames, i.e. the next frame should show the scene at exactly the moment when 
> it is expected to be displayed to the user to avoid jitter and nausea. This 
> may well be more of an app level concern (certainly is for flightgear which 
> AFAICT currently uses fixed 120Hz simulation steps), but a general 
> VR-specific viewer mainloop is probably needed in any case.
> This is the purpose of the xr[Wait,Begin,End]Frame loop, and why you're 
> passing the 

Re: [osg-users] Using OpenXR in OSG

2021-06-22 Thread James Hogan
On Tue, 15 Jun 2021 at 17:30, Robert Osfield  wrote:
> My recommendation would be to move your code into its own osgXR library, but 
> stick with the osgViewer::ViewConfig approach, as it should make it 
> easier for developers to switch between desktop and VR configurations - a 
> strength of your current implementation.
>
> Creating a separate osgXR library will allow developers to use it against a 
> wide range of OSG versions, so won't need to do any updates, just link to 
> osgXR, set up the viewer configuration and away they go.  This also decouples 
> the XR functionality from needing to be integrated within mainline OSG and 
> being released as part of an official release.  I'm spending most of my time 
> on the VSG project these days so have put the OSG primarily in maintenance 
> mode, so new stable releases are off the table till I get the VSG to 1.0 
> (hopefully later this year).

Okay, that makes sense. I'll work in that direction and see how it
goes. Thanks for the feedback!

> Within the VSG community we've been discussing OpenXR integration as well.  
> Again I see this as the type of functionality that a dedicated vsgXR library would 
> provide, rather than being integrated into the core VSG.  While I haven't 
> personally done any work in this direction I can certainly see that OSG and 
> VSG XR integration could well follow similar approaches even if at the code 
> level they are entirely separate.  Potentially both efforts could draw 
> experience and knowledge from each other.

Yes, I'd be interested in following any equivalent VSG XR library.
I'll keep an eye on the vsg mailing list.

Cheers
-- 
James Hogan



Re: [osg-users] Using OpenXR in OSG

2021-06-15 Thread Gareth Francis
Hi all,

I'm the one looking into vsg vr (at least when I get the time, slow
progress overall). See https://github.com/geefr/vsg-vr-prototype for
progress so far.

Only using OpenVR for the moment, but this looks very relevant - some
interesting concepts to consider if my stuff ends up with XR support as
well. Hopefully being Vulkan can avoid some of the platform-specific issues
(hopefully)...

Not much to add specifically for openXR yet, but happy to test or debug if
blockers/quirks pop up (windows or linux, htc vive, nothing fancy)


On Tue, 15 Jun 2021 at 18:42, Mads Sandvei  wrote:

> Hi
>
> I have some experience integrating OpenXR and OSG from my work on
> OpenMW-VR.
> I'll share some of what i've learned
>
>  > OSG already has a concept of stereo (which currently this code doesn't
> interact with)
> OSG's multithreaded rendering works better with its own stereo method than
> the slave camera method, so i would recommend integrating with this instead.
> For example, if a user uses DrawThreadPerContext, the main thread can
> continue to the update phase of the next frame immediately when the last of
> the slave cameras has begun its draw traversal.
> With two cameras you get two separate traversals and the main thread may
> be held up until the first camera is done with its draw, costing
> performance.
>
> In my work this meant using a single doublewide framebuffer instead of one
> framebuffer per eye. This is not a problem for OpenXR as you can create a
> doublewide swapchain and use the subimage structure
> to control the regions rendered to each eye when composing layers. I
> haven't looked too closely at whether OSG supports attaching different
> framebuffers per eye so that might be a moot point.
>
> It's worth noting that OSG is adding support for the GL_OVR_multiview2
> extension: https://groups.google.com/g/osg-users/c/__WujmMK5KE
> It would be worth integrating this in the future as this would easily be
> the fastest stereo method, though I don't have any personal experience with
> it.
>
>  > Performance is currently terrible. CPU usage and frame times don't seem
> high, so it's blocking excessively somewhere
> Comparing your code to mine the only notable performance issues, that are
> under your control, is forcing single-threaded and the choice of stereo
> method.
> The code that is blocking is the xrWaitFrame() method, which is by design.
> See what i wrote below about nausea. It is okay to delay xrWaitFrame until
> the first time you need the predictedDisplayTime, but not any longer.
>
> Forcing single-threaded is undoubtedly the biggest issue for performance.
> I see in your code a comment that the reason is so that no other thread
> can use the GL context.
> I have never touched openvr, so it's possible that openvrviewer has a good
> reason for this concern. With OpenXR i don't think there is any good reason
> for this.
>
> https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_KHR_opengl_enable
> The OpenXR spec explicitly demands that the runtime does not use the
> context you give it except for a specific subset of functions you would
> only call from the rendering thread, which will have the context every time.
> Inspecting your code, the flow of openxr calls is very similar to my own
> and i have no issues running the threading mode DrawThreadPerContext. But i
> cannot speak for the other threading modes.
>
> > due to SteamVR changing GL context somewhere (a known bug, worked around
> in the swapchain abstraction
> My understanding is that the openxr spec doesn't actually forbid this
> behaviour. It only limits when the runtime is allowed to use the
> context you gave it, not whether it binds/unbinds that or other contexts.
> This doesn't sound like behaviour anyone would want, though. Maybe an
> oversight in the openxr standard?
>
> The runtime cost of verifying the OpenGL context after each of the
> relevant functions is low since you're only doing it a handful of times per
> frame,
> so it might be a good idea to just wrap all of the mentioned methods in
> code that checks and restores opengl context.
> Of course, the best would be if all vendors adopted reasonable behaviour.
>
>  > Advancement is ideally driven by the expected display times of
> individual frames, i.e. the next frame should show the scene at exactly the
> moment when it is expected to be displayed to the user to avoid jitter and
> nausea. This may well be more of an app level concern (certainly is for
> flightgear which AFAICT currently uses fixed 120Hz simulation steps), but a
> general VR-specific viewer mainloop is probably needed in any case.
> This is the purpose of the xr[Wait,Begin,End]Frame loop, and why you're
> passing the predictedDisplayTime returned by xrWaitFrame() on to
> xrEndFrame().
>
> https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#frame-synchronization
> In short: you don't have to care, OpenXR is already doing this for you.
>
> Perhaps this is the issue for 

Re: [osg-users] Using OpenXR in OSG

2021-06-15 Thread Mads Sandvei
Hi

I have some experience integrating OpenXR and OSG from my work on OpenMW-VR.
I'll share some of what i've learned

 > OSG already has a concept of stereo (which currently this code doesn't 
interact with)
OSG's multithreaded rendering works better with its own stereo method than 
the slave camera method, so i would recommend integrating with this instead.
For example, if a user uses DrawThreadPerContext, the main thread can 
continue to the update phase of the next frame immediately when the last of 
the slave cameras has begun its draw traversal.
With two cameras you get two separate traversals and the main thread may be 
held up until the first camera is done with its draw, costing performance.

In my work this meant using a single doublewide framebuffer instead of one 
framebuffer per eye. This is not a problem for OpenXR as you can create a 
doublewide swapchain and use the subimage structure
to control the regions rendered to each eye when composing layers. I 
haven't looked too closely at whether OSG supports attaching different 
framebuffers per eye so that might be a moot point.

It's worth noting that OSG is adding support for the GL_OVR_multiview2 
extension: https://groups.google.com/g/osg-users/c/__WujmMK5KE
It would be worth integrating this in the future as this would easily be 
the fastest stereo method, though I don't have any personal experience with 
it.

 > Performance is currently terrible. CPU usage and frame times don't seem 
high, so it's blocking excessively somewhere
Comparing your code to mine the only notable performance issues, that are 
under your control, is forcing single-threaded and the choice of stereo 
method.
The code that is blocking is the xrWaitFrame() method, which is by design. 
See what i wrote below about nausea. It is okay to delay xrWaitFrame until 
the first time you need the predictedDisplayTime, but not any longer.

Forcing single-threaded is undoubtedly the biggest issue for performance.
I see in your code a comment that the reason is so that no other thread can 
use the GL context.
I have never touched openvr, so it's possible that openvrviewer has a good 
reason for this concern. With OpenXR i don't think there is any good reason 
for this.
https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_KHR_opengl_enable
The OpenXR spec explicitly demands that the runtime does not use the 
context you give it except for a specific subset of functions you would 
only call from the rendering thread, which will have the context every time.
Inspecting your code, the flow of openxr calls is very similar to my own 
and i have no issues running the threading mode DrawThreadPerContext. But i 
cannot speak for the other threading modes.

> due to SteamVR changing GL context somewhere (a known bug, worked around 
in the swapchain abstraction
My understanding is that the openxr spec doesn't actually forbid this 
behaviour. It only limits when the runtime is allowed to use the 
context you gave it, not whether it binds/unbinds that or other contexts.
This doesn't sound like behaviour anyone would want, though. Maybe an 
oversight in the openxr standard?

The runtime cost of verifying the OpenGL context after each of the relevant 
functions is low since you're only doing it a handful of times per frame,
so it might be a good idea to just wrap all of the mentioned methods in 
code that checks and restores opengl context. 
Of course, the best would be if all vendors adopted reasonable behaviour.
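
Something along these lines, say around xrEndFrame() on the GLX side (session
and frameEndInfo are whatever your integration already holds; the WGL
equivalents work the same way):

GLXContext  prevContext  = glXGetCurrentContext();
GLXDrawable prevDrawable = glXGetCurrentDrawable();
Display*    prevDisplay  = glXGetCurrentDisplay();

XrResult result = xrEndFrame(session, &frameEndInfo);

// Restore whatever the runtime may have rebound behind our back.
if (glXGetCurrentContext() != prevContext)
    glXMakeCurrent(prevDisplay, prevDrawable, prevContext);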

 > Advancement is ideally driven by the expected display times of 
individual frames, i.e. the next frame should show the scene at exactly the 
moment when it is expected to be displayed to the user to avoid jitter and 
nausea. This may well be more of an app level concern (certainly is for 
flightgear which AFAICT currently uses fixed 120Hz simulation steps), but a 
general VR-specific viewer mainloop is probably needed in any case.
This is the purpose of the xr[Wait,Begin,End]Frame loop, and why you're 
passing the predictedDisplayTime returned by xrWaitFrame() on to 
xrEndFrame().
https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#frame-synchronization
In short: you don't have to care, OpenXR is already doing this for you.
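
For reference, the per-frame skeleton is just this (error handling omitted;
session and the composition layer pointers are assumed to exist already):

XrFrameState frameState{XR_TYPE_FRAME_STATE};
xrWaitFrame(session, nullptr, &frameState);   // throttles us to the compositor's pace
xrBeginFrame(session, nullptr);

// ... locate views, cull, draw into the acquired swapchain images ...

XrFrameEndInfo endInfo{XR_TYPE_FRAME_END_INFO};
endInfo.displayTime          = frameState.predictedDisplayTime;
endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
endInfo.layerCount           = 1;
endInfo.layers               = layers;        // e.g. a single projection layer
xrEndFrame(session, &endInfo);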

Perhaps this is the issue for the openvrviewer, that openvr doesn't have 
this and so isn't automatically synchronized?
My interpretation of xrBeginFrame is that it exists precisely so that the 
next frame never begins rendering operations before the runtime is done 
compositing the latest xrEndFrame.

The only nausea element you have to consider is when to locate an XrSpace.
When locating an XrSpace, what you get is a predicted pose for the time you 
give it (usually the predictedDisplayTime you got from xrWaitFrame()). The 
closer you get to the predicted time, the better the prediction will be. So 
it is encouraged to predict as close to draw as possible. 
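
i.e. as late as you can get away with, something like this (stageSpace being
whatever reference XrSpace you render in):

XrViewLocateInfo locateInfo{XR_TYPE_VIEW_LOCATE_INFO};
locateInfo.viewConfigurationType = XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO;
locateInfo.displayTime           = frameState.predictedDisplayTime; // from xrWaitFrame()
locateInfo.space                 = stageSpace;

XrViewState viewState{XR_TYPE_VIEW_STATE};
XrView views[2] = {{XR_TYPE_VIEW}, {XR_TYPE_VIEW}};
uint32_t viewCount = 0;
xrLocateViews(session, &locateInfo, &viewState, 2, &viewCount, views);
// views[i].pose and views[i].fov then drive the per-eye view/projection matrices.
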
By using the update slave callback, i believe you are accomplishing this 

Re: [osg-users] Using OpenXR in OSG

2021-06-15 Thread Robert Osfield
Hi James,

I don't have a working VR headset to test against right now so can't pitch in 
with any testing or insights into OpenXR.  I did have a quick look over your 
changes and can share a few thoughts about general direction.

My recommendation would be to move your code into its own osgXR library, 
but stick with the osgViewer::ViewConfig approach, as it should make 
it easier for developers to switch between desktop and VR configurations - 
a strength of your current implementation.

Creating a separate osgXR library will allow developers to use it against a 
wide range of OSG versions, so won't need to do any updates, just link to 
osgXR, set up the viewer configuration and away they go.  This also 
decouples the XR functionality from needing to be integrated within mainline 
OSG and being released as part of an official release.  I'm spending most 
of my time on the VSG project these days so have put the OSG primarily in 
maintenance mode, so new stable releases are off the table till I get the 
VSG to 1.0 (hopefully later this year).

Within the VSG community we've been discussing OpenXR integration as well.  
Again I see this as the type of functionality that a dedicated vsgXR library 
would provide, rather than being integrated into the core VSG.  While I 
haven't personally done any work in this direction I can certainly see that 
OSG and VSG XR integration could well follow similar approaches even if at the 
code level they are entirely separate.  Potentially both efforts could draw 
experience and knowledge from each other.

Cheers,
Robert.



Re: [osg-users] Using OpenXR in OSG

2021-06-11 Thread James Hogan
Hi

On Monday, 24 June 2019 at 23:07:53 UTC+1 davidg...@gmail.com wrote:

> Greetings!
>
> I guess that I'm going to gripe on this subject like I did a year ago!
> I know that OpenXR is at least in open beta and I was wondering what 
> progress anyone has made incorporating it in OSG. 
>
> While I was at GDC I did see Khronos make some progress in this area and I 
> even got to see someone do a demo of a VR display using HTC Vive. I 
> challenged the group that worked on that and never heard from them again.
>
> I think one of the holdbacks was that the interactive controls were not set yet, 
> but from my perspective, they could have worked at the visual. 
>
> I know that if I had the time and resources that I would hack this out, 
> but one of the sad drawbacks of having a job is not having the time. It 
> must be that most people still see this technology as a flash in the pan, 
> but I think it’s taking on traction.
>

I had a play with this over the last week or so (in the hopes of eventually 
getting Flightgear working in VR with OpenXR since it seems to be the 
future), and have managed to get something *extremely* minimal and half 
broken going, enough to run some of the osg demos on Linux with SteamVR's 
OpenXR runtime with an HTC vive (but not flightgear yet). I've pushed a WIP 
version (see below), in the spirit of releasing early and often, in case 
anybody here is interested in providing general feedback or helping. I'll 
be able to get back to it in a few weeks when I'll try to get more of it 
working & cleaned up:
https://github.com/amalon/OpenSceneGraph (openxr-devel branch)
https://github.com/amalon/OpenSceneGraph/commit/71f80495be5cf7c4d286a52b345fa994a09e3bb7

This is my first dive into OSG (and OpenXR), so i'm definitely open to 
suggestions for improvements or the best way to integrate it (or whether it 
should even be integrated into OSG rather than as an external plugin or 
viewer). Currently I think it should get built into OSG since OSG already 
has a concept of stereo (which currently this code doesn't interact with), 
and this approach allows some rudimentary VR support even without the app 
explicitly supporting it (though clearly app support is preferable 
especially for menus and interaction), but I am not very familiar with OSG.

Braindump below for anyone interested in the details.

Cheers
James

It is added as an osgViewer config OpenXRDisplay, which can be applied 
automatically to the View by osgViewer::Viewer using environment variables 
OSG_VR=1 and OSG_VR_UNITS_PER_METER=whatever. Some C++ abstractions of 
OpenXR are in src/osgViewer/OpenXR, which are used by 
src/osgViewer/config/OpenXRDisplay.cpp to set up the OpenXR instance, 
session, swapchains, and slave cameras for each OpenXR view (e.g. each eye 
for most HMDs, but it could be one display for handheld, or more for other 
setups), and various callbacks for updating them and draw setup / swapping. 
OpenXR provides multiple OpenGL textures to write to for each swapchain, 
and we create a swapchain for each view, and an OpenGL framebuffer object 
for each image texture in each swapchain (I assume it's faster not to 
rebind the fbo attachments). Callbacks switch between the framebuffer 
objects (like in osgopenvrviewer), and OpenXR frames are started 
automatically before first render (or on first slave camera update), and 
ended in the swap callback. The OpenXR session is created using OpenGL 
graphics binding info provided via GraphicsWindow::getXrGraphicsBinding() 
which is only implemented for X11.
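
For anyone curious, the per-swapchain setup boils down to roughly this (depth
attachment, error checking and includes omitted):

uint32_t imageCount = 0;
xrEnumerateSwapchainImages(swapchain, 0, &imageCount, nullptr);
std::vector<XrSwapchainImageOpenGLKHR> images(imageCount, {XR_TYPE_SWAPCHAIN_IMAGE_OPENGL_KHR});
xrEnumerateSwapchainImages(swapchain, imageCount, &imageCount,
                           reinterpret_cast<XrSwapchainImageBaseHeader*>(images.data()));

std::vector<GLuint> fbos(imageCount);
glGenFramebuffers(imageCount, fbos.data());
for (uint32_t i = 0; i < imageCount; ++i)
{
    glBindFramebuffer(GL_FRAMEBUFFER, fbos[i]);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, images[i].image, 0);
}
// xrAcquireSwapchainImage() later returns an index into this set, so the draw
// callback just binds fbos[index] instead of re-attaching textures each frame.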

Current issues:
* I haven't mirrored the image to the window yet (there's probably a nice 
OSG way to blit the view textures to the main camera?). It could perhaps 
integrate with the DisplaySettings stuff somehow to decide what should be 
mirrored.
* the application name (used for creating an XR instance which is shown on 
HMD when app is starting) isn't discovered automatically and is still set 
to "osgplanets". This can probably be discovered automatically in an OSG 
way from argv[0] with the arguments stuff... haven't quite figured how yet.
* Performance is currently terrible. CPU usage and frame times don't seem 
high, so it's blocking excessively somewhere. I briefly tried modding the 
ViewerBase loop to avoid sleeping there, but haven't got to the bottom of 
it yet. OpenXR does complain about validation errors on EndFrame, but it's 
unclear why & whether that's related, and it doesn't stop the images being 
displayed in the HMD.
* synchronisation isn't handled between threads, as I don't yet have a good 
grasp of how OSG uses threads to figure out exactly what's needed. Currently 
threaded rendering is disabled (like in osgopenvrviewer).
* flightgear: currently it appears to fail due to SteamVR changing GL 
context somewhere (a known bug, worked around in the swapchain 
abstraction), resulting in OSG framebuffer objects being unable to be 
created. I haven't had much time yet to figure it out. In any case it'll