Hi

On Monday, 24 June 2019 at 23:07:53 UTC+1 davidg...@gmail.com wrote:

> Greetings!
>
> I guess that I'm going to gripe on this subject like I did a year ago!
> I know that OpenXR is at least in Open Beta and I was wondering what 
> progress anyone has made incorporating it in OSG. 
>
> While I was in GDC I did see Khronos make some progress in this area and I 
> even got to see someone do a demo of a VR display using HTC Vive. I 
> challenged the group that worked on that and never heard from them again.
>
> I think one of the holdbacks was that the interactive controls were not 
> set yet, but from my perspective, they could have worked on the visuals. 
>
> I know that if I had the time and resources that I would hack this out, 
> but one of the sad drawbacks of having a job is not having the time. It 
> must be that most people still see this technology as a flash in the pan, 
> but I think it’s taking on traction.
>

I had a play with this over the last week or so (in the hopes of eventually 
getting FlightGear working in VR with OpenXR, since it seems to be the 
future), and have managed to get something *extremely* minimal and 
half-broken going: enough to run some of the OSG demos on Linux against 
SteamVR's OpenXR runtime with an HTC Vive (but not FlightGear yet). I've 
pushed a WIP version (see below), in the spirit of releasing early and 
often, in case anybody here is interested in providing general feedback or 
helping. I'll be able to get back to it in a few weeks, when I'll try to 
get more of it working & cleaned up:
https://github.com/amalon/OpenSceneGraph (openxr-devel branch)
https://github.com/amalon/OpenSceneGraph/commit/71f80495be5cf7c4d286a52b345fa994a09e3bb7

This is my first dive into OSG (and OpenXR), so I'm definitely open to 
suggestions for improvements, or on the best way to integrate it (or 
whether it should even be integrated into OSG rather than shipped as an 
external plugin or viewer). Currently I think it should be built into OSG, 
since OSG already has a concept of stereo (which this code doesn't yet 
interact with), and this approach allows some rudimentary VR support even 
without the app explicitly supporting it (though clearly app support is 
preferable, especially for menus and interaction). That said, I am not very 
familiar with OSG.

Braindump below for anyone interested in the details.

Cheers
James

It is added as an osgViewer config, OpenXRDisplay, which can be applied 
automatically to the View by osgViewer::Viewer using the environment 
variables OSG_VR=1 and OSG_VR_UNITS_PER_METER=<scale>. Some C++ 
abstractions of OpenXR live in src/osgViewer/OpenXR; these are used by 
src/osgViewer/config/OpenXRDisplay.cpp to set up the OpenXR instance, 
session, swapchains, and slave cameras for each OpenXR view (e.g. one per 
eye for most HMDs, but it could be a single display for handheld devices, 
or more views for other setups), plus various callbacks for updating them 
and for draw setup / swapping. OpenXR provides multiple OpenGL textures to 
write to for each swapchain; we create a swapchain for each view, and an 
OpenGL framebuffer object for each image texture in each swapchain (I 
assume it's faster not to rebind the FBO attachments). Callbacks switch 
between the framebuffer objects (like in osgopenvrviewer), and OpenXR 
frames are started automatically before the first render (or on the first 
slave camera update) and ended in the swap callback. The OpenXR session is 
created using OpenGL graphics binding info provided via 
GraphicsWindow::getXrGraphicsBinding(), which is currently only implemented 
for X11.

Current issues:
* I haven't mirrored the image to the window yet (there's probably a nice 
OSG way to blit the view textures to the main camera?). It could perhaps 
integrate with the DisplaySettings machinery somehow to decide what should 
be mirrored.
* The application name (used when creating an XR instance, and shown on the 
HMD while the app is starting) isn't discovered automatically and is still 
hardcoded to "osgplanets". It can probably be derived in an OSG way from 
argv[0] via the arguments handling... I haven't quite figured out how yet.
* Performance is currently terrible. CPU usage and frame times don't seem 
high, so it's blocking excessively somewhere. I briefly tried modifying the 
ViewerBase loop to avoid sleeping there, but haven't got to the bottom of 
it yet. OpenXR does complain about validation errors on EndFrame, but it's 
unclear why, or whether that's related; it doesn't stop the images being 
displayed in the HMD.
* Synchronisation isn't handled between threads, as I don't yet have a good 
enough grasp of how OSG uses threads to figure out exactly what's needed. 
Currently threaded rendering is disabled (like in osgopenvrviewer).
* FlightGear: it currently appears to fail due to SteamVR changing the GL 
context somewhere (a known bug, worked around in the swapchain 
abstraction), resulting in OSG framebuffer objects failing to be created. I 
haven't had much time to dig into it yet. In any case it'll need a fair bit 
more custom setup eventually.
* There are a couple of places where I expect it may not build without 
OpenXR. I fully intend to fix that.

Other thoughts about integration into OSG:
* The custom projection matrix calculation based on 
fov{left,right,top,bottom} could be moved into osg::Matrix to join the 
other projection matrix functions there.
* Maybe controller inputs could get exposed to OSG applications via 
osg::Device events, though they're abstracted by OpenXR into app-specific 
actions.
* I wonder if OpenXR composition layers (which, with extensions, can be 
composited as cubemaps, quads, partial cylinders like a curved TV, or 
partial spheres, as well as the usual eye projections) should be 
represented as some kind of windowing system API with GraphicsWindows etc. 
(though still tied to the underlying window manager one), to allow easier 
rendering of 2D interfaces in VR...
* Advancement is ideally driven by the expected display times of individual 
frames, i.e. the next frame should show the scene at exactly the moment 
when it is expected to be displayed to the user, to avoid jitter and 
nausea. This may well be more of an app-level concern (it certainly is for 
FlightGear, which AFAICT currently uses fixed 120 Hz simulation steps), but 
a general VR-specific viewer main loop is probably needed in any case.
* Choosing a pixel format from the list provided by the OpenXR runtime... I 
haven't looked into how OSG picks one for non-VR yet.
* Each frame, the environment blend mode can be chosen from a set of 
supported ones, and the frame may be rendered differently depending on it: 
opaque (VR), alpha-blended with camera (AR), or additive blending with 
camera/background (for some AR displays). I need to figure out where that 
should be decided, and whether to expose it in some rendering state 
somewhere.

Other misc development todo:
* use the depth info extension to provide the runtime with depth info, for 
better reprojection in the event of missed deadlines
* use the visibility mask extension to restrict rendering to the part of 
the screen actually visible to each eye
* look into multisampling properly (haven't yet)
* split the internals of OpenXRDisplay.cpp out into multiple files
