On Sun, Jan 3, 2016 at 8:47 PM, David Seikel <[email protected]> wrote:
> Cedric's gonna hate me, all these long emails.
Yes, I do! Even more now that I have spent an hour talking with our
engineer here who works on Gear VR to understand how things work... and
it is a mess!

> On Sun, 3 Jan 2016 22:19:53 -0500 Christopher Michael
> <[email protected]> wrote:
>> On 01/03/2016 08:11 PM, David Seikel wrote:
>> > On Sun, 3 Jan 2016 16:46:43 -0500 Christopher Michael
>> > <[email protected]> wrote:
>> >> On 01/03/2016 03:51 PM, Cedric BAIL wrote:
>> >>> On Sun, Jan 3, 2016 at 9:09 PM, David Seikel <[email protected]>
>> >>> wrote:
>> >>>> I've recently come out of a contract where I was adding Oculus
>> >>>> Rift support to an existing application. This is something I
>> >>>> always wanted to play with. Having had that experience, now I
>> >>>> want to add generic HMD (Head Mounted Display, like the Rift)
>> >>>> support to Evas_3D. It will be great for my major, years long,
>> >>>> virtual worlds project.
>> >>>>
>> >>>> There's a few different HMDs being released soonish. Many
>> >>>> variations on Google Cardboard have been out for a while, Samsung
>> >>>> Gear VR was released last month, Oculus Rift is likely to be
>> >>>> released in the next month or three, HTC Vive will probably be
>> >>>> released later this year, and there are many more.
>> >>>>
>> >>>> http://www.hypergridbusiness.com/faq/best-virtual-reality-headsets/
>> >>>> lists dozens of them (mostly Cardboard variations).
>> >>>>
>> >>>> I was using an Oculus Rift DK2 (Developer Kit 2) supplied by the
>> >>>> client for this contract. I have to hand that back in about a
>> >>>> week, but I'll be buying my own Oculus Rift when it gets
>> >>>> released. I'll probably get a Google Cardboard style device as
>> >>>> well.
>> >>>>
>> >>>> The contract used the Oculus SDK, but obviously that only handles
>> >>>> Oculus HMDs, and these days only on Windows. Oculus used to
>> >>>> support Linux and Mac OS X, but they dropped that to concentrate
>> >>>> on Windows for the release. Oculus claim the other OS support
>> >>>> will return.
>> >>>> There's a couple of open source, cross platform,
>> >>>> generic HMD libraries that are trying to handle a bunch of HMDs,
>> >>>> and that's what I'd rather be using for EFL. It's still early
>> >>>> days for this tech, so no standards or anything yet.
>> >>>>
>> >>>> http://hg.sitedethib.com/libvr
>> >>>>
>> >>>> http://openhmd.net/
>> >>>>
>> >>>> And a conversion of the Oculus SDK (a horrid bit of C++ coding,
>> >>>> not all of it open source) into open source C -
>> >>>>
>> >>>> https://github.com/ultranbrown/libovr_nsb
>> >>>>
>> >>>> What do people think of adding generic HMD support to Evas_3D?
>> >>>
>> >>> I am guessing that what you really need to add is support for
>> >>> stereoscopic output.
>> >
>> > That would be a part of it, yes. That was the starting point of my
>> > contract: support stereoscopic output first, then morph that into a
>> > Rift display. Stereoscopic display is an easy bit.
>> >
>> > You also need to distort the result, not just be stereoscopic. The
>> > distortion remaps the resulting image so that when it eventually
>> > hits your eyes, the distortion of the HMD lenses is compensated
>> > for. Not that tricky; shaders do it easily. Basically, the lenses
>> > needed to let ordinary people focus on a screen that is a few mere
>> > centimetres from their eyes introduce barrel distortion, so the
>> > reverse pincushion distortion has to be pre-applied, so it all
>> > comes out straight in the end.
>> >
>> > Then there's also dealing with the rotation and position sensors on
>> > HMDs. They do their magic by detecting where your head is and
>> > which direction it is pointing, so the software can adjust the view
>> > into the 3D world as you move your head around. This is the very
>> > crucial part, due to VR sickness.
>> >
>> > It's crucial that this head tracking / update display thing happens
>> > with the absolute bare minimum of latency.
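The pre-distortion David describes is just a radial remap of texture
coordinates. In a real HMD it runs per-fragment in a shader, but the math is
the same in C. A minimal sketch; the radial polynomial model is the standard
one, but the k1/k2 coefficients here are made-up illustrative values, not any
particular headset's calibration:

```c
#include <assert.h>
#include <math.h>

/* Pre-apply pincushion distortion to a lens-centred, normalised
 * coordinate (x, y) so that the lens's barrel distortion cancels it
 * out. Radial model: scale = 1 + k1*r^2 + k2*r^4.  k1 and k2 are
 * hypothetical coefficients; real values come from the HMD's
 * per-lens calibration data. */
static void
pre_distort(double x, double y, double k1, double k2,
            double *out_x, double *out_y)
{
   double r2 = x * x + y * y;
   double scale = 1.0 + (k1 * r2) + (k2 * r2 * r2);

   *out_x = x * scale;
   *out_y = y * scale;
}
```

The centre of the lens is left untouched and points move progressively
further outward toward the edges, which is exactly the pincushion shape the
barrel-distorting lens then straightens back out.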
>> > Basically, the more latency
>> > in a HMD, the more likely the wearer is to throw up, and when
>> > wearing a HMD, you can't see the real world, so you are likely to
>> > throw up all over your expensive computer, missing the conveniently
>> > placed bucket entirely.
>> >
>> > So called VR sickness is actually a very crucial issue. It's
>> > similar to motion sickness, only reversed. The world is moving for
>> > your eyes, but your inner ear disagrees, which triggers a
>> > "something is terribly wrong" response in your body, which tends to
>> > make you throw up. Your body thinks that maybe the world is
>> > screwed up coz of whatever you ate last, so it tries to get rid of
>> > that as a first level emergency response. So you throw up. I'm no
>> > doctor; this is how it was explained to me. There's lots of
>> > ongoing research to reduce this problem. In general though, most
>> > people can "get their VR legs", as it's called, by slowly getting
>> > used to it with short initial exposures, getting longer as you feel
>> > more comfy.
>> >
>> > Motion sickness is similar, with exactly the same response. Only
>> > the world is still for your eyes, but your inner ear thinks you are
>> > moving.
>> >
>> > I don't suffer from either problem myself, but it's something you
>> > have to be aware of. The number of times I have thrown up in my
>> > five and a half decade life can be counted on one hand, with plenty
>> > of fingers left over. VR sickness is a BIIIG topic of conversation
>> > for HMD developers.
>> >
>> > Either way, the likelihood of throwing up and feeling bad is high
>> > for some people. There's heaps of discussions about how to avoid
>> > these problems. There's a medical checklist I go through when
>> > introducing new people to HMDs. Rift in particular insists on
>> > displaying a health and safety screen when you start.
>> >
>> > Yes, HMD developers can be obsessive about preventing VR sickness.
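The head tracking David describes above boils down to reading the sensor's
orientation (usually a unit quaternion) as late as possible each frame and
rotating the camera's view vectors by it before rendering the eye views. A
minimal sketch with a hand-rolled quaternion rotate, not any particular SDK's
API:

```c
#include <assert.h>
#include <math.h>

typedef struct { double w, x, y, z; } Quat;  /* unit quaternion */
typedef struct { double x, y, z; }    Vec3;  /* 3D vector       */

/* Rotate vector v by unit quaternion q, using the standard
 * expansion v' = v + w*t + (qv x t) where t = 2*(qv x v),
 * which avoids building a rotation matrix. */
static Vec3
quat_rotate(Quat q, Vec3 v)
{
   Vec3 t = {
      2.0 * ((q.y * v.z) - (q.z * v.y)),
      2.0 * ((q.z * v.x) - (q.x * v.z)),
      2.0 * ((q.x * v.y) - (q.y * v.x))
   };
   Vec3 r = {
      v.x + (q.w * t.x) + ((q.y * t.z) - (q.z * t.y)),
      v.y + (q.w * t.y) + ((q.z * t.x) - (q.x * t.z)),
      v.z + (q.w * t.z) + ((q.x * t.y) - (q.y * t.x))
   };
   return r;
}
```

The latency argument above is why the orientation read happens immediately
before building the view matrices, never earlier in the frame.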
>> > Coz if you are not, people get ill, people stop using your stuff,
>> > people sue you for making them ill, people give you a bad rep...
>> > it's not good, best to avoid it.
>> >
>> > These considerations exist for most, if not all, HMDs.
>> >
>> >>> This is definitely one of the reasons why we
>> >>> have added Evas_3D. There is still some work to be done to reach
>> >>> the point where that would work out of the box. The steps I see
>> >>> so far are:
>> >>> - Negotiate with the compositor over the Wayland protocol whether
>> >>> it can handle a 3D stereoscopic output
>> >>> - Add detection of 3D stereoscopic output to Enlightenment and
>> >>> the ecore_drm backend (if I remember correctly, Intel GPUs under
>> >>> Linux support 3D output over HDMI somewhere in libdrm)
>> >
>> > Intel GPUs are not supported well for HMDs; they tend to not be
>> > grunty enough. It's early days, VR sickness is a thing, so
>> > generally really high end graphics cards are needed. In fact, Rift
>> > has trouble with nVidia Optimus graphics chips, which are very
>> > common in laptops. Optimus puts an Intel GPU in front of an nVidia
>> > GPU. Everything has to go through the Intel chip, coz the nVidia
>> > connects to the monitors THROUGH the Intel chip. This introduces
>> > extra latency that Oculus spit the dummy on. They no longer
>> > support Optimus; in fact, they deliberately refuse to work on
>> > Optimus chips since their 0.7 SDK. 0.6 and earlier worked fine on
>> > Optimus though. I think they are just being precious. The Windows
>> > desktop supplied by my client has Optimus, and the client
>> > themselves will be using Optimus based laptops. So fuck you
>> > Oculus, they are sticking with SDK 0.6.

This doesn't surprise me too much, as the added latency will be an
issue, as you pointed out above with motion sickness.
>> > Personally, from my experience supporting Second Life / OpenSim
>> > users professionally, most people tend to use low end student /
>> > business laptops for that, coz they are cheap and plentiful. Those
>> > tend to have a hard time with SL / OS, and will have a harder time
>> > with HMDs. SL / OS is a horrible code base; I'm sure EFL based
>> > code could run much faster. One of my goals is to make sure these
>> > low powered laptops can actually get a useful display out of them.
>> > Which is sorta the exact opposite of what Oculus has done: they
>> > are being precious about making sure VR is well accepted by the
>> > world in general, so they just refuse to run on anything slow.
>> >
>> >> Correct.
>> >>
>> >> /**
>> >>  * DRM_CLIENT_CAP_STEREO_3D
>> >>  *
>> >>  * if set to 1, the DRM core will expose the stereo 3D
>> >>  * capabilities of the monitor by advertising the supported 3D
>> >>  * layouts in the flags of struct drm_mode_modeinfo.
>> >>  */
>> >> #define DRM_CLIENT_CAP_STEREO_3D 1
>> >>
>> >> So basically, get the mode info and check the "flags" of the mode
>> >> struct for supported layouts. Possible values of "flags":
>> >>
>> >> #define DRM_MODE_FLAG_3D_MASK (0x1f<<14)
>> >> #define DRM_MODE_FLAG_3D_NONE (0<<14)
>> >> #define DRM_MODE_FLAG_3D_FRAME_PACKING (1<<14)
>> >> #define DRM_MODE_FLAG_3D_FIELD_ALTERNATIVE (2<<14)
>> >> #define DRM_MODE_FLAG_3D_LINE_ALTERNATIVE (3<<14)
>> >> #define DRM_MODE_FLAG_3D_SIDE_BY_SIDE_FULL (4<<14)
>> >> #define DRM_MODE_FLAG_3D_L_DEPTH (5<<14)
>> >> #define DRM_MODE_FLAG_3D_L_DEPTH_GFX_GFX_DEPTH (6<<14)
>> >> #define DRM_MODE_FLAG_3D_TOP_AND_BOTTOM (7<<14)
>> >> #define DRM_MODE_FLAG_3D_SIDE_BY_SIDE_HALF (8<<14)
>> >>
>> >> This would/should actually be quite simple to detect in the
>> >> ecore_drm code, as we already fetch mode info for crtcs. It would
>> >> just be a matter of checking for these flags.
>> >
>> > Not quite so correct.
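Christopher's flag check could be sketched like this. The macro values are
copied from the libdrm header quoted above, and `stereo_layout_name()` is a
hypothetical helper, not an existing ecore_drm function; in real code you
would first enable the capability with drmSetClientCap(fd,
DRM_CLIENT_CAP_STEREO_3D, 1) and then test each drmModeModeInfo's flags
field:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Values copied from the libdrm header quoted above. */
#define DRM_MODE_FLAG_3D_MASK               (0x1f << 14)
#define DRM_MODE_FLAG_3D_FRAME_PACKING      (1 << 14)
#define DRM_MODE_FLAG_3D_SIDE_BY_SIDE_FULL  (4 << 14)
#define DRM_MODE_FLAG_3D_TOP_AND_BOTTOM     (7 << 14)
#define DRM_MODE_FLAG_3D_SIDE_BY_SIDE_HALF  (8 << 14)

/* Hypothetical helper: map a mode's flags field to a stereo layout
 * name, or NULL when the mode is plain 2D.  Masking first matters:
 * the low bits of flags carry unrelated sync/scan information. */
static const char *
stereo_layout_name(uint32_t flags)
{
   switch (flags & DRM_MODE_FLAG_3D_MASK)
     {
      case DRM_MODE_FLAG_3D_FRAME_PACKING:     return "frame-packing";
      case DRM_MODE_FLAG_3D_SIDE_BY_SIDE_FULL: return "side-by-side-full";
      case DRM_MODE_FLAG_3D_TOP_AND_BOTTOM:    return "top-and-bottom";
      case DRM_MODE_FLAG_3D_SIDE_BY_SIDE_HALF: return "side-by-side-half";
      default:                                 return NULL;
     }
}
```

A NULL result would mean "this mode is no use for a frame-packed HMD", which
is the cheap first-pass detection being discussed here.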
>> > Sure, this probably works fine under DRM /
>> > Wayland, but I mentioned Mac OS X and Windows as well as Linux.
>>
>> Sure. I defer to your wisdom in those areas ;) I was merely stating
>> that (as far as code goes), it would be fairly simple (wrt libdrm)
>> to add these pieces to ecore_drm.
>
> Yep, looks simple enough. Still won't be useful for Mac and Windows
> support, but at least that's Linux catered for in the "detect a HMD
> that wants to be a monitor" stakes. As I pointed out, not all HMDs
> present themselves as monitors.

Or Android. Also, it is not that useful on Linux, as there is no
support for HMDs at the moment and Oculus Rift doesn't seem interested
in that.

The situation is also more complex than I thought it would be. There
are actually two different kinds of content streamed to an HMD: 360
view (either spherically or cylindrically mapped) and stereoscopic
view. Of course, some HMDs support one or the other or both, enjoy!
But that's only the beginning of the trouble: each HMD has different
lenses, which means they need different transformations (and I have
not gotten confirmation that you could write one filter that covers
every HMD). Now, my understanding is that there is also no common API
on Windows or Mac at this stage; each HMD provides its own SDK. Of
course on Linux the situation is even worse, as there is not even an
SDK. I am not too sure of the course of action that should be taken,
as making Oculus Rift or Gear VR work on Linux is going to be a
massive undertaking. I am thinking the easiest target for development
would be Mac, as we are close to having 100% of EFL working there
already, and you would add a new backend there for each HMD (at least
at this stage).

> In the end though, I expect the existing generic HMD libraries deal
> with that for us, and if they don't, I can send patches. That's
> where HMD device detection and configuration should live.

Does that already exist? If it does, that should definitely be the
layer we work on top of.
>> > I don't think that's gonna work on these OSes. Even on Linux, non
>> > Wayland support would be needed. Frame buffer support, and the
>> > others, might be good too, but I had not considered that yet.
>>
>> Well, utilizing libdrm, it Should be Display Server independent.
>> Frame buffer (ie: kernel mode setting) is supported nicely via
>> libdrm.
>
> Cool.

Still, there is nothing close to that in X11, and Wayland is the most
likely easy path for supporting such work. And when I say easy, it is
still going to be a massive amount of work and discussion to get a new
protocol landed there. libdrm/ecore_drm will allow quick prototyping
for applications that work on the framebuffer first, if you manage to
get an HMD working on Linux. Then you can play with
Enlightenment/Wayland and get a desktop stack working.

>> Direct Rendering Manager
>> (https://en.wikipedia.org/wiki/Direct_Rendering_Manager,
>> http://dri.freedesktop.org/wiki/DRM/,
>> http://events.linuxfoundation.org/sites/events/files/slides/brezillon-drm-kms.pdf,
>> https://www.bitwiz.org.uk/s/how-dri-and-drm-work.html)
>>
>> (just a couple o links ... use your google-ness) ;)
>>
>> Think "kernel mode setting and frame buffer management" .. (ie:
>> pushing pixels to the console) ;)
>
> So basically, it's a low level graphics API that I probably do use,
> since it's at the base of X and the other things?

The base, yes, but going above it is going to be an absolutely massive
amount of work.

> Perhaps this is one reason why Oculus "temporarily" dropped Linux
> support, if DRM won't let us use nVidia binary blobs?

That, and the massive amount of work to be done in X and the whole
stack itself. I do think that Wayland support will be easier to reach.

Have fun,
-- 
Cedric BAIL

------------------------------------------------------------------------------
_______________________________________________
enlightenment-devel mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/enlightenment-devel
