On Sun, 3 Jan 2016 16:46:43 -0500 Christopher Michael <[email protected]> wrote:
> On 01/03/2016 03:51 PM, Cedric BAIL wrote:
> > On Sun, Jan 3, 2016 at 9:09 PM, David Seikel <[email protected]> wrote:
> >> I've recently come out of a contract where I was adding Oculus Rift support to an existing application. This is something I always wanted to play with. Having had that experience, now I want to add generic HMD (Head Mounted Display, like the Rift) support to Evas_3D. It will be great for my major, years-long, virtual worlds project.
> >>
> >> There are a few different HMDs being released soonish: many variations on Google Cardboard have been out for a while, Samsung Gear VR was released last month, the Oculus Rift is likely to be released in the next month or three, the HTC Vive will probably be released later this year, and there are many more.
> >>
> >> http://www.hypergridbusiness.com/faq/best-virtual-reality-headsets/ lists dozens of them (mostly Cardboard variations).
> >>
> >> I was using an Oculus Rift DK2 (Developer Kit 2) supplied by the client for this contract. I have to hand that back in about a week, but I'll be buying my own Oculus Rift when it gets released. I'll probably get a Google Cardboard style device as well.
> >>
> >> The contract used the Oculus SDK, but obviously that only handles Oculus HMDs, and these days only on Windows. Oculus used to support Linux and Mac OS X, but they dropped that to concentrate on Windows for the release. Oculus claim the other OS support will return. There are a couple of open source, cross platform, generic HMD libraries that are trying to handle a bunch of HMDs, and that's what I'd rather be using for EFL. It's still early days for this tech, so no standards or anything yet.
> >>
> >> http://hg.sitedethib.com/libvr
> >>
> >> http://openhmd.net/
> >>
> >> And a conversion of the Oculus SDK (a horrid bit of C++ coding, not all of it open source) into open source C -
> >>
> >> https://github.com/ultranbrown/libovr_nsb
> >>
> >> What do people think of adding generic HMD support to Evas_3D?
> >
> > I am guessing that what you really need to add is support for stereoscopic output.

That would be a part of it, yes. That was the starting point of my contract: support stereoscopic output first, then morph that into a Rift display. Stereoscopic display is an easy bit.

You also need to distort the result, not just make it stereoscopic. The distortion remaps the rendered image so that, when it eventually hits your eyes, the distortion of the HMD lenses is compensated for. Not that tricky; shaders do it easily. Basically, the lenses needed so that ordinary people can focus on a screen that is a mere few centimetres from their eyes introduce barrel distortion, so the reverse pincushion distortion has to be pre-applied, and it all comes out straight in the end.

Then there's also dealing with the rotation and position sensors on HMDs. They do their magic by detecting where your head is and which direction it is pointing, so the software can adjust the view into the 3D world as you move your head around. This is the really crucial part, because of VR sickness: it's essential that this head tracking / display update loop happens with the absolute bare minimum of latency. Basically, the more latency in an HMD, the more likely the wearer is to throw up, and when wearing an HMD you can't see the real world, so you are likely to throw up all over your expensive computer, missing the conveniently placed bucket entirely.
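To make the lens compensation step above a bit more concrete, here is a minimal sketch of the radial remap that the distortion shader would typically perform, written as plain C purely for illustration. The type, the function name and the k1/k2 coefficients are all made up for the example; real coefficients come from the particular headset's lens specs.

typedef struct { float x, y; } Vec2;

/* Map an output coordinate (0..1 within one eye's half of the screen) back to
 * the coordinate in the rendered eye buffer that should be sampled, applying
 * the reverse (pincushion) distortion so the lens's barrel distortion cancels
 * it out. */
static Vec2
hmd_undistort(Vec2 out_uv, Vec2 lens_centre, float k1, float k2)
{
   Vec2 d = { out_uv.x - lens_centre.x, out_uv.y - lens_centre.y };
   float r2 = d.x * d.x + d.y * d.y;            /* squared distance from lens centre */
   float scale = 1.0f + k1 * r2 + k2 * r2 * r2; /* radial distortion polynomial */
   Vec2 src = { lens_centre.x + d.x * scale, lens_centre.y + d.y * scale };
   return src; /* sample the eye buffer here; skip the pixel if it falls outside 0..1 */
}

A fragment shader does exactly this per pixel; doing it on the CPU like this is only for explanation.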
So-called VR sickness is actually a very crucial issue. It's similar to motion sickness, only reversed: the world is moving according to your eyes, but your inner ear disagrees, which triggers a "something is terribly wrong" response in your body, which tends to make you throw up. Your body thinks that maybe the world is screwed up because of whatever you ate last, so it tries to get rid of that as a first-level emergency response. So you throw up. I'm no doctor; this is how it was explained to me. There's lots of ongoing research into reducing this problem. In general though, most people can "get their VR legs", as it's called, by slowly getting used to it with short initial exposures that get longer as you feel more comfortable. Motion sickness is similar, with exactly the same response, only the world is still according to your eyes while your inner ear thinks you are moving. I don't suffer from either problem myself, but it's something you have to be aware of. The number of times I have thrown up in my five and a half decade life can be counted on one hand, with plenty of fingers left over.

VR sickness is a BIIIG topic of conversation among HMD developers. Either way, the likelihood of throwing up and feeling bad is high for some people. There are heaps of discussions about how to avoid these problems. There's a medical checklist I go through when introducing new people to HMDs. The Rift in particular insists on displaying a health and safety screen when you start. Yes, HMD developers can be obsessive about preventing VR sickness, because if you are not, people get ill, people stop using your stuff, people sue you for making them ill, people give you a bad rep... it's not good, best to avoid it. These considerations exist for most, if not all, HMDs.

> > This is definitely one of the reasons why we have added Evas_3D. There is still some work to be done to reach the point where that would work out of the box. The steps I see so far are:
> > - Negotiate with the compositor over the Wayland protocol if it can handle a 3D stereoscopic output
> > - Add detection of 3D stereoscopic output to Enlightenment and the ecore_drm backend (if I remember correctly, Intel GPUs under Linux support 3D output over HDMI somewhere in libdrm)

Intel GPUs are not well supported for HMDs; they tend not to be grunty enough. It's early days and VR sickness is a thing, so generally really high end graphics cards are needed. In fact, the Rift has trouble with nVidia Optimus graphics chips that are very common in laptops. Optimus puts an Intel GPU in front of an nVidia GPU: everything has to go through the Intel chip, because the nVidia connects to the monitors THROUGH the Intel chip. This introduces extra latency that Oculus spit the dummy over. They no longer support Optimus; in fact they deliberately refuse to work on Optimus chips since their 0.7 SDK. 0.6 and earlier worked fine on Optimus though. I think they are just being precious. The Windows desktop supplied by my client has Optimus, and the client themselves will be using Optimus based laptops. So fuck you Oculus, they are sticking with SDK 0.6.

Personally, from my experience supporting Second Life / OpenSim users professionally, most people tend to use low end student / business laptops for that, because they are cheap and plentiful. Those tend to have a hard time with SL / OS, and will have a harder time with HMDs. SL / OS is a horrible code base; I'm sure EFL based code could run much faster. One of my goals is to make sure these low powered laptops can actually get a useful display out of this.
Which is sort of the exact opposite of what Oculus has done. They are being precious about making sure VR is well accepted by the world in general, so they just refuse to run on anything slow.

> Correct.
>
> /**
>  * DRM_CLIENT_CAP_STEREO_3D
>  *
>  * if set to 1, the DRM core will expose the stereo 3D capabilities of the
>  * monitor by advertising the supported 3D layouts in the flags of struct
>  * drm_mode_modeinfo.
>  */
> #define DRM_CLIENT_CAP_STEREO_3D 1
>
> So basically, get the mode info and check the "flags" of the mode struct for supported layouts. Possible values on "flags":
>
> #define DRM_MODE_FLAG_3D_MASK (0x1f<<14)
> #define DRM_MODE_FLAG_3D_NONE (0<<14)
> #define DRM_MODE_FLAG_3D_FRAME_PACKING (1<<14)
> #define DRM_MODE_FLAG_3D_FIELD_ALTERNATIVE (2<<14)
> #define DRM_MODE_FLAG_3D_LINE_ALTERNATIVE (3<<14)
> #define DRM_MODE_FLAG_3D_SIDE_BY_SIDE_FULL (4<<14)
> #define DRM_MODE_FLAG_3D_L_DEPTH (5<<14)
> #define DRM_MODE_FLAG_3D_L_DEPTH_GFX_GFX_DEPTH (6<<14)
> #define DRM_MODE_FLAG_3D_TOP_AND_BOTTOM (7<<14)
> #define DRM_MODE_FLAG_3D_SIDE_BY_SIDE_HALF (8<<14)
>
> This would/should actually be quite simple to detect in the ecore_drm code as we already fetch mode info for crtcs. Would just be a matter of checking for these flags.

Not quite so correct. Sure, this probably works fine under DRM / Wayland, but I mentioned Mac OS X and Windows as well as Linux, and I don't think that's gonna work on those OSes. Even on Linux, non-Wayland support would be needed. Frame buffer support, and the others, might be good too, but I had not considered that yet. I actually don't know what DRM is, other than that it also stands for Digital Rights Management, which is what Google will show me. Is there a short "high level coder" introduction to what DRM is all about, so I can quickly get up to speed? I don't use either Wayland or DRM myself, but I would be happy to support them; I just don't want it to be the only way. Is EFL Wayland ready for me yet, DH?

On the other hand, for their own reasons, Oculus in particular moved away from "just be a monitor" to "be a specialised non-monitor device". A very controversial move. Oculus may not support "just be a monitor" mode in the future. Dunno about the plans of the other HMDs though. I suspect the Oculus Rift might be one of the popular ones. I've not looked at the other HMDs yet to see if they might do something similar. Some HMDs I know are NOT monitors. TrinusVR, for example, is a device on the end of a WiFi or USB connection, not a monitor. I think Google Cardboard is similar, in that the actual device is an ordinary smart phone that's not pretending to be a monitor. TrinusVR actually wraps Google Cardboard.

Still, this HMD detection and configuration step is the most trivial part of the entire process. But non-DRM methods will be needed as well. With so many HMDs on the market, more coming, and this tech in its early days, there are no standards. So in the end, a manual process will be needed as well, which will have to fall back to "just be a monitor".

> > - Add viewport definition and support to render evas scenegraph with a slight change in the camera definition for both left and right eyes

There's also what's called an "eye buffer". The final scene is rendered to a buffer that is bigger than the resulting resolution, because the lens compensation needs the extra info to get things correct. Some pixels around the edge get discarded, but pixels in the middle need to be high rez.
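To make that last part concrete, here's a minimal sketch of the per-eye camera offset and the oversized eye buffer. This is not Evas_3D API; the struct, the function and the numbers are all made up for illustration.

typedef struct { float x, y, z; } Vec3;

typedef struct
{
   Vec3 eye_offset;   /* translation applied to the head-tracked camera      */
   int  buf_w, buf_h; /* eye buffer size, larger than the panel region it feeds */
} Eye_Setup;

static void
setup_eyes(Eye_Setup *left, Eye_Setup *right,
           int panel_w, int panel_h, float ipd_metres, float oversample)
{
   /* Each eye's camera is the head camera shifted by half the interpupillary
    * distance along the eye-to-eye axis. */
   left->eye_offset  = (Vec3){ -ipd_metres * 0.5f, 0.0f, 0.0f };
   right->eye_offset = (Vec3){  ipd_metres * 0.5f, 0.0f, 0.0f };

   /* Render each eye into a buffer bigger than its half of the panel, so the
    * lens compensation pass has spare pixels around the edge to throw away. */
   left->buf_w = right->buf_w = (int)((panel_w / 2) * oversample);
   left->buf_h = right->buf_h = (int)(panel_h * oversample);
}

Something like setup_eyes(&l, &r, 1920, 1080, 0.064f, 1.3f) would be DK2-ish numbers: render the scene twice, once per eye with the camera translated by eye_offset, into a buf_w x buf_h target each, apply the lens compensation, then composite the two results side by side.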
> > We could also take advantage of a Z property on normal Evas_Object to have some kind of floating UI in 3D.

Actually, one of the things recommended to help keep VR sickness at bay is to NOT have a floating UI. You should try to make the UI part of the 3D world. Still, floating UIs are gonna happen. My client in particular gets a bit ill with floating UIs that sit at a fixed position relative to the HMD. He prefers the UIs to be stuck to the objects they represent, so clicking on a 3D object causes its UI window to pop up next to the object, and the window stays in a fixed position relative to that object. I worry that people might forget they have windows open on dozens of objects scattered throughout the world. lol

On the other hand, research is showing that having some sort of fixed "cockpit" might help to avoid the sickness. The theory is that this provides a fixed frame of reference to help offset the rest of the world spinning wildly while your inner ear sits quietly in your office chair. So in-vehicle games are popular; there's even a popular game where you play the part of a truck driver, driving around Europe, making deliveries I think (I've not tried it, but I keep hearing about this game; sounds boring to me, that's my brother-in-law's job, and he's boring). The difference is that the cockpit surrounds you and the floating UI doesn't, otherwise this would seem contradictory. This sort of stuff is still being researched. In the end though, yes, floating UIs will be made, but they're not encouraged.

> > There are also a few tricks to handle, like the resolution changing between when you are turning on stereoscopic display and when you are not.

Well yes, lots of tricks I've been learning on this contract. B-) The resolution of the HMD doesn't tend to change though. Your main monitor might, and the HMD display might be mirrored to the main monitor so others can watch what the HMD wearer is doing. Windows in particular is bad at this: the refresh rate of your main monitor has to match the refresh rate of the HMD. Can of worms.

In the near future, however, nVidia at least (and likely AMD as well) are adding tech for multiple resolutions on the same screen, so that the pure blue sky can be rendered at a really low rez (it's all just solid blue, no need for detail), but the monster in front of you is rendered in full rez glory as it tears you apart, so you can fully appreciate each glistening drop of monster drool that drips from its jaw while it chews on your limbs, and the reflection of your look of fear in those drool drops. Horror games are popular with HMDs.

> > Anyway, that would be my idea on what to do for this. Sadly I don't think we have any plan to move that forward at this point, but I will be happy to help you figure out the details on this.

As I mentioned, I won't have this development Rift for much longer. I plan on pre-ordering the consumer version as soon as I can; I already have the money for it. When I get that, I'll start working on EFL HMD support. I should have had a Google Cardboard already, but the one and only company in Australia that makes them was out of stock at the beginning of that contract, and they haven't gotten any more stock since. Pity, they are in the next city, which would have been most convenient. They even tried to hire me to work on their stuff, but alas they only use Unity, and I have no experience with that. I'll get a Google Cardboard sooner or later; they are cheap, so long as you already have a supported smart phone (I do).
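Going back to the DRM mode flags Christopher quoted earlier, a rough, untested sketch of that check using plain libdrm might look like the code below. The device path is an assumption for illustration, and in practice the check would live inside ecore_drm, which already fetches this mode info.

#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

static void
list_stereo_modes(const char *device)
{
   int fd = open(device, O_RDWR); /* e.g. "/dev/dri/card0" */
   if (fd < 0) return;

   /* Ask the DRM core to expose stereo 3D layouts in the mode flags. */
   drmSetClientCap(fd, DRM_CLIENT_CAP_STEREO_3D, 1);

   drmModeRes *res = drmModeGetResources(fd);
   if (!res) { close(fd); return; }

   for (int c = 0; c < res->count_connectors; c++)
     {
        drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[c]);
        if (!conn) continue;
        if (conn->connection == DRM_MODE_CONNECTED)
          {
             for (int m = 0; m < conn->count_modes; m++)
               {
                  uint32_t layout = conn->modes[m].flags & DRM_MODE_FLAG_3D_MASK;
                  if (layout != DRM_MODE_FLAG_3D_NONE)
                    printf("connector %u: stereo mode %s (3D layout 0x%x)\n",
                           conn->connector_id, conn->modes[m].name, layout);
               }
          }
        drmModeFreeConnector(conn);
     }
   drmModeFreeResources(res);
   close(fd);
}

Build it against libdrm (typically via pkg-config libdrm). This only covers the DRM / Wayland case, of course; something else entirely would be needed for Windows, Mac OS X, and the non-monitor HMDs.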
In any case, my reason for doing this is to support my virtual world work. Virtual world software has to simulate an entire world; there are lots and lots and lots of fiddly little details, in lots of different areas. So this means it's a huge project, with lots and lots and lots of interesting things to implement. It's gonna take a long time, but the beauty of this is that you never get bored; there are always many things you can decide to work on at any time. So if there's a reason to delay / pause the HMD work, there's plenty of other stuff to keep me busy. This is my passion though, so I'll be working on it whenever I get free time.

-- 
A big old stinking pile of genius that no one wants coz there are too many silver coated monkeys in the world.
