Great news! Will this in some way address an approach to allow Eevee to export stereo panoramas/cube maps with support for screen-space effects?
In raster renderers this is usually accomplished by stitching together many thin render slices. That could be done by an add-on, but I think it would be better optimized in Blender's core code. This can help: https://developers.google.com/vr/jump/rendering-ods-content.pdf (a rough bpy sketch of the slice idea is appended below the quoted message).

Adriano A. Oliveira

On Tue, 29 Oct 2019 at 08:21, Julian Eisel <[email protected]> wrote:
> Hi all,
>
> we had a VR meeting during the conference, here are the notable bits.
>
> Attendees
> ========
> * Dalai Felinto (Blender Foundation)
> * Damien Coureau (Ubisoft)
> * Daniel Martinez Lara (Pepe School Land & MPX)
> * Julian Eisel (Blender Institute)
> * Julien Blervaque (Ubisoft)
> * Sebastian König (blendFX)
> * Simeon Conzendorf (blendFX)
> * William Reynish (Blender Foundation)
>
> General Requirements
> =================
> * There will have to be some experimenting with different approaches
> to XR UIs. People and studios also have very different needs for XR
> experiences.
> * Just as we define the regular 2D UIs in Python, XR UIs should be
> defined in Python as well. That allows extending and specializing the
> UIs for custom needs. The Ubisoft team is especially keen on this.
> * Blender should bundle a good default XR UI for common usage,
> established through experimentation of the core VR team and other
> collaborators.
> * With the gizmo, operator and drawing APIs, the BPY already has many
> of the needed bits. There's still lots of stuff we'd have to figure
> out and add to it though.
>
> Next Steps
> ========
> * The team agrees on building the VR UI around specific use-cases.
> * The GSoC patch [1] should be merged with the first basic use-case
> working (scene inspection, see below).
> * We'll start with the following use-cases (roughly in that order):
> 1. Scene inspection - VR viewport with initial/simple navigation
> 2. Sculpting & Grease Pencil drawing in VR
> 3. Set arrangement/layout - Support adding primitives, transforming
> objects and some related gizmos
> 4. Complete immersive toolset (aka MARUI)
> 5. VR Storyboarding - Support changing cameras, time and animation
> timing within the VR session
> 6. Set dressing - Support adding particles, assets, ...
> * Besides the first use-case, the involved "Blender core developers"
> will only try to provide the frameworks needed to implement the other
> use-cases. These can then be picked up by contributors (e.g. the
> Ubisoft, MARUI or MPX teams).
> * The project will be organized as usual in Blender, with a landing
> page on developer.blender.org, clearly stated priorities and
> milestones, and visible ways for contributors to get involved.
>
> Future
> =====
> * We also want to support AR/MR based use-cases.
> * Collaborative sessions form other important use-cases: multiple
> people with multiple headsets work together in the same XR viewport.
> This is a rather complicated use-case to get supported, but it would
> be immensely useful.
>
> [1] - https://developer.blender.org/D5537
>
> Cheers,
> - Julian -
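P.S.: To make the slice idea above a bit more concrete, here is a rough, untested bpy sketch of the kind of loop I mean, along the lines of the Google ODS notes linked above. Everything here (IPD value, slice width, output path) is a placeholder of my own, it only shows the horizontal slicing for both eyes, and the stitching of the rendered strips into the final equirectangular images (plus full vertical coverage) is left out.

# Rough sketch: render thin vertical strips per eye for ODS-style output.
# All parameters are placeholders; strip stitching is not shown.
import math
import bpy

IPD = 0.064                   # interpupillary distance in meters (assumed)
SLICE_DEG = 2.0               # horizontal width of each rendered strip
OUT_DIR = "/tmp/ods_slices"   # placeholder output directory

scene = bpy.context.scene
cam = scene.camera

# Narrow horizontal FOV so each render covers only one thin vertical strip.
cam.data.sensor_fit = 'HORIZONTAL'
cam.data.angle_x = math.radians(SLICE_DEG)

base_loc = cam.location.copy()

for eye, sign in (("L", -1.0), ("R", 1.0)):
    for i in range(int(360.0 / SLICE_DEG)):   # 180 strips per eye at 2 degrees
        yaw = math.radians(i * SLICE_DEG)
        # Camera looks horizontally (+Y at yaw 0), rotated around Z per slice.
        cam.rotation_euler = (math.radians(90.0), 0.0, yaw)
        # ODS: each eye sits on a circle of radius IPD/2, offset tangentially
        # (along the view direction's right vector) from the panorama center.
        right = (math.cos(yaw), math.sin(yaw), 0.0)
        cam.location = (base_loc.x + sign * (IPD / 2.0) * right[0],
                        base_loc.y + sign * (IPD / 2.0) * right[1],
                        base_loc.z)
        scene.render.filepath = "%s/%s_%04d.png" % (OUT_DIR, eye, i)
        bpy.ops.render.render(write_still=True)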
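Also, regarding the point in the notes about defining XR UIs in Python the way 2D UIs are defined today: for anyone unfamiliar, this is roughly what the current operator + panel registration pattern looks like in bpy (the class and idname below are just illustrative examples of mine, not anything from the proposal); presumably XR tools would end up being registered in a similar fashion.

import bpy

class VIEW3D_OT_hello_xr(bpy.types.Operator):
    """Hypothetical operator, only to illustrate the Python registration pattern"""
    bl_idname = "view3d.hello_xr"
    bl_label = "Hello XR"

    def execute(self, context):
        self.report({'INFO'}, "Ran from a Python-defined tool")
        return {'FINISHED'}

class VIEW3D_PT_hello_xr(bpy.types.Panel):
    """Hypothetical panel exposing the operator in the 3D View sidebar"""
    bl_space_type = 'VIEW_3D'
    bl_region_type = 'UI'
    bl_category = "XR Demo"
    bl_label = "XR Demo"

    def draw(self, context):
        self.layout.operator("view3d.hello_xr")

def register():
    bpy.utils.register_class(VIEW3D_OT_hello_xr)
    bpy.utils.register_class(VIEW3D_PT_hello_xr)

if __name__ == "__main__":
    register()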
