My 2 cents on the topic (loose thoughts and a presentation of my AddOn for setting 
Stereo Base):

At the last Blender Conference in Amsterdam I gave a talk about stereoscopy and 
the way I currently create stereo content in Blender:
http://www.youtube.com/watch?v=WD7xzwxhhVU
I have worked on several stereo animations, some of them displayed on 52-inch 
monitors and some (three of them, actually) on cinema screens.
No live action was involved in those projects, so my task was a lot easier, as I 
had full control over the final result.
I'm really glad that so much effort is being put into the implementation of 
stereoscopy in Blender.
Having read all of the posts in this thread and the wiki entry, I'd like to 
share some of my loose thoughts:

1.
From the user's perspective, the ability to view stereo while working would be great.

2.
A WYSIWYG approach is IMHO not good in this case. What I mean by that is: the 
user should have the ability to make adjustments in post. Correcting the depth 
bracket in post is practically impossible, but shifting in post should be made 
possible, as it allows the depth positioning to be adjusted.
In my workflow I always render both views using parallel cameras. My goal is to 
use the off-axis approach, but I want the freedom in post to shift the images 
(to set the depth positioning).
Therefore I always render wider images than I need. This allows me to shift 
them without losing the edges.
Adding those spare pixels after setting everything up is at the moment not 
possible with one click.
We need to change the render resolution. Making the image wider means that the 
aspect ratio changes. When we manually widen the image (increase the X 
resolution), our view doesn't get wider; instead it gets shorter, so we lose the 
upper and lower parts of what the frame showed before. The workaround is to 
adjust the focal length accordingly. The math behind it is not difficult, so it 
can be done by users who know the subject, but it would be great to have an 
option like "Add 40 pixels on each side" in the render settings when working in 
stereo. If camera shift is possible, widening the image shouldn't IMHO be much 
of a problem.
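The focal-length correction mentioned above can be sketched as follows. This is my own formulation (the function name and the assumption of a horizontal sensor fit are mine, not anything from Blender's UI): to keep the original frame content intact while adding pixels at the sides, the focal length must shrink in proportion to the resolution increase, so that the world width covered by each pixel stays constant.

```python
# Hypothetical helper, not an existing Blender feature: widen a render by
# `pad` pixels on each side without changing what the original frame shows.
# Assumes the horizontal FOV is defined by the sensor width (horizontal fit).

def widened_settings(res_x, focal_mm, pad):
    """Return (new_res_x, new_focal_mm) for a render padded on both sides."""
    new_res_x = res_x + 2 * pad
    # With a fixed sensor width, more pixels across the sensor means each
    # pixel covers less of it; shrinking the focal length proportionally
    # keeps the same world width per pixel, so the extra pixels extend
    # the view sideways instead of cropping it vertically.
    new_focal = focal_mm * res_x / new_res_x
    return new_res_x, new_focal

# Example: a 1920-px-wide render with a 35 mm lens, adding 40 px per side:
new_x, new_f = widened_settings(1920, 35.0, 40)
print(new_x, round(new_f, 2))  # 2000 33.6
```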

3.
"left", "right" and "center" cameras:
Most of the rigs I came across use the approach of having a "center" camera, 
with the left and right cameras offset equally from it. This seems logical, but 
I agree with Adriano about the cons of such an approach (Adriano's post is 
quoted below).
It often happens that we need to "convert" an existing animation to stereo and 
want to use our existing renders as the "left" or "right" view and simply set up 
and render the other camera. An option for using the "main" camera as left or 
right should definitely be added.
This of course means that the pivot of our rig is not in the middle (between 
the cameras), but my experience shows that this is not a problem in 99.9% of cases.
The additional advantage of having the "left" or "right" camera at the pivot of 
the rig is that when we happen to make a mistake in setting the Stereo Base, it 
can be corrected and only one view needs to be re-rendered, not both of them.

On 4 Apr 2013, at 00:59, Adriano <[email protected]> wrote:
> Sugestion:
> 
> It would be nice if we can manage to set an existing camera to be left or
> right, and don't me moved at all when we setup planes in stereoscopy.
> 
> This would be very usefull to convert old project to 3d, so we can keep old
> renders as left or right and just render one new camera.
> 
> If the addon turns old camera into "center", this is not possible and we
> have to render every thing allover again.



4.
Interocular Distance / Stereo Base:
A lot of effort is being put into creating a user-friendly UI, implementing 
preview capabilities, making rendering easier, etc. As I see it, the whole idea 
is very well thought out and everybody here is trying to make this consistent, 
compatible with the standards and easy to use.
I really appreciate this.
@Dalai Felinto: In the wiki entry you say the following:
>>>"Later we should be able to expand this functionalities to let the users to 
>>>work with pixel separation, instead of directly with interocular/plane 
>>>distances"<<<
Here I can offer my help. I have some experience in creating AddOns, and some 
time ago I created such a tool for my own use. I didn't publish it though, as I 
think it's not yet "elegant" enough to be seriously considered for 
implementation in Blender.
However, now that I see serious work going on in this field, I'd like to share 
my code and explain how it works, so that everybody knows my way of thinking 
here. (I gave a brief demonstration of this in action at the Amsterdam 
conference; it's the last part of my presentation.)
The AddOn is designed for the off-axis approach, and shifting needs to be done 
after rendering. However, only small changes are needed to make shifting before 
rendering possible.
The way it works is as follows:
We have a scene with a single camera. We select this camera, hit Shift-Alt-T 
and the stereo setup is created.
Our initially selected camera becomes the "left" camera. A second camera 
("right") is created and parented to "left" (the left camera is the pivot of 
the rig). The camera data of "right" are set up such that the focal length and 
sensor size of "left" and "right" are linked together. The cameras are 
parallel. A shift is applied to the "right" camera such that the "near plane" 
is placed on screen. This shift is not treated as the target shift; it only 
sets the starting point for shifting in post-production. All transform 
properties of "right" are locked. location.x of "right" is driven and takes all 
of the needed data into account to create a "proper" Stereo Base.
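The near-plane shift described above can be sketched like this. The function name and the sign convention are my assumptions (the sign depends on which camera carries the shift); the geometry is just similar triangles: for parallel cameras, a point at distance Z produces a disparity of focal * base / Z on the sensor, and shifting the right camera's frustum by that amount cancels the parallax at the near plane.

```python
# A minimal sketch, not the AddOn's actual code: the Blender-style shift_x
# (a fraction of the sensor width) that places the near plane at zero
# parallax for a parallel rig where only the "right" camera is offset.

def right_camera_shift(stereo_base, focal_mm, sensor_mm, near_dist):
    """shift_x for the right camera so the near plane lands on screen."""
    # Sensor-space disparity (mm) of a point at near_dist, for parallel
    # cameras separated by stereo_base (same scene units as near_dist):
    disparity_mm = focal_mm * stereo_base / near_dist
    # Negative: the right frustum shifts toward the left camera. Flip the
    # sign if your rig shifts the other view instead.
    return -disparity_mm / sensor_mm

# Example: 65 mm base (0.065 scene units), 35 mm lens, 32 mm sensor,
# nearest object at 3 units:
print(round(right_camera_shift(0.065, 35.0, 32.0, 3.0), 5))  # -0.0237
```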
Two additional objects are added: planes representing the "near plane" and the 
"far plane". The user needs to position them such that the near plane is at the 
distance of the nearest object in the scene and the far plane at the distance 
of the furthest object in the scene.
The driver of the "right" camera's location.x takes the locations of those 
planes into account and sets the stereo base such that the pixel separation 
doesn't exceed the value specified by the user.
The "right" camera has custom properties assigned to it. The first of them is 
"Depth Bracket", where the user specifies the desired depth bracket in pixels. 
"FP Influence" is a parameter that specifies the influence of the "Far Plane" 
on location.x of the right camera. By default it's set to 1.0 (full influence), 
but the user can set it to 0.0, and then everything is calculated as if the 
furthest object were at infinity. The third property, "FP/NP Minimum Ratio", 
prevents setting too big a stereo base when the volume of the scene is shallow 
(the distance between FP and NP is small).
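The driver logic described above can be sketched as follows. This is only my reading of the description, not the AddOn's actual code; the variable names and the exact clamping rule for "FP/NP Minimum Ratio" are my assumptions. The core relation is that, for parallel cameras, the pixel separation between the near and far planes is focal * base * (1/near - 1/far) scaled from sensor units to pixels, and the driver solves that for the base.

```python
# A sketch of the stereo-base driver, under my own assumptions about how
# "FP Influence" and "FP/NP Minimum Ratio" enter the formula.

def stereo_base(near, far, bracket_px, res_x, focal_mm, sensor_mm,
                fp_influence=1.0, fp_np_min_ratio=1.3):
    """Stereo base (scene units) so the pixel separation between the
    near and far planes doesn't exceed bracket_px."""
    px_per_mm = res_x / sensor_mm          # sensor-to-pixel scale
    # Assumed clamping: treat the far plane as at least
    # fp_np_min_ratio * near, which caps the base when the scene
    # volume is shallow:
    far = max(far, near * fp_np_min_ratio)
    # fp_influence = 0.0 behaves as if the far plane were at infinity:
    inv_depth_range = 1.0 / near - fp_influence / far
    allowed_mm = bracket_px / px_per_mm    # allowed separation on sensor
    return allowed_mm / (focal_mm * inv_depth_range)

# Example: scene spanning 2 to 10 units, 30 px bracket,
# 1920 px wide, 32 mm sensor, 35 mm lens:
print(round(stereo_base(2.0, 10.0, 30.0, 1920, 35.0, 32.0), 4))  # 0.0357
```

Note how setting fp_influence=0.0 in the same call yields a smaller base, since the far plane is then treated as infinitely distant and the whole bracket must fit between the near plane and infinity.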
Rendering needs to be set up manually. No additional scenes are created; we 
simply need to render everything manually using the "left" camera and then 
using the "right" camera.

Link to the AddOn: https://dl.dropbox.com/u/18831655/object_stereo_setup.zip

All of the calculations used in this AddOn were made by me from scratch. I 
tested them and I'm sure about them; however, I can't link to any documents 
that could prove it. That's simply because I wasn't basing my work on any 
existing formulas; I made my own calculations based on my own notes and sketches.


5.
This is just a small comment on the document that Sean Olson referred to and 
Dalai commented on (quotes below).
On slide 19 I found a huge mistake. It states the following:

>>>"When two views of an object are identical it tells your brain that they are 
>>>at infinity, at least in terms of convergence and stereo disparity."<<< 

WRONG! When the two views of an object are identical, it tells your brain that 
the object is ON THE SCREEN.
When the positive parallax of an object equals the distance between the 
viewer's eyes, it appears at infinity.
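A quick similar-triangles model backs up this correction (the formulation is my own; eye separation and viewing distance are example values): the perceived depth of a point with screen parallax p is e * D / (e - p), so zero parallax (identical views) lands exactly on the screen, and parallax equal to the eye separation lands at infinity.

```python
# Perceived depth from screen parallax, for a viewer at distance
# screen_dist with eye separation eye_sep (both in metres).

def perceived_depth(parallax, eye_sep=0.065, screen_dist=2.0):
    """Distance at which a point with the given screen parallax appears."""
    if parallax >= eye_sep:
        # The eyes' lines of sight are parallel (or diverging):
        return float("inf")
    return eye_sep * screen_dist / (eye_sep - parallax)

print(perceived_depth(0.0))    # identical views -> on the screen (2.0 m)
print(perceived_depth(0.065))  # parallax == eye separation -> inf
```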

On 3 Apr 2013, at 07:12, Dalai Felinto <[email protected]> wrote:
> Interesting, thanks for the link Sean. Slide 19 illustrates very well a
> problem I was trying to argue with Ton the other day. Basically UI on top
> of the stereo-3d view (mis)leads to confusing depth cues.

referring to:
> http://media.steampowered.com/apps/valve/2013/Team_Fortress_in_VR_GDC.pdf



With Respect

Bartek Skorupa

www.bartekskorupa.com


_______________________________________________
Bf-committers mailing list
[email protected]
http://lists.blender.org/mailman/listinfo/bf-committers
