Hello Simon,

On 11/11/2010 04:37 AM, ho...@uni-koeln.de wrote:
> 1. What type of coordinate system (left-handed/right-handed) does OSG
> use for the tracking data?
OpenSG uses right handed coordinate systems throughout, just like OpenGL.

> 2. What units does OSG expect for the tracking data? We get millimeters
> from the tracking system.

Since the system does not really know anything about head tracking, the
expected units are whatever you define them to be. Of course, ultimately
everything (tracking data, screen description, model) has to use the same
units, so you get to pick the units of one of these and then need to
express everything else in the same units.

> 3. What is the best practice to manipulate the tracking data to fit OSG?
> What is the best way to get the correct matrix for the manipulation?
>
> Questions, long version:
>
> We are using an ART tracking system. It provides its data in mm in a
> right-handed coordinate system. The first thing I did was setting up
> the ProjectionCameraDecorators and the tracking, feeding the data
> without any manipulation.
> First problem: nothing on the screen. First trial-and-error solution:
> all tracking data has to be scaled by /100 (screen corners and
> tracking, see code snippet). It works, but I would still like to know
> what units are really expected by OSG.

See above.

> Next problem: the scene scaled the wrong way. While getting closer to
> the screen, the scene got smaller; directly at the screen it became a
> dot.
>
> For every corner of the screen, I provided the coordinates in
> coordinates of the tracking system (via the decorators' push_back()
> member function). For the value of the z coordinate right at the
> screen, it seemed like the scene would be scaled to the absolute
> minimum.
> After some trial and error again, I came to the following solution:
> for the screen corners' z coordinates I set a value that is located 2
> meters behind the actual screen (-3000mm instead of the actual -835mm),
> plus I had to invert the tracked z data (see code snippet).
> What is the idea behind the data set via the decorators' push_back()
> member function? And how is OSG's coordinate system for tracking data
> oriented?

Hm, I've never used OpenSG's head tracking, so I may be off target here.
It seems that the way the decorator works is that it computes the view
frustum from the user position (provided by the tracking system) and the
screen. Now if you move the user very close to the screen, the frustum
gets very wide (and you see a lot of the scene), but your physical
projection surface stays the same size. That means a lot more scene has
to fit onto the same area, hence everything gets smaller. I'm not saying
this is necessarily a terribly useful or intuitive behaviour, but it is
an effect of that particular head tracking setup with a fixed projection
surface and the eye point moving on one side of it. In order to keep
things from degenerating too badly you probably have to keep the user at
least camera.getNear() away from the surface - possibly by pretending
the screen is further away, as you are doing.
I'm not sure about the inverted z coordinate, that could perhaps be a
bug in the computation somewhere...
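To make the "pick one set of units" point and the getNear() safeguard a
bit more concrete, something like this (untested sketch; mmToCommon,
clampToNear, screenZ and nearDist are just made-up names for this mail,
nothing in it is OpenSG-specific beyond the vector type):

#include <OpenSG/OSGVector.h>

#include <algorithm>

// Convert a raw ART position (millimeters) into the common unit that the
// screen description and the model also use (here: meters).
inline osg::Pnt3f mmToCommon(osg::Real32 xMM, osg::Real32 yMM, osg::Real32 zMM)
{
    const osg::Real32 s = 0.001f;   // 1 mm = 0.001 m
    return osg::Pnt3f(xMM * s, yMM * s, zMM * s);
}

// Crude safeguard against the degenerate frustum: keep the tracked head
// at least the camera's near distance in front of the screen plane. This
// assumes the screen lies in the plane z = screenZ of the tracking frame
// and the viewer is on the positive z side of it - flip the sign if your
// setup is oriented the other way.
inline osg::Pnt3f clampToNear(osg::Pnt3f head, osg::Real32 screenZ,
                              osg::Real32 nearDist)
{
    head[2] = std::max(head[2], screenZ + nearDist);
    return head;
}

The same conversion factor of course also has to go into the screen
corner coordinates you push_back() into the decorator, otherwise the
frustum gets computed from mismatched numbers.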
> Code snippets (pretty ugly, but cleaning up is done at the end ;) ):
> [SNIP]
> // Process Transformation
>
> Vec3d rotationX;
> Vec3d rotationY;
> Vec3d rotationZ;
>
> Quaternion userRotation;
> trackedBody = dt->getBody(i);
>
> rotationX.setValues(trackedBody.getRotColRow_1_1(),
>                     trackedBody.getRotColRow_1_2(),
>                     trackedBody.getRotColRow_1_3());
> rotationY.setValues(trackedBody.getRotColRow_2_1(),
>                     trackedBody.getRotColRow_2_2(),
>                     trackedBody.getRotColRow_2_3());
> rotationZ.setValues(trackedBody.getRotColRow_3_1(),
>                     trackedBody.getRotColRow_3_2(),
>                     trackedBody.getRotColRow_3_3());
>
> userRotation.setValue(Matrix(rotationX, rotationY, rotationZ));
> userTrackingMatrix.setIdentity();
> userTrackingMatrix.setRotate(userRotation);
> userTrackingMatrix.setTranslate(x, y, z);
> //userTrackingMatrix.setScale(sceneTransform);

That last call had to be commented out because setTranslate(), setRotate()
and setScale() directly write the related matrix entries. If you want to
construct a transform that consists of a translation, rotation and scale,
use setTransform(translate, rotate, scale). Or construct individual
matrices for each and then multiply them together; that way you can also
change the order of the transformations (there is a short sketch below my
PS).

> beginEditCP(userTrackingTransform,
>             osg::Transform::MatrixFieldMask);
>
> userTrackingTransform->setMatrix(userTrackingMatrix);
> endEditCP(userTrackingTransform,
>           osg::Transform::MatrixFieldMask);
> (...)

I see you have your own helper macro for the beginEditCP()/endEditCP()
stuff; you can also use the CPEdit and CPEditAll macros:

{
    CPEdit(someNode, OSG::Node::ChildrenFieldMask);

    someNode->addChild(someOtherNode);
}

CPEditAll does not need a mask as a second argument.

Cheers,
Carsten

PS: OSG is the abbreviation used by OpenSceneGraph; since the naming of
the two projects is already confusing enough, we usually use OpenSG to
refer to the project ;)
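PPS: the short sketch promised above, to make the setTransform() remark
concrete (untested, from memory; buildUserMatrix is a made-up name, x, y,
z and userRotation correspond to the variables in your snippet, and s
stands in for whatever uniform scale you want to apply):

#include <OpenSG/OSGMatrix.h>
#include <OpenSG/OSGQuaternion.h>

osg::Matrix buildUserMatrix(osg::Real32 x, osg::Real32 y, osg::Real32 z,
                            const osg::Quaternion &userRotation,
                            osg::Real32 s)
{
    // Variant 1: fill translation, rotation and scale into the matrix in
    // one go; the separate setTranslate()/setRotate()/setScale() calls
    // each just write their own entries and do not compose.
    osg::Matrix userTrackingMatrix;
    userTrackingMatrix.setTransform(osg::Vec3f(x, y, z),   // translation
                                    userRotation,          // rotation
                                    osg::Vec3f(s, s, s));  // scale

    // Variant 2: build separate matrices and multiply them, which also
    // lets you choose the order of the transformations explicitly.
    osg::Matrix t, r, sc;
    t.setTranslate(x, y, z);
    r.setRotate(userRotation);
    sc.setScale(s, s, s);

    osg::Matrix m(t);
    m.mult(r);     // m = T * R
    m.mult(sc);    // m = T * R * S

    return m;      // should match variant 1 for this particular order
}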