Could somebody (Mathias Agopian, alias pixelflinger, perhaps) please
tell me how to use the sensor information for augmented-reality work?
Sorry, I am not much into graphics, but I did try my part to figure
things out with the help of the documentation.
I didn't miss the note:
"Note: It is preferable to use getRotationMatrix() in conjunction
with remapCoordinateSystem() and getOrientation() to compute these
values; while it may be more expensive, it is usually more accurate."
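For reference, here is a minimal, self-contained sketch (plain Java, so it runs off-device) of the angle extraction that getOrientation() performs on a row-major 3x3 rotation matrix, as I read the SDK documentation. The helper name orientationFrom is mine, not part of the SDK:

```java
public class OrientationSketch {
    // Extracts (azimuth, pitch, roll) in radians from a row-major 3x3
    // rotation matrix r, mirroring SensorManager.getOrientation():
    //   azimuth = atan2(r[1], r[4]); pitch = asin(-r[7]); roll = atan2(-r[6], r[8])
    static double[] orientationFrom(double[] r) {
        return new double[] {
            Math.atan2(r[1], r[4]),  // azimuth: rotation about the Z axis
            Math.asin(-r[7]),        // pitch:   rotation about the X axis
            Math.atan2(-r[6], r[8])  // roll:    rotation about the Y axis
        };
    }

    public static void main(String[] args) {
        double a = 0.5; // a pure yaw (azimuth) rotation of 0.5 rad
        double[] r = {
             Math.cos(a), Math.sin(a), 0,
            -Math.sin(a), Math.cos(a), 0,
             0,           0,           1
        };
        double[] o = orientationFrom(r);
        System.out.printf("azimuth=%.3f pitch=%.3f roll=%.3f%n", o[0], o[1], o[2]);
    }
}
```

Feeding a pure yaw rotation in recovers the azimuth and leaves pitch and roll at zero, which is a quick sanity check that the sign conventions match what getOrientation() returns.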

I tried both. With the method above, the roll is always the negative
(in radians) of the roll returned by the orientation sensor event.
That is not the real problem in itself, since both are in the same
+/-90 range, just with opposite signs; but one of the two must go
against the definition of roll in the Android documentation.

I have the camera in landscape mode and tried with and without the
suggestion:
"Using the camera (Y axis along the camera's axis) for an augmented
reality application where the rotation angles are needed:
remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR);"

Still no luck. I picked up a few lessons on quaternions and so on
along the way, but I am sure it is much simpler than the mess I am
getting into.

Say the orientation sensor event returns yaw, pitch and roll in eO[].
What are the next steps to get a rotation matrix that I could use in
OpenGL to rotate an augmented scene overlaid on the camera preview?
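From what I have pieced together (so treat this as an assumption, not gospel), you may not need Euler angles at all: SensorManager.getRotationMatrix() will also fill a float[16] with a 4x4 matrix, and because a rotation matrix's inverse is its transpose, handing that row-major array to glMultMatrixf() (which expects column-major) effectively gives OpenGL the inverse rotation, which is what a camera/view transform needs. A plain-Java sketch of the transpose-is-inverse fact (helper names are mine):

```java
public class RotationFacts {
    // Multiply two row-major 3x3 matrices.
    static double[] mul3(double[] a, double[] b) {
        double[] out = new double[9];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    out[3 * i + j] += a[3 * i + k] * b[3 * k + j];
        return out;
    }

    // Transpose a row-major 3x3 matrix. Reinterpreting row-major data as
    // column-major (as glMultMatrixf would) performs exactly this transpose.
    static double[] transpose3(double[] m) {
        return new double[] { m[0], m[3], m[6], m[1], m[4], m[7], m[2], m[5], m[8] };
    }

    public static void main(String[] args) {
        double a = 0.7; // any rotation; here about Z by 0.7 rad
        double[] r = {
             Math.cos(a), Math.sin(a), 0,
            -Math.sin(a), Math.cos(a), 0,
             0,           0,           1
        };
        // R * R^T is the identity, i.e. the transpose IS the inverse.
        double[] p = mul3(r, transpose3(r));
        System.out.println(java.util.Arrays.toString(p));
    }
}
```

So on the Android side the sequence I believe is intended (again, my assumption) is: getRotationMatrix(mR16, null, eA, eM) with mR16 a float[16], then remapCoordinateSystem(mR16, AXIS_X, AXIS_Z, outR16), then glMultMatrixf(outR16, 0).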

Thanks and regards
RS

ps:
I have used Matrix.setRotateEulerM(m, 0, eO[0], eO[1], eO[2]); //
tried with negated angles too (note the rmOffset argument, and that
the angles are in degrees).
I have also converted the accelerometer and magnetometer events (eA
and eM) to a rotation matrix with:
SensorManager.getRotationMatrix(mR, mI, eA, eM);
and then remapped as suggested, etc.
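In case it helps anyone checking their remap by hand: my reading (an interpretation, not the SDK source) is that remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) post-multiplies inR by an axis-swap matrix whose columns are the new device axes expressed in the old device frame: X' = X = (1,0,0), Y' = Z = (0,0,1), Z' = X' x Y' = (0,-1,0). A plain-Java sketch under that assumption:

```java
public class RemapSketch {
    // Post-multiplies a row-major 3x3 rotation by the axis-swap matrix
    // corresponding (by my reading) to remapCoordinateSystem(inR, AXIS_X,
    // AXIS_Z, outR). Columns of the swap matrix = new device axes in the
    // old frame: X' = (1,0,0), Y' = (0,0,1), Z' = X' x Y' = (0,-1,0).
    static double[] remapXZ(double[] r) {
        double[] a = { 1, 0, 0,   0, 0, -1,   0, 1, 0 }; // row-major swap matrix
        double[] out = new double[9];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    out[3 * i + j] += r[3 * i + k] * a[3 * k + j];
        return out;
    }

    public static void main(String[] args) {
        // Remapping the identity just yields the axis-swap matrix itself.
        double[] out = remapXZ(new double[] { 1,0,0, 0,1,0, 0,0,1 });
        System.out.println(java.util.Arrays.toString(out));
    }
}
```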


You received this message because you are subscribed to the Google
Groups "Android Developers" group.
