I am trying to make one of my apps, Tilt Theremin, more interesting by
making use of the gyroscope sensor. As far as I understand, one has to
use it in combination with the accelerometer and compass to get accurate
readings of pitch, roll and yaw.

I have watched this talk:
http://www.youtube.com/watch?v=C7JQ7Rpwn2k

According to that, the only thing needed is to use the virtual
TYPE_ROTATION_VECTOR sensor. I tried that on my Nexus S, but the output
is very sluggish and seems heavily low-pass filtered. I read in another
post that TYPE_ROTATION_VECTOR does not currently use the gyroscope, at
least not on the Nexus S and Motorola Xoom.
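
For reference, this is roughly how I read TYPE_ROTATION_VECTOR at the
moment (the class name is just a placeholder from my test project), and
it is this output that feels so sluggish:

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class RotationVectorActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor rotationSensor;
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];  // yaw, pitch, roll in radians

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        rotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
    }

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager.registerListener(this, rotationSensor, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Turn the rotation vector into a rotation matrix, then into
        // azimuth (yaw), pitch and roll.
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        // orientation[0] = yaw, orientation[1] = pitch, orientation[2] = roll
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}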

So here are the questions:

a) Will the virtual TYPE_ROTATION_VECTOR sensor be updated to use data
from the gyroscope, and will I be able to use that update on my Nexus S?
Or is this a hardware limitation? In the video, David Sachs talks about
the fusion algorithms being done in hardware, so maybe my device has no
support for that?

b) Is there some pseudo- or real code out there that shows how to do it
in software here and now? I have (apparently) managed to implement a
complementary filter, but it does not take the compass into account
(only the accelerometer and gyroscope), and I seem to get much the same
data as I did using just the accelerometer. I can't find any code on the
net that combines the gyroscope, accelerometer and compass to get
results similar to those in the video.
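
To make the question more concrete, this is the general shape of the
filter I think I am after: integrate the gyro rates for smooth
short-term changes, and pull the angles back toward the absolute (but
noisy) accelerometer/compass orientation. The class name, the ALPHA
value and the per-axis mapping are just my own guesses, and the sign
conventions would probably need adjusting per device:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Register this listener for TYPE_ACCELEROMETER, TYPE_MAGNETIC_FIELD and
// TYPE_GYROSCOPE (e.g. with SENSOR_DELAY_GAME).
public class FusedOrientationListener implements SensorEventListener {
    private static final float ALPHA = 0.98f;                 // weight given to the gyro
    private static final float NS2S = 1.0f / 1000000000.0f;   // nanoseconds -> seconds

    private final float[] accel = new float[3];
    private final float[] magnet = new float[3];
    private final float[] rotationMatrix = new float[9];
    private final float[] accMagOrientation = new float[3];   // yaw, pitch, roll from acc + mag
    private final float[] fusedOrientation = new float[3];    // filter output, radians
    private long lastGyroTimestamp = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                System.arraycopy(event.values, 0, accel, 0, 3);
                updateAccMagOrientation();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                System.arraycopy(event.values, 0, magnet, 0, 3);
                updateAccMagOrientation();
                break;
            case Sensor.TYPE_GYROSCOPE:
                integrateGyro(event);
                break;
        }
    }

    // Absolute but noisy orientation from accelerometer + compass.
    private void updateAccMagOrientation() {
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
            SensorManager.getOrientation(rotationMatrix, accMagOrientation);
        }
    }

    // Integrate the gyro rates and blend them with the acc/mag angles.
    private void integrateGyro(SensorEvent event) {
        if (lastGyroTimestamp != 0) {
            float dt = (event.timestamp - lastGyroTimestamp) * NS2S;
            // event.values are angular speeds around x, y, z in rad/s.
            // Naive per-axis blend; a proper solution would fuse rotation
            // matrices or quaternions and handle the wrap-around at +/- pi.
            fusedOrientation[0] = ALPHA * (fusedOrientation[0] + event.values[2] * dt)
                    + (1 - ALPHA) * accMagOrientation[0];  // yaw:   gyro z + compass
            fusedOrientation[1] = ALPHA * (fusedOrientation[1] + event.values[0] * dt)
                    + (1 - ALPHA) * accMagOrientation[1];  // pitch: gyro x + accelerometer
            fusedOrientation[2] = ALPHA * (fusedOrientation[2] + event.values[1] * dt)
                    + (1 - ALPHA) * accMagOrientation[2];  // roll:  gyro y + accelerometer
        }
        lastGyroTimestamp = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}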

I see an issue has been opened on this, but there is not much
activity:
http://code.google.com/p/android/issues/detail?id=17780

Please help me with this! It would be nice to make developing apps
using the gyroscope (more) feasible on the Android platform!

Thanks
