ok, sorry, my bad again (it's getting late :-)). I messed up hexadecimal
and binary notation here.

anyway, I'm stuck. from the code above, you can tell that packet[1]
holds the most significant part of the X coordinate value. when I put
my pen in the top left corner, this byte contains a decimal 1; in the
bottom left corner, a decimal 1 again. but somewhere in the middle it
gives me back a 2. in the top right corner it's a 6, whereas in the
bottom right corner it's a 5.

how can one ever convert such results into decent pointer coordinates?

I'm guessing something is missing, or going wrong, at the device
initialisation. any hints from anyone?

-- 
no calibration tool
https://bugs.launchpad.net/bugs/227183
You received this bug notification because you are a member of Ubuntu
Bugs, which is subscribed to Ubuntu.
