Chase wrote> "You have to put five or more fingers down on a trackpad to
start sending events that aren't caught by Unity or X synaptics."

My understanding now: Unity is a window manager (more or less). It is
also part of the stack that processes touch events.  It does gesture
recognition and hides touch events from the application (possibly
converting sequences of touch events, i.e. gestures, into other event
types) unless five or more touches are used.
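
(To check this from the application side, something like the sketch
below can help: a bare Qt widget that opts into touch events and logs
every touch point it receives, so you can see which touches actually
survive the trip down the stack.  It assumes the Qt 4.6-era QTouchEvent
API; the class name is made up for illustration.)

// Sketch, not a fix: a widget that asks Qt for touch events and prints
// whatever arrives.  Assumes Qt 4.6+ (QTouchEvent, WA_AcceptTouchEvents).
#include <QApplication>
#include <QWidget>
#include <QTouchEvent>
#include <QDebug>

class TouchProbe : public QWidget
{
public:
    TouchProbe()
    {
        // Without this attribute Qt only delivers synthesized mouse events.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *ev)
    {
        switch (ev->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd: {
            QTouchEvent *touch = static_cast<QTouchEvent *>(ev);
            foreach (const QTouchEvent::TouchPoint &p, touch->touchPoints())
                qDebug() << "touch" << p.id() << "at" << p.pos();
            return true;            // accept the whole touch sequence
        }
        default:
            return QWidget::event(ev);
        }
    }
};

int main(int argc, char **argv)
{
    QApplication app(argc, argv);
    TouchProbe w;
    w.show();
    return app.exec();
}

If the description above is right, I would expect this to print nothing
until the fifth finger goes down.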

Is there any way to disable Unity's gesture recognition (so that all
touch events go to the application), similar to the way you can disable
gesture recognition in the synaptics Xorg input driver?
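
(For comparison, the synaptics driver exposes its behaviour as X input
device properties, so disabling its gesture handling is something a
client can do at runtime; normally with synclient or xinput, but the
sketch below does the same through libXi.  The device name and the
"Synaptics Gestures" property, which as I understand it covers the
driver's tap-and-drag gesture, with scrolling and the like having their
own properties, are assumptions that depend on the installed driver and
hardware.)

/* Sketch only: clear the synaptics driver's "Synaptics Gestures"
 * device property over XInput2.  The device name below is an assumption;
 * check `xinput list` for yours.  Build: g++ gestures-off.cpp -lX11 -lXi */
#include <X11/Xlib.h>
#include <X11/Xatom.h>
#include <X11/extensions/XInput2.h>
#include <cstdio>
#include <cstring>

int main()
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    int ndevices = 0;
    XIDeviceInfo *devices = XIQueryDevice(dpy, XIAllDevices, &ndevices);

    for (int i = 0; i < ndevices; ++i) {
        // "SynPS/2 Synaptics TouchPad" is just a common example name.
        if (strcmp(devices[i].name, "SynPS/2 Synaptics TouchPad") != 0)
            continue;

        Atom prop = XInternAtom(dpy, "Synaptics Gestures", True);
        if (prop != None) {
            unsigned char off = 0;            // 0 = gesture handling off
            XIChangeProperty(dpy, devices[i].deviceid, prop, XA_INTEGER,
                             8, PropModeReplace, &off, 1);
            printf("gestures disabled on %s\n", devices[i].name);
        }
    }

    XIFreeDeviceInfo(devices);
    XCloseDisplay(dpy);
    return 0;
}

Nothing like that seems to be exposed for Unity's recognizer, at least
not that I can find, which is why I'm asking.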

The stack: kernel device driver (e.g. wacom) > Xorg input driver (e.g.
synaptics) > window manager (e.g. Unity) > GUI toolkit (e.g. Qt) >
application

Shouldn't there be a set of protocols for querying and disabling gesture
recognition throughout the entire stack?  An API in the GUI toolkit that
an app uses to configure touch?  I suppose that is what you are
architecting now.  I understand it is not easy to do, both in terms of
coordinating all the software pieces and in terms of providing a
consistent user interface across platforms.
