On 3/06/2014 20:25, Shawn Rutledge wrote:
On 3 June 2014 01:25, Peter Hutterer <peter.hutte...@who-t.net> wrote:
On Mon, Jun 02, 2014 at 12:45:51PM +0100, José Expósito wrote:
Hi Peter,

I have checked the libinput implementation and, correct me if I'm wrong, I
have seen that a two-finger click is interpreted as a right click, a
three-finger click is interpreted as a middle click, and there are some
special rules for specific trackpads, like corner clicks.

there are some special rules for clickpads, specifically a click with a
finger resting on one of the software-button areas will produce a right
or middle click.

Does that mean that the other MT events are not sent to the clients? Would
it be possible to get a two-finger pinch gesture from a QML client, for
example?

not from a touchpad, not at this point. There are some rough plans, but we've
pretty much deferred them until we have the basics sorted with libinput.

Qt Quick was designed to take touch points directly and do its own
gesture interpretation. But we know that we need to support gesture
events too, for OS X. So it will be OK if pinching in Wayland is a
gesture event rather than two touch points, but we really do need to
have one or the other approach working. It's unfortunate if a lot of
time goes by in which neither way works. (Caveat: I've had a lot of
trouble getting a qtwayland compositor working well enough to use as my
main environment, although I'd really like to, so I'm not up to date on
what works and what doesn't at the moment.)
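
For reference, here is a rough sketch (not from any of our repositories;
the class name is invented and a plain QWidget is used for brevity) of the
two delivery models from an application's point of view: either it
receives raw touch points and derives the pinch itself, which is what
Qt Quick items like PinchArea do internally, or it receives a
platform-recognized zoom gesture, which is how OS X delivers it.

#include <QApplication>
#include <QWidget>
#include <QTouchEvent>
#include <QLineF>
#include <QDebug>

// Sketch only: "PinchProbe" is an invented name.  One widget, two possible
// inputs -- raw touch points from which we derive a pinch scale ourselves
// (the Qt Quick way), or a ready-made zoom gesture from the platform (the
// OS X way).  Which one actually arrives depends on the windowing system.
class PinchProbe : public QWidget
{
public:
    PinchProbe() { setAttribute(Qt::WA_AcceptTouchEvents); }

protected:
    bool event(QEvent *e) override
    {
        switch (e->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate: {
            // Path 1: raw touch points, gesture interpretation done here.
            const QList<QTouchEvent::TouchPoint> pts =
                    static_cast<QTouchEvent *>(e)->touchPoints();
            if (pts.count() == 2) {
                const QLineF now(pts.at(0).pos(), pts.at(1).pos());
                const QLineF start(pts.at(0).startPos(), pts.at(1).startPos());
                if (start.length() > 0)
                    qDebug() << "pinch scale from touch points:"
                             << now.length() / start.length();
            }
            return true;
        }
        case QEvent::NativeGesture: {
            // Path 2: the platform already recognized the gesture for us.
            const QNativeGestureEvent *g = static_cast<QNativeGestureEvent *>(e);
            if (g->gestureType() == Qt::ZoomNativeGesture)
                qDebug() << "zoom delta from native gesture:" << g->value();
            return true;
        }
        default:
            return QWidget::event(e);
        }
    }
};

int main(int argc, char **argv)
{
    QApplication app(argc, argv);
    PinchProbe w;
    w.resize(400, 300);
    w.show();
    return app.exec();
}

If I remember right, QNativeGestureEvent only appeared around Qt 5.2, so
the second path assumes a reasonably recent Qt.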

Also in X11 I do not have multi-touch interaction with the trackpad on
my Thinkpad Helix. I suppose that's because the synaptics driver is not
going to provide touch events, since it can only interpret a fixed set
of gestures. The upside is that I can flick even in rxvt; the downside
is that I can't do pinch gestures anywhere, because the X11 protocol
definition process is so slow that seven years after the iPhone
introduced pinching, we still don't have a pinch event. At some point I
was testing Qt Quick with the plain evdev driver and an Apple Bluetooth
touchpad, which did provide the actual touch points. It was a better
experience for Qt Quick and a worse one for everything else.

the synaptics driver does support multitouch and gives you the same type of events as any MT device will (if you disable the in-driver gestures). It has done so for about 2 years now; no-one ever cared enough about it to implement the client stack so this could actually work. Here's the thing about the X protocol: it's not this magical self-aware thing, it's written by people. If no-one works on it, it doesn't change, which is pretty much why it updates so slowly.
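
fwiw, the client-side half of that stack is just the XI 2.2 touch API. a
rough sketch (my assumptions: an XI 2.2 capable server, and a driver that
actually delivers touches, i.e. synaptics with the in-driver gestures
disabled or plain evdev) looks roughly like this:

#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>
#include <cstdio>

// minimal XI 2.2 touch listener -- the "client stack" part: select for raw
// touch events on a window and print the touch points as they arrive.
// build with something like: g++ xi-touch.cpp -lX11 -lXi
int main()
{
    Display *dpy = XOpenDisplay(nullptr);
    if (!dpy)
        return 1;

    int xi_opcode, ev_base, err_base;
    if (!XQueryExtension(dpy, "XInputExtension", &xi_opcode, &ev_base, &err_base))
        return 1;

    int major = 2, minor = 2;   // ask for XI 2.2; fails if the server has no touch support
    if (XIQueryVersion(dpy, &major, &minor) != Success)
        return 1;

    Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy), 0, 0, 400, 300,
                                     0, 0, WhitePixel(dpy, DefaultScreen(dpy)));
    XMapWindow(dpy, win);

    unsigned char m[XIMaskLen(XI_LASTEVENT)] = { 0 };
    XISetMask(m, XI_TouchBegin);
    XISetMask(m, XI_TouchUpdate);
    XISetMask(m, XI_TouchEnd);

    XIEventMask mask;
    mask.deviceid = XIAllMasterDevices;
    mask.mask_len = sizeof(m);
    mask.mask = m;
    XISelectEvents(dpy, win, &mask, 1);
    XFlush(dpy);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        XGenericEventCookie *c = &ev.xcookie;
        if (XGetEventData(dpy, c) && c->type == GenericEvent &&
            c->extension == xi_opcode) {
            XIDeviceEvent *d = static_cast<XIDeviceEvent *>(c->data);
            // d->detail is the touch id, so multiple fingers can be told apart
            std::printf("touch %d at %.1f,%.1f (evtype %d)\n",
                        d->detail, d->event_x, d->event_y, c->evtype);
        }
        XFreeEventData(dpy, c);
    }
}

none of this is new, touch has been in the server since 1.12; it just needs
someone to wire it up on the toolkit side.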

So here's a request: write down what exactly you need, what the use cases are, how you want it to behave, etc. That way we can actually implement something useful. It's not that we're not listening, it's more that no-one is talking until it's too late.

We do need to have a good strategy for how this stuff is going to work
better in the future.  That's one purpose for the touch & gestures
session at the upcoming Qt Contributors Summit:
https://qt-project.org/groups/qt-contributors-summit-2014/wiki/Program
although I would be glad to delve deeper into X11 and Wayland
specifics beyond that session.  It would be good if any of you who
know the details could attend.

Flicking is a weird case because Qt Quick does its own physics: the
flicking continues after you release your finger, and there is the
bounce-back at the end. On Apple platforms the Qt Quick behavior doesn't
match the native one, so there are discussions about how to fix that.
Are you thinking that on Wayland the flicking should be driven by extra
events beyond the actual finger release, which keep driving the UI to
the end and then send reversed events to generate the bounce-back?

I think the main reason for having a flick gesture at all is to enable
flicking in legacy applications which were designed to handle the mouse
wheel. The trouble is that there then has to be a mechanism to tell
whatever is generating those events where the "end" is, for non-legacy
applications which actually want to have the "bounce" or some other
end-of-flick behavior. IMO that's an unfortunate break in encapsulation;
but if the applications do their own flick physics instead, they are
free to do it differently and inconsistently.

Same thing with other gestures. It would be nice to put the gesture
recognition and related behavioral stuff into a library, so that it's
modular and optional and can be replaced with an alternate one, and yet
if the same library is used everywhere, then it's consistent. Putting
this stuff at too low a level (like inside the synaptics driver) tends
to mean that the gestures will be a fixed set, whereas it would be nice
to be able to invent new ones.
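
To make the "own physics" point concrete, this is the rough shape of what
a toolkit does after the finger is released (illustrative only, not Qt
Quick's actual code, and all of the constants are made up): keep
integrating a decaying velocity, and once the content overshoots its end,
let a damped spring pull it back.

#include <cstdio>

// Illustrative flick simulation (made-up constants, not Qt Quick's code):
// after release the content keeps moving with a decaying velocity, and a
// damped spring produces the bounce-back once it overshoots the end.
int main()
{
    double pos = 0.0;                  // content position (px)
    double velocity = 2400.0;          // velocity at finger release (px/s)
    const double contentEnd = 1000.0;  // where the content "ends"
    const double friction = 1500.0;    // deceleration while flicking (px/s^2)
    const double springK = 300.0;      // bounce-back stiffness (1/s^2)
    const double damping = 10.0;       // bounce-back damping (1/s)
    const double dt = 1.0 / 60.0;      // one 60 Hz frame

    for (int frame = 0; frame < 240; ++frame) {
        if (pos > contentEnd) {
            // Overshot: a damped spring pulls the content back toward the end.
            const double accel = -springK * (pos - contentEnd) - damping * velocity;
            velocity += accel * dt;
        } else {
            // Normal flick: constant deceleration until the velocity dies out.
            const double dv = friction * dt;
            velocity = velocity > dv ? velocity - dv : 0.0;
        }
        pos += velocity * dt;
        if (frame % 12 == 0)
            std::printf("t=%.2fs  pos=%.1f  v=%.1f\n", frame * dt, pos, velocity);
    }
    return 0;
}

The question above is really about which side of the protocol this little
simulation should live on.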

.... and you've just arrived at your favourite holiday destination. on your left you can see the rock ("I can't change anything!"), on your right the hard place ("Everyone does it differently and nothing behaves the same!"). The cooking class starts at 5 and we've got shuffleboard on the top deck.

Cheers,
  Peter


(Not that there is any framework which makes it easy, yet...) I think
it's unfortunate if there is no way to get the actual touch points. It
would be an acceptable compromise if the shared gesture library can get
them, and applications can get them only by explicitly asking for them
and bypassing the gesture library. Then at least everyone knows of a
couple of accessible places to do the hacking to add new ones or tweak
the existing ones, rather than having to hack the things that are fixed
for most users, such as device drivers and compositors.
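
To illustrate what I mean by that compromise, something like the following
purely hypothetical interface is the kind of thing I have in mind (nothing
here exists, every name is invented): recognizers are pluggable and
replaceable, and an application that wants the raw points just asks for
them and bypasses the recognizers.

#include <QPointF>
#include <QVector>
#include <functional>

// Purely hypothetical sketch of a shared, replaceable gesture library.
// The library consumes raw touch points; recognizers are pluggable; an
// application that wants the raw stream installs no recognizer and reads
// the points directly.
struct TouchPoint {
    int id;
    QPointF pos;
    enum State { Pressed, Moved, Released } state;
};

class GestureRecognizer {
public:
    virtual ~GestureRecognizer() = default;
    // Return true if the recognizer consumed the points for this frame.
    virtual bool feed(const QVector<TouchPoint> &points) = 0;
};

class GestureStream {
public:
    // The compositor (or toolkit) pushes raw touch frames in here.
    void pushFrame(const QVector<TouchPoint> &points)
    {
        for (GestureRecognizer *r : recognizers)
            if (r->feed(points))
                return;                  // a recognizer claimed this frame
        if (rawHandler)
            rawHandler(points);          // bypass: deliver the raw points
    }

    void addRecognizer(GestureRecognizer *r) { recognizers.append(r); }
    void setRawHandler(std::function<void(const QVector<TouchPoint> &)> h)
    { rawHandler = std::move(h); }

private:
    QVector<GestureRecognizer *> recognizers;
    std::function<void(const QVector<TouchPoint> &)> rawHandler;
};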

Wayland (and Qt on Wayland) should end up being more hackable than
Cocoa, and offer the same or better feature set, not limp along like
X11 has been.
