Re: Weston multitouch support?

2014-06-09 Thread Shawn Rutledge
On 3 June 2014 01:25, Peter Hutterer peter.hutte...@who-t.net wrote:
 On Mon, Jun 02, 2014 at 12:45:51PM +0100, José Expósito wrote:
 Hi Peter,

 I have checked the libinput implementation and, correct me if I'm wrong, I
 have seen that a 2-finger click is interpreted as a right click, a 3-finger
 click is interpreted as a middle click, and there are some special rules for
 specific trackpads, like corner clicks.

 there are some special rules for clickpads, specifically a click with a
 finger resting on one of the software-button areas will produce a right
 or middle click.

 Does that mean that the other MT events are not sent to the clients? Could
 it be possible to get the 2-finger pinch gesture from a QML client, for
 example?

 not from a touchpad, not at this point. There are some rough plans but we've
 pretty much deferred them until we had the basics sorted with libinput.

Qt Quick was designed to take touch points directly and do its own
gesture interpretation.  But we know that we need to support gesture
events too, for OS X.  So it will be OK if pinching in Wayland is a
gesture event rather than two touchpoints, but we really do need to
have one or the other approach working.  It's unfortunate if a lot of
time goes by in which neither way works.  (Caveat: I've had a lot of
trouble getting a qtwayland compositor working well enough to use as
my main environment, although I'd really like to, so I'm not
up-to-date on what works and what doesn't at the moment.)
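
A rough QML sketch of the two approaches (untested; it assumes the
compositor forwards either raw touch points or a native pinch gesture, and
the sizes and values are made up):

    import QtQuick 2.0

    Rectangle {
        width: 400; height: 400

        // Approach 1: raw touch points, only useful if the compositor
        // forwards wl_touch-style events to the client.
        MultiPointTouchArea {
            anchors.fill: parent
            touchPoints: [ TouchPoint { id: p1 }, TouchPoint { id: p2 } ]
        }

        // Approach 2: a gesture event, as delivered natively on OS X or
        // synthesized by the toolkit from the touch points.
        PinchArea {
            anchors.fill: parent
            pinch.target: content
            pinch.minimumScale: 0.5
            pinch.maximumScale: 4.0
        }

        Rectangle { id: content; width: 100; height: 100; color: "steelblue" }
    }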

Also in X11 I do not have multi-touch interaction with the trackpad on
my Thinkpad Helix.  I suppose it's because the synaptics driver is not
going to provide touch events, because it can only interpret a fixed
set of gestures.  The upside is that I can flick even in rxvt; the
downside is I can't do pinch gestures anywhere, because X11 protocol
definition is such a slow process that 7 years after the iPhone
introduced pinching, we still don't have a pinch event.  At some point
I was testing Qt Quick with the plain evdev driver and an Apple
Bluetooth touchpad, which used to provide the actual touch points.  It
was a better experience for Qt Quick and a worse one for everything
else.

We do need to have a good strategy for how this stuff is going to work
better in the future.  That's one purpose for the touch & gestures
session at the upcoming Qt Contributors Summit:
https://qt-project.org/groups/qt-contributors-summit-2014/wiki/Program
although I would be glad to delve deeper into X11 and Wayland
specifics beyond that session.  It would be good if any of you who
know the details could attend.

Flicking is a weird case because Qt Quick does its own physics: the
flicking continues after you release your finger, and there is the
bounce-back at the end.  On Apple platforms the QtQuick behavior
doesn't match the native one, so there are discussions about how to
fix that.  Are you thinking that on wayland the flicking should be
driven by extra events beyond the actual finger release, which keep
driving the UI to the end and then sending reversed events to generate
the bounce-back?  I think the main reason for having a flick gesture
at all is to enable flicking in legacy applications which were
designed to handle mouse wheel.  The trouble is that there then has to
be a mechanism to tell it where the end is, for non-legacy
applications which actually want to have the bounce or some other
end-of-flick behavior.  IMO that's an unfortunate break in
encapsulation; but if the applications alternatively do their own
flick physics, they are free to do it differently and inconsistently.
Same thing with other gestures.  It would be nice to put the gesture
and related behavioral stuff into a library, so that it's modular and
optional and can be replaced with an alternate one, and yet if the
same library is used everywhere, then it's consistent.  Putting this
stuff at too low a level (like inside the synaptics driver) tends to
mean that the gestures will be a fixed set, whereas it would be nice
to be able to invent new ones.  (Not that there is any framework which
makes it easy, yet...)  I think it's unfortunate if there is no way to
get the actual touch points.  It would be an acceptable compromise if
the shared gesture library can get them, and applications can get them
only by explicitly asking for them and bypassing the gesture library.
Then at least everyone knows of a couple of accessible places to do
the hacking to add new ones or tweak the existing ones, rather than
having to hack the things that are fixed for most users, such as
device drivers and compositors.
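
To make the client-side physics concrete, here is a rough Flickable sketch
(standard Qt Quick properties; the tuning numbers are made up) in which the
post-release motion and the bounce-back are entirely the client's doing:

    import QtQuick 2.0

    Flickable {
        width: 400; height: 400
        contentWidth: 1200; contentHeight: 1200

        // Everything below happens after the finger is released, driven
        // by Qt Quick itself rather than by extra events from the server:
        flickDeceleration: 1500                           // made-up tuning
        maximumFlickVelocity: 2500                        // made-up tuning
        boundsBehavior: Flickable.DragAndOvershootBounds  // the bounce-back

        Rectangle { width: 1200; height: 1200; color: "darkslategray" }
    }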

Wayland (and Qt on Wayland) should end up being more hackable than
Cocoa, and offer the same or better feature set, not limp along like
X11 has been.


Re: Weston multitouch support?

2014-06-09 Thread Shawn Rutledge
On 3 June 2014 13:16, Peter Hutterer peter.hutte...@who-t.net wrote:
 On 3/06/2014 20:25, Shawn Rutledge wrote:
...
 the synaptics driver does support multitouch and gives you the same type of
 events as any MT device will (if you disable the in-driver gestures). It has
 done so for about 2 years now, no-one ever cared enough about it to
 implement the client stack so this could actually work.

But is disabling in-driver gestures a global thing or can it be done
only for specific windows?  (Even doing it per-window is not quite an
ideal solution but could work some of the time)

 Here's the thing
 about the X protocol: it's not this magical self-aware thing, it's written
 by people. If no-one works on it, it doesn't change, which is pretty much
 why it updates so slowly.

 So here's a request: write down what exactly you need, what the use-cases
 are, how you want it to behave, etc. That way we can actually implement
 something useful. It's not that we're not listening, it's more that no-one
 is talking until it's too late.

OK, I can try.  In what form and forum would be most helpful?

 Flicking is a weird case because Qt Quick does its own physics: the
 flicking continues after you release your finger, and there is the
 bounce-back at the end.  On Apple platforms the QtQuick behavior
 doesn't match the native one, so there are discussions about how to
 fix that.  Are you thinking that on wayland the flicking should be
 driven by extra events beyond the actual finger release, which keep
 driving the UI to the end and then sending reversed events to generate
 the bounce-back?  I think the main reason for having a flick gesture
 at all is to enable flicking in legacy applications which were
 designed to handle mouse wheel.  The trouble is that there then has to
 be a mechanism to tell it where the end is, for non-legacy
 applications which actually want to have the bounce or some other
 end-of-flick behavior.  IMO that's an unfortunate break in
 encapsulation; but if the applications alternatively do their own
 flick physics, they are free to do it differently and inconsistently.
 Same thing with other gestures.  It would be nice to put the gesture
 and related behavioral stuff into a library, so that it's modular and
 optional and can be replaced with an alternate one, and yet if the
 same library is used everywhere, then it's consistent.  Putting this
 stuff at too low a level (like inside the synaptics driver) tends to
 mean that the gestures will be a fixed set, whereas it would be nice
 to be able to invent new ones.


  and you've just arrived at your favourite holiday destination. on your
 left you can see the rock (I can't change anything!), on your right the
 hard place (Everyone does it differently and nothing behaves the same!).
 The cooking class starts at 5 and we've got shuffleboard on the top deck.

But I think a suitable degree of modularity might solve it.  It seems
in the Wayland spirit, just like the debate about window decorations:
if you want common ones, use a shared library.  If you want to
decorate your own window, that's easy too.  As long as most
applications agree to use the same shared library with the same theme,
unless they have a real reason not to, then the whole desktop
experience will end up being just as consistent as in X11 when the
window manager decorates all the windows the same, but with the
advantage that some of the X11 mess goes away.

But maybe you are going to say libinput is that library.  If the
architecture is that you can have multiple compositors and each one
can use a different modified version of libinput, that sounds kind of
hackable, but it still might end up mingling device handling and
gesture recognition and the related physics a bit too much.


Re: Weston multitouch support?

2014-06-09 Thread Jasper St. Pierre
Have you looked at the wl_touch protocol in the core interface at all? That
provides multiple raw touch points, which seems to be what you're after. I
don't know if the qtwayland compositor supports it yet, but Weston and GNOME do.
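
A quick way to check from a QML client (untested sketch): if wl_touch events
actually reach the client, this logs a line per touch update.

    import QtQuick 2.0

    MultiPointTouchArea {
        width: 400; height: 400
        maximumTouchPoints: 5
        // Fires only when the compositor forwards raw touch points.
        onUpdated: {
            for (var i = 0; i < touchPoints.length; ++i)
                console.log("touch", touchPoints[i].pointId,
                            touchPoints[i].x, touchPoints[i].y)
        }
    }
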
On Jun 9, 2014 4:24 AM, Shawn Rutledge shawn.t.rutle...@gmail.com wrote:

 On 3 June 2014 13:16, Peter Hutterer peter.hutte...@who-t.net wrote:
  On 3/06/2014 20:25 , Shawn Rutledge wrote:
 ...
  the synaptics driver does support multitouch and gives you the same type of
  events as any MT device will (if you disable the in-driver gestures). It has
  done so for about 2 years now, no-one ever cared enough about it to
  implement the client stack so this could actually work.

 But is disabling in-driver gestures a global thing or can it be done
 only for specific windows?  (Even doing it per-window is not quite an
 ideal solution but could work some of the time)

  Here's the thing
  about the X protocol: it's not this magical self-aware thing, it's written
  by people. If no-one works on it, it doesn't change, which is pretty much
  why it updates so slowly.
 
  So here's a request: write down what exactly you need, what the use-cases
  are, how you want it to behave, etc. That way we can actually implement
  something useful. It's not that we're not listening, it's more that no-one
  is talking until it's too late.

 OK, I can try.  In what form and forum would be most helpful?

  Flicking is a weird case because Qt Quick does its own physics: the
  flicking continues after you release your finger, and there is the
  bounce-back at the end.  On Apple platforms the QtQuick behavior
  doesn't match the native one, so there are discussions about how to
  fix that.  Are you thinking that on wayland the flicking should be
  driven by extra events beyond the actual finger release, which keep
  driving the UI to the end and then sending reversed events to generate
  the bounce-back?  I think the main reason for having a flick gesture
  at all is to enable flicking in legacy applications which were
  designed to handle mouse wheel.  The trouble is that there then has to
  be a mechanism to tell it where the end is, for non-legacy
  applications which actually want to have the bounce or some other
  end-of-flick behavior.  IMO that's an unfortunate break in
  encapsulation; but if the applications alternatively do their own
  flick physics, they are free to do it differently and inconsistently.
  Same thing with other gestures.  It would be nice to put the gesture
  and related behavioral stuff into a library, so that it's modular and
  optional and can be replaced with an alternate one, and yet if the
  same library is used everywhere, then it's consistent.  Putting this
  stuff at too low a level (like inside the synaptics driver) tends to
  mean that the gestures will be a fixed set, whereas it would be nice
  to be able to invent new ones.
 
 
   and you've just arrived at your favourite holiday destination. on your
  left you can see the rock (I can't change anything!), on your right the
  hard place (Everyone does it differently and nothing behaves the same!).
  The cooking class starts at 5 and we've got shuffleboard on the top deck.

 But I think a suitable degree of modularity might solve it.  It seems
 in the wayland spirit, just like the debate about window decorations:
 if you want common ones, use a shared library.  If you want to
 decorate your own window, that's easy too.  As long as most
 applications agree to use the same shared library with the same theme,
 unless they have a real reason not to, then the whole desktop
 experience will end up being just as consistent as in X11 when the
 window manager decorates all the windows the same, but with the
 advantage that some of the X11 mess goes away.

 But maybe you are going to say libinput is that library.  If the
 architecture is that you can have multiple compositors and each one
 can use a different modified version of libinput, that sounds kind of
 hackable, but it still might end up mingling device handling and
 gesture recognition and the related physics a bit too much.


Re: Weston multitouch support?

2014-06-03 Thread José Expósito
Hi Peter,

Thank you very much for your answers

 there are some special rules for clickpads, specifically a click with a
 finger resting on one of the software-button areas will produce a right
 or middle click.
 [...]
 eventually yes, but not at this point. as I said in the previous email you
 just won't have access to the data. I think a sensible solution here is to
 have libinput send semantic events like pinch, rotate, etc. and then
 have the compositor hook into those. the actual compositor part would be
 quite small and have no actual gesture recognition, that would be done
 inside libinput. but we're just not there yet.

It's a pity that the implementation plans are these... I mean, with this
approach the clients will not be able to implement features like smooth
scrolling, and the compositor will not be able to manage system gesture
recognition or transform the behaviour of the trackpad to, for example,
ignore touches with the thumb (while it rests on the bottom of the
clickpad) and use it to click while a drag is performed with the index
finger. It also won't be possible to port apps like Touchegg or
BetterTouchTool (OS X).

Please don't misunderstand me, you guys are doing *excellent* work with
Wayland and libinput; I would only like to point out that implementing this
stuff in the clients, frameworks (Qt/GTK) and/or compositors could add some
amazing features missing at the moment from the Linux desktop but present in
compositors like SurfaceFlinger or the OS X compositor.

Maybe a flag to receive the touchpad input raw or processed could be a good
solution for everyone.

 it's fairly new and the documentation hasn't been updated yet. configure
 weston with --enable-libinput-backend and that should get you started.

Thank you very much, I'm going to recompile Weston this afternoon to have a
look at the libinput implementation.


2014-06-03 0:25 GMT+01:00 Peter Hutterer peter.hutte...@who-t.net:

 On Mon, Jun 02, 2014 at 12:45:51PM +0100, José Expósito wrote:
  Hi Peter,
 
  I have checked the libinput implementation and, correct me if I'm wrong, I
  have seen that a 2-finger click is interpreted as a right click, a 3-finger
  click is interpreted as a middle click, and there are some special rules for
  specific trackpads, like corner clicks.

 there are some special rules for clickpads, specifically a click with a
 finger resting on one of the software-button areas will produce a right
 or middle click.

  Does that mean that the other MT events are not sent to the clients? Could
  it be possible to get the 2-finger pinch gesture from a QML client, for
  example?

 not from a touchpad, not at this point. There are some rough plans but
 we've
 pretty much deferred them until we had the basics sorted with libinput.

  So mainly my question is: is it possible to port Touchegg
  (https://code.google.com/p/touchegg/) as a wayland compositor, for example
  to manage desktop-specific gestures, and still use client gestures like
  pinch and zoom?

 eventually yes, but not at this point. as I said in the previous email you
 just won't have access to the data. I think a sensible solution here is to
 have libinput send semantic events like pinch, rotate, etc. and then
 have the compositor hook into those. the actual compositor part would be
 quite small and have no actual gesture recognition, that would be done
 inside libinput. but we're just not there yet.

  By the way, I compiled Wayland/Weston as specified here:
  http://wayland.freedesktop.org/building.html
 
  And QtWayland as specified here:
  http://wayland.freedesktop.org/qt5.html
 
  But I don't see any references to the forked libinput library. Does that
  mean that I should compile libinput and recompile Wayland/Weston against
  this library instead of the system one?
 
  I'm sorry for all the questions, but I didn't find any documentation about
  that.

 it's fairly new and the documentation hasn't been updated yet. configure
 weston with --enable-libinput-backend and that should get you started.

 Cheers,
Peter

  2014-06-02 4:30 GMT+01:00 Peter Hutterer peter.hutte...@who-t.net:
 
   On Sun, Jun 01, 2014 at 11:38:02PM +0100, José Expósito wrote:
Hi Daniel,
   
I'm asking because I'm the author of this tool:
https://code.google.com/p/touchegg/
   
That is exactly what you mention but for X11. So I'd like to port it to
Wayland if it is possible of course.
   
 The intention was to reserve trackpad
 gestures for a gesture interpreter
 which lives in the compositor and is
 properly integrated with, e.g., scrolling
 and tap-to-click.
   
 Does this mean that it is possible to get multi-touch gestures in the
 compositor at the moment?
 Will or is it possible to use both approaches? I mean, get system gestures
 in the compositor and app-specific gestures in the clients, like in OS X.
  
   the input stack in weston atm is that you get touch events from a
    direct-touch MT device raw and unprocessed (save for mapping), but for
    touchpads some input events are interpreted by the stack (libinput or
    evdev-touchpad.c) and then passed on as pointer events, you don't see the
    MT bits of those.

Re: Weston multitouch support?

2014-06-03 Thread Peter Hutterer

On 3/06/2014 19:43, José Expósito wrote:

Hi Peter,

Thank you very much for your answers

  there are some special rules for clickpads, specifically a click with a
  finger resting on one of the software-button areas will produce a right
  or middle click.
  [...]
  eventually yes, but not at this point. as I said in the previous email you
  just won't have access to the data. I think a sensible solution here is to
  have libinput send semantic events like pinch, rotate, etc. and then
  have the compositor hook into those. the actual compositor part would be
  quite small and have no actual gesture recognition, that would be done
  inside libinput. but we're just not there yet.

It's a pity that the implementation plans are these... I mean, with this
approach the clients will not be able to implement features like smooth
scrolling, and the compositor will not be able to manage system gesture
recognition or transform the behaviour of the trackpad to, for example,
ignore touches with the thumb (while it rests on the bottom of the
clickpad) and use it to click while a drag is performed with the index
finger.


scroll events in libinput/wayland have a value, they're not just button 
presses like in X. if you want to implement smooth scrolling on the 
client side that can be done already.
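
For example, a QML client can already consume the pixel-valued events
directly (rough sketch; the fallback factor for notched wheels is a guess):

    import QtQuick 2.0

    Rectangle {
        width: 400; height: 400
        clip: true

        Rectangle { id: content; width: 400; height: 1200; color: "lightsteelblue" }

        MouseArea {
            anchors.fill: parent
            onWheel: {
                // pixelDelta carries the smooth per-pixel scroll value when
                // the input stack provides one; angleDelta is the classic
                // notched wheel (120 units per notch; 20 px is made up).
                var dy = wheel.pixelDelta.y !== 0
                       ? wheel.pixelDelta.y
                       : wheel.angleDelta.y / 120 * 20
                content.y = Math.max(400 - content.height,
                                     Math.min(0, content.y + dy))
            }
        }
    }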


ignoring touches with the thumb while ... - that's pretty much what 
we're doing already in libinput.


system gestures: the whole point of libinput is to provide an input 
stack for wayland compositors so we don't have to implement this 
multiple times. If we need system gestures in the compositor, we'll 
implement them in libinput.


Cheers,
  Peter


It also won't be possible to port apps like Touchegg or
BetterTouchTool (OS X).

Please don't misunderstand me, you guys are doing *excellent* work
with Wayland and libinput; I would only like to point out that implementing
this stuff in the clients, frameworks (Qt/GTK) and/or compositors could
add some amazing features missing at the moment from the Linux desktop but
present in compositors like SurfaceFlinger or the OS X compositor.

Maybe a flag to receive the touchpad input raw or processed could be a
good solution for everyone.

  it's fairly new and the documentation hasn't been updated yet. configure
  weston with --enable-libinput-backend and that should get you started.

Thank you very much, I'm going to recompile Weston this afternoon to
have a look at the libinput implementation.


2014-06-03 0:25 GMT+01:00 Peter Hutterer peter.hutte...@who-t.net:

On Mon, Jun 02, 2014 at 12:45:51PM +0100, José Expósito wrote:
  Hi Peter,
 
  I have checked the libinput implementation and, correct me if I'm wrong, I
  have seen that a 2-finger click is interpreted as a right click, a 3-finger
  click is interpreted as a middle click, and there are some special rules for
  specific trackpads, like corner clicks.

there are some special rules for clickpads, specifically a click with a
finger resting on one of the software-button areas will produce a right
or middle click.

  Does that mean that the other MT events are not sent to the clients? Could
  it be possible to get the 2-finger pinch gesture from a QML client, for
  example?

not from a touchpad, not at this point. There are some rough plans
but we've
pretty much deferred them until we had the basics sorted with libinput.

  So mainly my question is: is it possible to port Touchegg
  (https://code.google.com/p/touchegg/) as a wayland compositor, for example
  to manage desktop-specific gestures, and still use client gestures like
  pinch and zoom?

eventually yes, but not at this point. as I said in the previous email you
just won't have access to the data. I think a sensible solution here is to
have libinput send semantic events like pinch, rotate, etc. and then
have the compositor hook into those. the actual compositor part would be
quite small and have no actual gesture recognition, that would be done
inside libinput. but we're just not there yet.

  By the way, I compiled Wayland/Weston as specified here:
  http://wayland.freedesktop.org/building.html
 
  And QtWayland as specified here:
  http://wayland.freedesktop.org/qt5.html
 
  But I don't see any references to the forked libinput library. Does that
  mean that I should compile libinput and recompile Wayland/Weston against
  this library instead of the system one?

  I'm sorry for all the questions, but I didn't find any documentation about
  that.

it's fairly new and the documentation hasn't been updated yet. configure
weston with --enable-libinput-backend and that should get you started.

Cheers,
Peter

  2014-06-02 4:30 GMT+01:00 Peter Hutterer peter.hutte...@who-t.net:
 
   On Sun, Jun 

Re: Weston multitouch support?

2014-06-03 Thread Peter Hutterer

On 3/06/2014 20:25, Shawn Rutledge wrote:

On 3 June 2014 01:25, Peter Hutterer peter.hutte...@who-t.net wrote:

On Mon, Jun 02, 2014 at 12:45:51PM +0100, José Expósito wrote:

Hi Peter,

I have checked the libinput implementation and, correct me if I'm wrong, I
have seen that a 2-finger click is interpreted as a right click, a 3-finger
click is interpreted as a middle click, and there are some special rules for
specific trackpads, like corner clicks.


there are some special rules for clickpads, specifically a click with a
finger resting on one of the software-button areas will produce a right
or middle click.


Does that mean that the other MT events are not sent to the clients? Could
it be possible to get the 2-finger pinch gesture from a QML client, for
example?


not from a touchpad, not at this point. There are some rough plans but we've
pretty much deferred them until we had the basics sorted with libinput.


Qt Quick was designed to take touch points directly and do its own
gesture interpretation.  But we know that we need to support gesture
events too, for OS X.  So it will be OK if pinching in Wayland is a
gesture event rather than two touchpoints, but we really do need to
have one or the other approach working.  It's unfortunate if a lot of
time goes by in which neither way works.  (Caveat: I've had a lot of
trouble getting a qtwayland compositor working well enough to use as
my main environment, although I'd really like to, so I'm not
up-to-date on what works and what doesn't at the moment.)

Also in X11 I do not have multi-touch interaction with the trackpad on
my Thinkpad Helix.  I suppose it's because the synaptics driver is not
going to provide touch events, because it can only interpret a fixed
set of gestures.  The upside is that I can flick even in rxvt; the
downside is I can't do pinch gestures anywhere, because X11 protocol
definition is such a slow process that 7 years after the iPhone
introduced pinching, we still don't have a pinch event.  At some point
I was testing Qt Quick with the plain evdev driver and an Apple
Bluetooth touchpad, which used to provide the actual touch points.  It
was a better experience for Qt Quick and a worse one for everything
else.


the synaptics driver does support multitouch and gives you the same type 
of events as any MT device will (if you disable the in-driver gestures). 
It has done so for about 2 years now, no-one ever cared enough about it 
to implement the client stack so this could actually work. Here's the 
thing about the X protocol: it's not this magical self-aware thing, it's 
written by people. If no-one works on it, it doesn't change, which is 
pretty much why it updates so slowly.


So here's a request: write down what exactly you need, what the 
use-cases are, how you want it to behave, etc. That way we can actually 
implement something useful. It's not that we're not listening, it's more 
that no-one is talking until it's too late.



We do need to have a good strategy for how this stuff is going to work
better in the future.  That's one purpose for the touch & gestures
session at the upcoming Qt Contributors Summit:
https://qt-project.org/groups/qt-contributors-summit-2014/wiki/Program
although I would be glad to delve deeper into X11 and Wayland
specifics beyond that session.  It would be good if any of you who
know the details could attend.

Flicking is a weird case because Qt Quick does its own physics: the
flicking continues after you release your finger, and there is the
bounce-back at the end.  On Apple platforms the QtQuick behavior
doesn't match the native one, so there are discussions about how to
fix that.  Are you thinking that on wayland the flicking should be
driven by extra events beyond the actual finger release, which keep
driving the UI to the end and then sending reversed events to generate
the bounce-back?  I think the main reason for having a flick gesture
at all is to enable flicking in legacy applications which were
designed to handle mouse wheel.  The trouble is that there then has to
be a mechanism to tell it where the end is, for non-legacy
applications which actually want to have the bounce or some other
end-of-flick behavior.  IMO that's an unfortunate break in
encapsulation; but if the applications alternatively do their own
flick physics, they are free to do it differently and inconsistently.
Same thing with other gestures.  It would be nice to put the gesture
and related behavioral stuff into a library, so that it's modular and
optional and can be replaced with an alternate one, and yet if the
same library is used everywhere, then it's consistent.  Putting this
stuff at too low a level (like inside the synaptics driver) tends to
mean that the gestures will be a fixed set, whereas it would be nice
to be able to invent new ones.


 and you've just arrived at your favourite holiday destination. on 
your left you can see the rock (I can't change anything!), on your 
right the hard place (Everyone does it differently and nothing behaves the
same!). The cooking class starts at 5 and we've got shuffleboard on the top
deck.

Re: Weston multitouch support?

2014-06-03 Thread Peter Hutterer
On Tue, Jun 03, 2014 at 02:13:47PM +0200, Shawn Rutledge wrote:
 On 3 June 2014 13:16, Peter Hutterer peter.hutte...@who-t.net wrote:
 On 3/06/2014 20:25, Shawn Rutledge wrote:
 ...
  the synaptics driver does support multitouch and gives you the same type of
  events as any MT device will (if you disable the in-driver gestures). It has
  done so for about 2 years now, no-one ever cared enough about it to
  implement the client stack so this could actually work.
 
 But is disabling in-driver gestures a global thing or can it be done
 only for specific windows?  (Even doing it per-window is not quite an
 ideal solution but could work some of the time)

it's a per-device thing, so effectively global. The driver has no knowledge
of windows, clients or anything like that. So to get this working properly
you'd have to integrate it into the toolkit, and then only use applications
from that toolkit. X having a history of everyone using whatever toolkit
they like (or none at all), this means that you can't really get a meaningful,
coherent desktop experience here.

  Here's the thing
  about the X protocol: it's not this magical self-aware thing, it's written
  by people. If no-one works on it, it doesn't change, which is pretty much
  why it updates so slowly.
 
  So here's a request: write down what exactly you need, what the use-cases
  are, how you want it to behave, etc. That way we can actually implement
  something useful. It's not that we're not listening, it's more that no-one
  is talking until it's too late.
 
 OK, I can try.  In what form and forum would be most helpful?

just here will do, or email me directly if you just want to have some rough
discussion first.

  Flicking is a weird case because Qt Quick does its own physics: the
  flicking continues after you release your finger, and there is the
  bounce-back at the end.  On Apple platforms the QtQuick behavior
  doesn't match the native one, so there are discussions about how to
  fix that.  Are you thinking that on wayland the flicking should be
  driven by extra events beyond the actual finger release, which keep
  driving the UI to the end and then sending reversed events to generate
  the bounce-back?  I think the main reason for having a flick gesture
  at all is to enable flicking in legacy applications which were
  designed to handle mouse wheel.  The trouble is that there then has to
  be a mechanism to tell it where the end is, for non-legacy
  applications which actually want to have the bounce or some other
  end-of-flick behavior.  IMO that's an unfortunate break in
  encapsulation; but if the applications alternatively do their own
  flick physics, they are free to do it differently and inconsistently.
  Same thing with other gestures.  It would be nice to put the gesture
  and related behavioral stuff into a library, so that it's modular and
  optional and can be replaced with an alternate one, and yet if the
  same library is used everywhere, then it's consistent.  Putting this
  stuff at too low a level (like inside the synaptics driver) tends to
  mean that the gestures will be a fixed set, whereas it would be nice
  to be able to invent new ones.
 
 
   and you've just arrived at your favourite holiday destination. on your
  left you can see the rock (I can't change anything!), on your right the
  hard place (Everyone does it differently and nothing behaves the same!).
  The cooking class starts at 5 and we've got shuffleboard on the top deck.
 
 But I think a suitable degree of modularity might solve it.  It seems
 in the wayland spirit, just like the debate about window decorations:
 if you want common ones, use a shared library.  If you want to
 decorate your own window, that's easy too.  As long as most
 applications agree to use the same shared library with the same theme,
 unless they have a real reason not to, then the whole desktop
 experience will end up being just as consistent as in X11 when the
 window manager decorates all the windows the same, but with the
 advantage that some of the X11 mess goes away.

yes, but do realise that X comes from a time when this was not realistic.
The presence of half a million window managers is living proof.
Recent years and the emphasis on this by Wayland have hopefully changed the
political landscape enough that we can expect more now, or at least ignore
those that want to go off and do their own thing.

 But maybe you are going to say libinput is that library.  If the
 architecture is that you can have multiple compositors and each one
 can use a different modified version of libinput, that sounds kind of
 hackable, but it still might end up mingling device handling and
 gesture recognition and the related physics a bit too much.

I'm saying that ideally all compositors use libinput for the input stack,
without the need to hack around too much of it. libinput is supposed to be
the one size fits all, even if in reality this will be one size doesn't
quite fit anybody but at least it 

Re: Weston multitouch support?

2014-06-02 Thread José Expósito
Hi Peter,

I have checked the libinput implementation and, correct me if I'm wrong, I
have seen that a 2-finger click is interpreted as a right click, a 3-finger
click is interpreted as a middle click, and there are some special rules for
specific trackpads, like corner clicks.

Does that mean that the other MT events are not sent to the clients? Could
it be possible to get the 2-finger pinch gesture from a QML client, for
example?
So mainly my question is: is it possible to port Touchegg
(https://code.google.com/p/touchegg/) as a wayland compositor, for example
to manage desktop-specific gestures, and still use client gestures like
pinch and zoom?

By the way, I compiled Wayland/Weston as specified here:
http://wayland.freedesktop.org/building.html

And QtWayland as specified here:
http://wayland.freedesktop.org/qt5.html

But I don't see any references to the forked libinput library. Does that
mean that I should compile libinput and recompile Wayland/Weston against
this library instead of the system one?

I'm sorry for all the questions, but I didn't find any documentation about
that.


2014-06-02 4:30 GMT+01:00 Peter Hutterer peter.hutte...@who-t.net:

 On Sun, Jun 01, 2014 at 11:38:02PM +0100, José Expósito wrote:
  Hi Daniel,
 
  I'm asking because I'm the author of this tool:
  https://code.google.com/p/touchegg/
 
  That is exactly what you mention but for X11. So I'd like to port it to
  Wayland if it is possible of course.
 
   The intention was to reserve trackpad
   gestures for a gesture interpreter
   which lives in the compositor and is
   properly integrated with, e.g., scrolling
   and tap-to-click.
 
  Does this mean that it is possible to get multi-touch gestures in the
  compositor at the moment?
  Will or is it possible to use both approaches? I mean, get system gestures
  in the compositor and app-specific gestures in the clients, like in OS X.

 the input stack in weston atm is that you get touch events from a
 direct-touch MT device raw and unprocessed (save for mapping), but for
 touchpads some input events are interpreted by the stack (libinput or
 evdev-touchpad.c) and then passed on as pointer events, you don't see the MT
 bits of those.

 Cheers,
Peter


  Thank you very much!
   On 01/06/2014 23:24, Daniel Stone dan...@fooishbar.org wrote:
 
   Hi,
  
  
   On 1 June 2014 02:03, José Expósito jose.exposit...@gmail.com wrote:
  
   And I say more or less because it is necessary to put 3 fingers on the
   trackpad to start moving the rectangles...
   Anyway, the program is not working on Weston. My question is, is that
   because Weston doesn't implement multitouch support or because Wayland
   doesn't support it at the moment? Could it be possible to implement
   multitouch support in a custom compositor?
  
  
   Wayland doesn't (currently) support touchpad gestures for arbitrary
   clients; trying to do it for X11 uncovered a whole host of really subtle
   and annoying issues. The intention was to reserve trackpad gestures for a
   gesture interpreter which lives in the compositor and is properly
   integrated with, e.g., scrolling and tap-to-click.
  
   Can I ask if you had a specific use case in mind?
  
   Cheers,
   Daniel
  



Re: Weston multitouch support?

2014-06-02 Thread Peter Hutterer
On Mon, Jun 02, 2014 at 12:45:51PM +0100, José Expósito wrote:
 Hi Peter,
 
 I have checked the libinput implementation and, correct me if I'm wrong, I
 have seen that a 2-finger click is interpreted as a right click, a 3-finger
 click is interpreted as a middle click, and there are some special rules for
 specific trackpads, like corner clicks.

there are some special rules for clickpads, specifically a click with a
finger resting on one of the software-button areas will produce a right
or middle click.

 Does that mean that the other MT events are not sent to the clients? Could
 it be possible to get the 2-finger pinch gesture from a QML client, for
 example?

not from a touchpad, not at this point. There are some rough plans but we've
pretty much deferred them until we had the basics sorted with libinput.

 So mainly my question is: is it possible to port Touchegg
 (https://code.google.com/p/touchegg/) as a wayland compositor, for example
 to manage desktop-specific gestures, and still use client gestures like
 pinch and zoom?

eventually yes, but not at this point. as I said in the previous email you
just won't have access to the data. I think a sensible solution here is to
have libinput send semantic events like pinch, rotate, etc. and then
have the compositor hook into those. the actual compositor part would be
quite small and have no actual gesture recognition, that would be done
inside libinput. but we're just not there yet.

 By the way, I compiled Wayland/Weston as specified here:
 http://wayland.freedesktop.org/building.html
 
 And QtWayland as specified here:
 http://wayland.freedesktop.org/qt5.html
 
 But I don't see any references to the forked libinput library. Does that
 mean that I should compile libinput and recompile Wayland/Weston against
 this library instead of the system one?
 
 I'm sorry for all the questions, but I didn't find any documentation about
 that.

it's fairly new and the documentation hasn't been updated yet. configure
weston with --enable-libinput-backend and that should get you started.

Cheers,
   Peter

 2014-06-02 4:30 GMT+01:00 Peter Hutterer peter.hutte...@who-t.net:
 
  On Sun, Jun 01, 2014 at 11:38:02PM +0100, José Expósito wrote:
   Hi Daniel,
  
   I'm asking because I'm the author of this tool:
   https://code.google.com/p/touchegg/
  
   That is exactly what you mention but for X11. So I'd like to port it to
   Wayland if it is possible of course.
  
The intention was to reserve trackpad
gestures for a gesture interpreter
which lives in the compositor and is
properly integrated with, e.g., scrolling
and tap-to-click.
  
    Does this mean that it is possible to get multi-touch gestures in the
    compositor at the moment?
    Will or is it possible to use both approaches? I mean, get system gestures
    in the compositor and app-specific gestures in the clients, like in OS X.
 
  the input stack in weston atm is that you get touch events from a
  direct-touch MT device raw and unprocessed (save for mapping), but for
  touchpads some input events are interpreted by the stack (libinput or
  evdev-touchpad.c) and then passed on as pointer events, you don't see the MT
  bits of those.
 
  Cheers,
 Peter
 
 
   Thank you very much!
 On 01/06/2014 23:24, Daniel Stone dan...@fooishbar.org wrote:
  
Hi,
   
   
On 1 June 2014 02:03, José Expósito jose.exposit...@gmail.com wrote:
   
And I say more or less because it is necessary to put 3 fingers on the
trackpad to start moving the rectangles...
Anyway, the program is not working on Weston. My question is, is that
because Weston doesn't implement multitouch support or because Wayland
doesn't support it at the moment? Could it be possible to implement
multitouch support in a custom compositor?
   
   
Wayland doesn't (currently) support touchpad gestures for arbitrary
 clients; trying to do it for X11 uncovered a whole host of really subtle
 and annoying issues. The intention was to reserve trackpad gestures for a
 gesture interpreter which lives in the compositor and is properly
integrated with, e.g., scrolling and tap-to-click.
   
 Can I ask if you had a specific use case in mind?
   
Cheers,
Daniel
   
 


Re: Weston multitouch support?

2014-06-01 Thread Daniel Stone
Hi,


On 1 June 2014 02:03, José Expósito jose.exposit...@gmail.com wrote:

 And I say more or less because it is necessary to put 3 fingers on the
 trackpad to start moving the rectangles...
 Anyway, the program is not working on Weston. My question is, is that
 because Weston doesn't implement multitouch support or because Wayland
 doesn't support it at the moment? Could it be possible to implement
 multitouch support in a custom compositor?


Wayland doesn't (currently) support touchpad gestures for arbitrary
clients; trying to do it for X11 uncovered a whole host of really subtle
and annoying issues. The intention was to reserve trackpad gestures for a
gesture interpreter which lives in the compositor and is properly
integrated with, e.g., scrolling and tap-to-click.

Can I ask if you had a specific use case in mind?

Cheers,
Daniel


Re: Weston multitouch support?

2014-06-01 Thread José Expósito
Hi Daniel,

I'm asking because I'm the author of this tool:
https://code.google.com/p/touchegg/

That is exactly what you mention but for X11. So I'd like to port it to
Wayland if it is possible of course.

 The intention was to reserve trackpad
 gestures for a gesture interpreter
 which lives in the compositor and is
 properly integrated with, e.g., scrolling
 and tap-to-click.

Does this mean that it is possible to get multi-touch gestures in the
compositor at the moment?
Will or is it possible to use both approaches? I mean, get system gestures in
the compositor and app-specific gestures in the clients, like in OS X.

Thank you very much!
 On 01/06/2014 23:24, Daniel Stone dan...@fooishbar.org wrote:

 Hi,


 On 1 June 2014 02:03, José Expósito jose.exposit...@gmail.com wrote:

 And I say more or less because it is necessary to put 3 fingers on the
 trackpad to start moving the rectangles...
 Anyway, the program is not working on Weston. My question is, is that
 because Weston doesn't implement multitouch support or because Wayland
 doesn't support it at the moment? Could it be possible to implement
 multitouch support in a custom compositor?


 Wayland doesn't (currently) support touchpad gestures for arbitrary
 clients; trying to do it for X11 uncovered a whole host of really subtle
 and annoying issues. The intention was to reserve trackpad gestures for a
 gesture interpreter which lives in the compositor and is properly
 integrated with, e.g., scrolling and tap-to-click.

 Can I ask if you had a specific use case in mind?

 Cheers,
 Daniel



Re: Weston multitouch support?

2014-06-01 Thread Peter Hutterer
On Sun, Jun 01, 2014 at 11:38:02PM +0100, José Expósito wrote:
 Hi Daniel,
 
 I'm asking because I'm the author of this tool:
 https://code.google.com/p/touchegg/
 
 That is exactly what you mention but for X11. So I'd like to port it to
 Wayland if it is possible of course.
 
  The intention was to reserve trackpad
  gestures for a gesture interpreter
  which lives in the compositor and is
  properly integrated with, e.g., scrolling
  and tap-to-click.
 
 Does this mean that it is possible to get multi-touch gestures in the
 compositor at the moment?
 Will or is it possible to use both approaches? I mean, get system gestures in
 the compositor and app-specific gestures in the clients, like in OS X.

the input stack in weston atm is that you get touch events from a
direct-touch MT device raw and unprocessed (save for mapping), but for
touchpads some input events are interpreted by the stack (libinput or
evdev-touchpad.c) and then passed on as pointer events, you don't see the MT
bits of those.
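
in QML terms the practical consequence looks roughly like this (untested
sketch, and the exact delivery rules depend on the toolkit): the touch area
only ever fires for a direct-touch device, while a touchpad surfaces as
pointer events in the mouse area.

    import QtQuick 2.0

    Rectangle {
        width: 400; height: 400

        // Fires for a touchscreen: those events pass through raw.
        MultiPointTouchArea {
            anchors.fill: parent
            mouseEnabled: false   // keep synthesized mouse events out
            onPressed: console.log("raw MT event (direct-touch device)")
        }

        // Fires for a touchpad: the stack has already turned the touches
        // into pointer events before the client sees anything.
        MouseArea {
            anchors.fill: parent
            onPressed: console.log("pointer event (interpreted touchpad)")
        }
    }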

Cheers,
   Peter


 Thank you very much!
  On 01/06/2014 23:24, Daniel Stone dan...@fooishbar.org wrote:
 
  Hi,
 
 
  On 1 June 2014 02:03, José Expósito jose.exposit...@gmail.com wrote:
 
  And I say more or less because it is necessary to put 3 fingers on the
  trackpad to start moving the rectangles...
  Anyway, the program is not working on Weston. My question is, is that
  because Weston doesn't implement multitouch support or because Wayland
  doesn't support it at the moment? Could it be possible to implement
  multitouch support in a custom compositor?
 
 
  Wayland doesn't (currently) support touchpad gestures for arbitrary
  clients; trying to do it for X11 uncovered a whole host of really subtle
  and annoying issues. The intention was to reserve trackpad gestures for a
  gesture interpreter which lives in the compositor and is properly
  integrated with, e.g., scrolling and tap-to-click.
 
  Can I ask if you had a specific use case in mind?
 
  Cheers,
  Daniel
 



Re: Weston multitouch support?

2014-05-31 Thread Boyan Ding
Which backend are you running on?

On Sun, 2014-06-01 at 02:03 +0100, José Expósito wrote:
 Hi all,
 
 
 I'm running a very simple QML multitouch example using
 MultiPointTouchArea, that works (more or less) on X11, here is the
 code:
 
 
 Rectangle {
     width: 400; height: 400
     MultiPointTouchArea {
         anchors.fill: parent
         touchPoints: [ TouchPoint { id: point1 }, TouchPoint { id: point2 } ]
     }
     Rectangle { width: 30; height: 30; color: "green";  x: point1.x; y: point1.y }
     Rectangle { width: 30; height: 30; color: "yellow"; x: point2.x; y: point2.y }
 }
 
 
 And I say more or less because it is necessary to put 3 fingers on the
 trackpad to start moving the rectangles...
 Anyway, the program is not working on Weston. My question is, is that
 because Weston doesn't implement multitouch support or because Wayland
 doesn't support it at the moment? Could it be possible to implement
 multitouch support in a custom compositor?
 
 
 This is my system information:
 MacBook Air 2011 (clickpad)
 Qt 5.3.0
 Latest QtWayland source code (1/Jun/2014)
 Weston 1.5.90
 
 


Re: Weston multitouch support?

2014-05-31 Thread José Expósito
Thank you for the quick answer.
It happens if I run Weston as X client (weston command) or using
weston-launch.

I'm not sure how I can check the backend...
On 01/06/2014 02:50, Boyan Ding stu_...@126.com wrote:

 Which backend are you running on?

 On Sun, 2014-06-01 at 02:03 +0100, José Expósito wrote:
  Hi all,
 
 
  I'm running a very simple QML multitouch example using
  MultiPointTouchArea, that works (more or less) on X11, here is the
  code:
 
 
   Rectangle {
       width: 400; height: 400
       MultiPointTouchArea {
           anchors.fill: parent
           touchPoints: [ TouchPoint { id: point1 }, TouchPoint { id: point2 } ]
       }
       Rectangle { width: 30; height: 30; color: "green";  x: point1.x; y: point1.y }
       Rectangle { width: 30; height: 30; color: "yellow"; x: point2.x; y: point2.y }
   }
 
 
  And I say more or less because it is necessary to put 3 fingers on the
  trackpad to start moving the rectangles...
  Anyway, the program is not working on Weston. My question is, is that
  because Weston doesn't implement multitouch support or because Wayland
  doesn't support it at the moment? Could it be possible to implement
  multitouch support in a custom compositor?
 
 
  This is my system information:
   MacBook Air 2011 (clickpad)
  Qt 5.3.0
  Latest QtWayland source code (1/Jun/2014)
  Weston 1.5.90
 
 


Re: Weston multitouch support?

2014-05-31 Thread Boyan Ding
If you're using weston-launch there may be a problem somewhere -- I'm
not an expert in that. But I know there are some backends (e.g. the nested
wayland backend) which don't support touch at all at present.

On Sun, 2014-06-01 at 03:24 +0100, José Expósito wrote:
 Thank you for the quick answer.
 It happens if I run Weston as X client (weston command) or using
 weston-launch.
 
 I'm not sure how I can check the backend...
 
 On 01/06/2014 02:50, Boyan Ding stu_...@126.com wrote:
 Which backend are you running on?
 
 On Sun, 2014-06-01 at 02:03 +0100, José Expósito wrote:
  Hi all,
 
 
  I'm running a very simple QML multitouch example using
  MultiPointTouchArea, that works (more or less) on X11, here
 is the
  code:
 
 
  Rectangle {
      width: 400; height: 400
      MultiPointTouchArea {
          anchors.fill: parent
          touchPoints: [ TouchPoint { id: point1 }, TouchPoint { id: point2 } ]
      }
      Rectangle { width: 30; height: 30; color: "green";  x: point1.x; y: point1.y }
      Rectangle { width: 30; height: 30; color: "yellow"; x: point2.x; y: point2.y }
  }
 
 
  And I say more or less because it is necessary to put 3 fingers on the
  trackpad to start moving the rectangles...
  Anyway, the program is not working on Weston. My question is, is that
  because Weston doesn't implement multitouch support or because Wayland
  doesn't support it at the moment? Could it be possible to implement
  multitouch support in a custom compositor?
 
 
  This is my system information:
  MacBook Air 2011 (clickpad)
  Qt 5.3.0
  Latest QtWayland source code (1/Jun/2014)
  Weston 1.5.90
 
 


Re: Weston multitouch support?

2014-05-31 Thread José Expósito
It is happening with both weston and weston-launch. Do you know if the
multi-touch input should work? Is it a problem in my configuration?
On 01/06/2014 03:38, Boyan Ding stu_...@126.com wrote:

 If you're using weston-launch there may be a problem somewhere -- I'm
 not an expert in that. But I know there are some backends (e.g. the nested
 wayland backend) which don't support touch at all at present.

 On Sun, 2014-06-01 at 03:24 +0100, José Expósito wrote:
  Thank you for the quick answer.
  It happens if I run Weston as X client (weston command) or using
  weston-launch.
 
  I'm not sure how I can check the backend...
 
  On 01/06/2014 02:50, Boyan Ding stu_...@126.com wrote:
  Which backend are you running on?
 
  On Sun, 2014-06-01 at 02:03 +0100, José Expósito wrote:
   Hi all,
  
  
   I'm running a very simple QML multitouch example using
   MultiPointTouchArea, that works (more or less) on X11, here
  is the
   code:
  
  
    Rectangle {
        width: 400; height: 400
        MultiPointTouchArea {
            anchors.fill: parent
            touchPoints: [ TouchPoint { id: point1 }, TouchPoint { id: point2 } ]
        }
        Rectangle { width: 30; height: 30; color: "green";  x: point1.x; y: point1.y }
        Rectangle { width: 30; height: 30; color: "yellow"; x: point2.x; y: point2.y }
    }
  
  
    And I say more or less because it is necessary to put 3 fingers on the
    trackpad to start moving the rectangles...
    Anyway, the program is not working on Weston. My question is, is that
    because Weston doesn't implement multitouch support or because Wayland
    doesn't support it at the moment? Could it be possible to implement
    multitouch support in a custom compositor?
  
  
   This is my system information:
    MacBook Air 2011 (clickpad)
   Qt 5.3.0
   Latest QtWayland source code (1/Jun/2014)
   Weston 1.5.90
  
  