On Thu, Aug 15, 2013 at 02:48:48PM -0700, Kristian Høgsberg wrote:
> On Thu, Aug 15, 2013 at 12:10:30PM +0100, Daniel Stone wrote:
> > Hi,
> >
> > On 15 August 2013 11:52, Peter Hutterer <peter.hutte...@who-t.net> wrote:
> > > one of the things that should be done is to figure out _where_ features
> > > such as this are going to be handled. In the compositor, the compositor's
> > > input module, on the client side, ...? I'm trying to figure out how to
> > > handle this correctly, but don't have much to show here just yet.
> > >
> > > For example, wl_pointer only has a button event, which means that a
> > > client cannot differentiate between a tap and a button click. No doubt
> > > this should be in a piece of shared code, but right now it's not quite
> > > clear what can be shared where yet.
> >
> > ... does it need to?
>
> I don't think we do... I think that from a client point of view, a
> touchpad/clickpad looks exactly like a wl_pointer. Gestures such as
> two-finger scrolling, double-tap-drag, two-finger clicks, interpreting
> clicks in different areas as different buttons, etc. can all be done in
> the compositor and sent as either wl_pointer button and motion events or
> axis events (scroll, pinch, rotate).
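
As a rough illustration of that compositor-side translation: an input module
that has already recognized a two-finger scroll and a tap could forward them
to the focused client as ordinary wl_pointer events along these lines. The
wl_pointer_send_axis()/wl_pointer_send_button() calls are the generated
wayland-server protocol senders; the seat struct and the notify_* entry
points are made up for illustration, not taken from any existing compositor.

/*
 * Hypothetical compositor-side forwarding: the touchpad gesture has
 * already been recognized by the input module; the client only ever
 * sees a plain wl_pointer.
 */
#include <wayland-server.h>
#include <linux/input.h>   /* BTN_LEFT */

struct seat {
	struct wl_display *display;
	struct wl_resource *focused_pointer; /* wl_pointer resource of focused client */
};

static void
notify_two_finger_scroll(struct seat *seat, uint32_t time_msec, double dy)
{
	if (!seat->focused_pointer)
		return;

	/* Two-finger motion becomes a plain vertical scroll axis event. */
	wl_pointer_send_axis(seat->focused_pointer, time_msec,
			     WL_POINTER_AXIS_VERTICAL_SCROLL,
			     wl_fixed_from_double(dy));
}

static void
notify_tap(struct seat *seat, uint32_t time_msec)
{
	if (!seat->focused_pointer)
		return;

	/* A tap is reported as a left-button press and release; the client
	 * cannot (and does not need to) tell it apart from a physical click. */
	uint32_t serial = wl_display_next_serial(seat->display);
	wl_pointer_send_button(seat->focused_pointer, serial, time_msec,
			       BTN_LEFT, WL_POINTER_BUTTON_STATE_PRESSED);

	serial = wl_display_next_serial(seat->display);
	wl_pointer_send_button(seat->focused_pointer, serial, time_msec,
			       BTN_LEFT, WL_POINTER_BUTTON_STATE_RELEASED);
}

Nothing in that sketch tells the client a touchpad was involved; it just
receives axis and button events as from any mouse.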
then you need gestures on the protocol. Tapping is relatively
uncomplicated, and scrolling is somewhere along the same lines (until you
consider that swipe and two-finger scroll overlap and are
context-dependent). Pinch and rotate require either the raw touch point
data for re-interpretation, or at least the delta positions, possibly the
pressure of the touchpoints, etc. (a rough sketch of that computation
follows below). Then you need the number of touchpoints, and you need to
tell the client that beforehand so it knows whether it can expect a
four-finger gesture or not.

Cheers,
  Peter
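
For reference, the kind of computation pinch/rotate recognition consumes,
and why per-touchpoint positions (or at least deltas) have to reach whoever
does the recognition: the scale factor and rotation angle fall out of the
vector between the touchpoints, which a merged wl_pointer motion stream
does not carry. The helper names below are hypothetical and assume exactly
two touchpoints.

#include <math.h>

struct touch_point {
	double x, y;
};

/* Scale factor of a two-finger pinch: ratio of current to previous
 * finger separation; > 1.0 means the fingers moved apart. */
static double
pinch_scale(const struct touch_point prev[2], const struct touch_point cur[2])
{
	double d_prev = hypot(prev[1].x - prev[0].x, prev[1].y - prev[0].y);
	double d_cur  = hypot(cur[1].x - cur[0].x,  cur[1].y - cur[0].y);
	return d_prev > 0.0 ? d_cur / d_prev : 1.0;
}

/* Rotation (radians) of the line through the two fingers since the
 * previous sample; positive is counter-clockwise. Wrap-around of the
 * angle difference is not handled in this sketch. */
static double
pinch_rotation(const struct touch_point prev[2], const struct touch_point cur[2])
{
	double a_prev = atan2(prev[1].y - prev[0].y, prev[1].x - prev[0].x);
	double a_cur  = atan2(cur[1].y - cur[0].y,  cur[1].x - cur[0].x);
	return a_cur - a_prev;
}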