Re: [RFC DRAFT] graphics tablet protocol extension
Hi Peter

On Wed, Oct 2, 2013 at 11:13 PM, Peter Hutterer peter.hutte...@who-t.net wrote:
> On Wed, Oct 02, 2013 at 05:44:29PM +0200, David Herrmann wrote:
>> Hi Peter
>>
>> On Fri, Sep 20, 2013 at 12:35 PM, Peter Hutterer
>>> diff --git a/protocol/wayland.xml b/protocol/wayland.xml
>>> index aeb0412..8d10746 100644
>>> --- a/protocol/wayland.xml
>>> +++ b/protocol/wayland.xml
>>> @@ -1235,7 +1235,7 @@
>>>      </request>
>>>    </interface>
>>>
>>> -  <interface name="wl_seat" version="3">
>>> +  <interface name="wl_seat" version="4">
>>>      <description summary="group of input devices">
>>>        A seat is a group of keyboards, pointer and touch devices. This
>>>        object is published as a global during start up, or when such a
>>> @@ -1251,6 +1251,7 @@
>>>        <entry name="pointer" value="1" summary="The seat has pointer devices"/>
>>>        <entry name="keyboard" value="2" summary="The seat has one or more keyboards"/>
>>>        <entry name="touch" value="4" summary="The seat has touch devices"/>
>>> +      <entry name="tablets" value="8" summary="The seat has one or more graphics tablet devices" since="4"/>
>>
>> What's actually the reason to allow multiple graphics tablets per seat? I
>> thought wl_seat objects represent a single user interacting with your
>> desktop. So if you have two tablets, why not force them to be in two
>> different wl_seat objects? Is there ever a reason to have multiple
>> tablets in a single seat? What would the use-case be?
>
> my laptop has a built-in wacom tablet (lenovo x220t), but I also have an
> Intuos4 plugged in. depending on the use-case I use either, but they
> should be in the same seat. two wl_seat objects means two different
> keyboard foci (only one of which actually works if you have only one
> keyboard), etc. I also know of professional setups that require two
> tablets or more. ideally there'd be just one generic wl_tablet per seat,
> same as wl_pointer, but the tablets differ too much to make that useable.
> Hence the multiple wl_tablet objects.

Yepp, merging them might not be as easy as with keyboards/mice. Point
taken.

>> We could even go further and put tablets into their own seats. Always.
>> A pointer and keyboard may be used by a single user at the same time.
>> But for a graphics tablet that doesn't sound like a legitimate use-case,
>> does it? But maybe I just have a different view of wl_seat.. don't know.
>> I somehow have the feeling we never really agreed on how to define
>> wl_seat objects. Currently they are just user-defined groups of devices
>> with some specific policies (only one keyboard-object per seat).
>
> my understanding of wl_seat is one group of input devices. that would
> usually map to one user. it's already not ideal for bimanual input, but
> then again that's even more of a corner case than multiple seats. once
> you start moving devices out into other seats, you're in trouble. a seat
> defines keyboard events, so anything that has an interaction with the
> keyboard can't easily be unpaired. a tablet is a fancy pointer device,
> users expect to click somewhere and have it work as a mouse, i.e. to move
> the keyboard focus to wherever you just clicked. moving it into a
> different seat means you'd have to switch to a mouse to change the
> keyboard focus. or, if you have two wl_seats you now need to pair them
> somehow to have the right focus behaviour. and that is going to be
> horrible.

Ok, so wl_seat is just about input focus. I had a much stricter view of
it, but obviously that makes it hard to control the focus of seats without
mice/keyboards. So yeah, agreed.

>>>      </enum>
>>>
>>>      <event name="capabilities">
>>> @@ -1306,6 +1307,19 @@
>>>        <arg name="name" type="string"/>
>>>      </event>
>>>
>>> +    <!-- Version 4 additions -->
>>> +    <request name="get_tablet_manager" since="4">
>>> +      <description summary="return tablet manager object">
>>> +        The ID provided will be initialized to the wl_tablet_manager
>>> +        interface for this seat. This can then be used to retrieve the
>>> +        objects representing the actual tablet devices.
>>> +
>>> +        This request only takes effect if the seat has the tablets
>>> +        capability.
>>> +      </description>
>>> +      <arg name="id" type="new_id" interface="wl_tablet_manager"/>
>>> +    </request>
>>> +
>>>    </interface>
>>>
>>>    <interface name="wl_pointer" version="3">
>>> @@ -1617,6 +1631,223 @@
>>>      </event>
>>>    </interface>
>>>
>>> +  <interface name="wl_tablet_manager" version="1">
>>> +    <description summary="controller object for graphic tablet devices">
>>> +      A tablet manager object provides requests to access the graphics
>>> +      tablets available on this system.
>>> +    </description>
>>> +
>>> +    <enum name="tablet_type">
>>> +      <description summary="tablet type">
>>> +        Describes the type of tablet.
>>> +      </description>
>>> +      <entry name="external" value="0" summary="The tablet is an external tablet, such as an Intuos"/>
>>> +      <entry name="internal" value="1" summary="The tablet is a built-in tablet, usually in a laptop"/>
>>> +      <entry name="display" value="2" summary="The tablet is a display tablet, such as a Cintiq"/>
>>> +    </enum>
>>> +
Re: [RFC DRAFT] graphics tablet protocol extension
Hi Peter

On Fri, Sep 20, 2013 at 12:35 PM, Peter Hutterer peter.hutte...@who-t.net wrote:
> I've been working on a protocol extension to support graphics tablets
> such as the Wacom set of tablets, and I'm now at the stage where I'd like
> a few comments. I was hoping that I'd get a full implementation before
> XDC but unfortunately that didn't happen, so for now I'll just show the
> protocol. I've got a PoC implementation, but it's missing a few too many
> pieces to be actually usable just yet. Any feedback appreciated. This is
> a relatively early stage, so there are still many changes expected.
>
> The goal is to make it possible to access graphics tablets. One important
> thing to note is that this interface does _not_ cover the touch ability
> of some tablets. This should go through wl_touch (for touchscreen-like
> tablets) or wl_pointer (for external touchpad-like tablets).
>
> There are a few notable differences to the wl_pointer interface:
> * tablets have a tool type that matters: it lets applications such as
>   the GIMP select paint tools based on the physical tool. those tools
>   also often have HW serial numbers to uniquely identify them.
> * extra axes matter: pressure, tilt, distance - all influence the stylus
>   behaviour
> * more than one device may be present, so it's important to have access
>   to all devices on a one-by-one basis, unlike wl_pointer, where we just
>   have one virtual pointer.
> * proximity matters since we can leave proximity from directly above a
>   surface. the pointer can't do that, it moves from one surface to the
>   next. so in some ways it's closer to wl_touch in that regard.
>
> Some design notes:
> * generally most axes change at the same time, hence the choice to send
>   a wl_array instead of separate events.
> * x/y would have to be adjusted relative to the surface, but technically
>   the same would have to be done to e.g. distance on a true 3D desktop.
> * not sure at all about the relative events or whether there's a need
>   for them. iirc only some styli have REL_*WHEEL, do we need something
>   else? Ping, Jason?
> * I don't have a specific touch event, I figured BTN_TOUCH would do the
>   job.
> * focus handling for the stylus is easy. focus handling for the buttons
>   on the pad isn't. they could technically be focused elsewhere, like a
>   keyboard focus. some buttons are definitely stylus-based (BTN_STYLUS,
>   BTN_STYLUS2, etc.) so should go where the stylus is. Should look at
>   what Win/OSX do here.
> * bind/binding/unbind - this is like windowed mode in GIMP. do we still
>   need this? who's actually using this instead of a full-screen app?
> * tablet_manager is a bit meh, but the only alternative would be to have
>   a wl_seat::get_tablets request and a wl_seat::tablet_added event.
>   Possible, but doesn't look that much nicer, though it does away with
>   the indirection. (read the diff below to understand what I mean here)
> * if we stick with the tablet_manager, do we need a
>   wl_tablet_manager::get_tablets request in case the client releases a
>   tablet it needs again later? or do we expect to re-bind to
>   wl_tablet_manager?
> * fuzz/flat should be dropped, I just haven't yet.
> * I'd really like to enshrine in the protocol that ABS_PRESSURE means
>   just that and damn anyone else who wants to use 0x18 as an axis code.
>   weston does this for wl_pointer::button, but it's not actually
>   documented. what's the deal here?
> * does this even make sense as wl_tablet, or should I try it first as an
>   experimental weston-tablet interface that then (maybe) moves later to
>   wayland proper?
>
> That's it so far, again, any feedback appreciated. diff below.
>
> Cheers,
>    Peter
>
> diff --git a/protocol/wayland.xml b/protocol/wayland.xml
> index aeb0412..8d10746 100644
> --- a/protocol/wayland.xml
> +++ b/protocol/wayland.xml
> @@ -1235,7 +1235,7 @@
>      </request>
>    </interface>
>
> -  <interface name="wl_seat" version="3">
> +  <interface name="wl_seat" version="4">
>      <description summary="group of input devices">
>        A seat is a group of keyboards, pointer and touch devices. This
>        object is published as a global during start up, or when such a
> @@ -1251,6 +1251,7 @@
>        <entry name="pointer" value="1" summary="The seat has pointer devices"/>
>        <entry name="keyboard" value="2" summary="The seat has one or more keyboards"/>
>        <entry name="touch" value="4" summary="The seat has touch devices"/>
> +      <entry name="tablets" value="8" summary="The seat has one or more graphics tablet devices" since="4"/>

What's actually the reason to allow multiple graphics tablets per seat? I
thought wl_seat objects represent a single user interacting with your
desktop. So if you have two tablets, why not force them to be in two
different wl_seat objects? Is there ever a reason to have multiple tablets
in a single seat? What would the use-case be?

We could even go further and put tablets into their own seats. Always.
A pointer and keyboard may be used by a single user at the same time. But
for a graphics tablet that doesn't sound like a
Re: [RFC DRAFT] graphics tablet protocol extension
On Wed, Oct 02, 2013 at 05:44:29PM +0200, David Herrmann wrote:
> Hi Peter
>
> On Fri, Sep 20, 2013 at 12:35 PM, Peter Hutterer peter.hutte...@who-t.net wrote:
>> I've been working on a protocol extension to support graphics tablets
>> such as the Wacom set of tablets, and I'm now at the stage where I'd
>> like a few comments. I was hoping that I'd get a full implementation
>> before XDC but unfortunately that didn't happen, so for now I'll just
>> show the protocol. I've got a PoC implementation, but it's missing a few
>> too many pieces to be actually usable just yet. Any feedback
>> appreciated. This is a relatively early stage, so there are still many
>> changes expected.
>>
>> The goal is to make it possible to access graphics tablets. One
>> important thing to note is that this interface does _not_ cover the
>> touch ability of some tablets. This should go through wl_touch (for
>> touchscreen-like tablets) or wl_pointer (for external touchpad-like
>> tablets).
>>
>> There are a few notable differences to the wl_pointer interface:
>> * tablets have a tool type that matters: it lets applications such as
>>   the GIMP select paint tools based on the physical tool. those tools
>>   also often have HW serial numbers to uniquely identify them.
>> * extra axes matter: pressure, tilt, distance - all influence the stylus
>>   behaviour
>> * more than one device may be present, so it's important to have access
>>   to all devices on a one-by-one basis, unlike wl_pointer, where we just
>>   have one virtual pointer.
>> * proximity matters since we can leave proximity from directly above a
>>   surface. the pointer can't do that, it moves from one surface to the
>>   next. so in some ways it's closer to wl_touch in that regard.
>>
>> Some design notes:
>> * generally most axes change at the same time, hence the choice to send
>>   a wl_array instead of separate events.
>> * x/y would have to be adjusted relative to the surface, but technically
>>   the same would have to be done to e.g. distance on a true 3D desktop.
>> * not sure at all about the relative events or whether there's a need
>>   for them. iirc only some styli have REL_*WHEEL, do we need something
>>   else? Ping, Jason?
>> * I don't have a specific touch event, I figured BTN_TOUCH would do the
>>   job.
>> * focus handling for the stylus is easy. focus handling for the buttons
>>   on the pad isn't. they could technically be focused elsewhere, like a
>>   keyboard focus. some buttons are definitely stylus-based (BTN_STYLUS,
>>   BTN_STYLUS2, etc.) so should go where the stylus is. Should look at
>>   what Win/OSX do here.
>> * bind/binding/unbind - this is like windowed mode in GIMP. do we still
>>   need this? who's actually using this instead of a full-screen app?
>> * tablet_manager is a bit meh, but the only alternative would be to have
>>   a wl_seat::get_tablets request and a wl_seat::tablet_added event.
>>   Possible, but doesn't look that much nicer, though it does away with
>>   the indirection. (read the diff below to understand what I mean here)
>> * if we stick with the tablet_manager, do we need a
>>   wl_tablet_manager::get_tablets request in case the client releases a
>>   tablet it needs again later? or do we expect to re-bind to
>>   wl_tablet_manager?
>> * fuzz/flat should be dropped, I just haven't yet.
>> * I'd really like to enshrine in the protocol that ABS_PRESSURE means
>>   just that and damn anyone else who wants to use 0x18 as an axis code.
>>   weston does this for wl_pointer::button, but it's not actually
>>   documented. what's the deal here?
>> * does this even make sense as wl_tablet, or should I try it first as an
>>   experimental weston-tablet interface that then (maybe) moves later to
>>   wayland proper?
>>
>> That's it so far, again, any feedback appreciated. diff below.
>>
>> Cheers,
>>    Peter
>>
>> diff --git a/protocol/wayland.xml b/protocol/wayland.xml
>> index aeb0412..8d10746 100644
>> --- a/protocol/wayland.xml
>> +++ b/protocol/wayland.xml
>> @@ -1235,7 +1235,7 @@
>>      </request>
>>    </interface>
>>
>> -  <interface name="wl_seat" version="3">
>> +  <interface name="wl_seat" version="4">
>>      <description summary="group of input devices">
>>        A seat is a group of keyboards, pointer and touch devices. This
>>        object is published as a global during start up, or when such a
>> @@ -1251,6 +1251,7 @@
>>        <entry name="pointer" value="1" summary="The seat has pointer devices"/>
>>        <entry name="keyboard" value="2" summary="The seat has one or more keyboards"/>
>>        <entry name="touch" value="4" summary="The seat has touch devices"/>
>> +      <entry name="tablets" value="8" summary="The seat has one or more graphics tablet devices" since="4"/>
>
> What's actually the reason to allow multiple graphics tablets per seat? I
> thought wl_seat objects represent a single user interacting with your
> desktop. So if you have two tablets, why not force them to be in two
> different wl_seat objects? Is there ever a reason to have multiple
> tablets in a single seat? What would the use-case be?

my laptop has a built-in wacom
Re: [RFC DRAFT] graphics tablet protocol extension
On 09/20/2013 03:35 AM, Peter Hutterer wrote:
> * focus handling for the stylus is easy. focus handling for the buttons
>   on the pad isn't. they could technically be focused elsewhere, like a
>   keyboard focus. some buttons are definitely stylus-based (BTN_STYLUS,
>   BTN_STYLUS2, etc.) so should go where the stylus is. Should look at
>   what Win/OSX do here.

I tried both Windows 7 and OS/X with a Wacom Intuos3 6x8.

The buttons act precisely like whatever they are emulating. They can be
set to be a set of modifier keys, or a single mouse button click, or a
sequence of keystrokes with modifiers, and the result is exactly as though
I had quickly hit the same things on the main keyboard and mouse. If a
button sends keystrokes, they go to the application with the keyboard
focus even if the mouse is pointing somewhere else. If it sends a shift
modifier, it causes keys typed on the keyboard to be uppercase. Setting
buttons to clicks causes the clicks to go to the application under the
mouse cursor, activating those applications. This is more apparent on
Windows, where the first click in an un-activated app is still handled; on
OS/X it just activates the app and throws away the click (I believe Cocoa
apps can see this click, but all ignore it to match OS/X UI guidelines).
On Windows, by using Alt+Tab, I was able to make a button type into two
different applications!

Note that both platforms had a method to make all the button assignments
depend on the current application.

There are also two vertical strips (really just small pads) next to the
buttons. They may be limited to detecting the vertical position of the
pen, but I would not be surprised if they are in fact part of the main
tablet area and can pick up pressure, tilt, and some horizontal position.
These could also be programmed to send keystrokes, and they acted like the
buttons.

Now for my opinion: this seems to be an implementation detail, and I think
it is ok if the effect of buttons goes only to the app with the pointer
focus. The default setups make the buttons act like all the different
modifiers so you can do shift+click on the pad. On OS/X one of the buttons
is set so that, when held down, it causes the pen to send x/y mouse
scrolling events. The small vertical strips are set in both cases to
"smart scroll", and it's unclear what that does; imho setting them to the
scroll wheel works better (smart scroll may be an attempt to work around
Windows' very broken policy of sending the scroll wheel to the keyboard
focus). I think this covers what is needed for tablet-unaware
applications. I suspect that the modifiers only have to work at the moment
a click is sent from the pen; working the way Windows and OS/X do might be
the easiest implementation, but users are not relying on it.

The reason for the keystrokes is obviously to trigger shortcuts in the
application the pen is being used on, such as to switch tools in
Photoshop. I suspect it would be ok if they went to the client with
pointer focus (which could ignore them if it does not have keyboard focus,
if that helps). They could also be delivered as unique buttons; it would
be much better if the client provided the API to change what the buttons
do.

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel
[RFC DRAFT] graphics tablet protocol extension
I've been working on a protocol extension to support graphics tablets such
as the Wacom set of tablets, and I'm now at the stage where I'd like a few
comments. I was hoping that I'd get a full implementation before XDC but
unfortunately that didn't happen, so for now I'll just show the protocol.
I've got a PoC implementation, but it's missing a few too many pieces to
be actually usable just yet. Any feedback appreciated. This is a
relatively early stage, so there are still many changes expected.

The goal is to make it possible to access graphics tablets. One important
thing to note is that this interface does _not_ cover the touch ability of
some tablets. This should go through wl_touch (for touchscreen-like
tablets) or wl_pointer (for external touchpad-like tablets).

There are a few notable differences to the wl_pointer interface:
* tablets have a tool type that matters: it lets applications such as the
  GIMP select paint tools based on the physical tool. those tools also
  often have HW serial numbers to uniquely identify them.
* extra axes matter: pressure, tilt, distance - all influence the stylus
  behaviour
* more than one device may be present, so it's important to have access to
  all devices on a one-by-one basis, unlike wl_pointer, where we just have
  one virtual pointer.
* proximity matters since we can leave proximity from directly above a
  surface. the pointer can't do that, it moves from one surface to the
  next. so in some ways it's closer to wl_touch in that regard.

Some design notes:
* generally most axes change at the same time, hence the choice to send a
  wl_array instead of separate events.
* x/y would have to be adjusted relative to the surface, but technically
  the same would have to be done to e.g. distance on a true 3D desktop.
* not sure at all about the relative events or whether there's a need for
  them. iirc only some styli have REL_*WHEEL, do we need something else?
  Ping, Jason?
* I don't have a specific touch event, I figured BTN_TOUCH would do the
  job.
* focus handling for the stylus is easy. focus handling for the buttons on
  the pad isn't. they could technically be focused elsewhere, like a
  keyboard focus. some buttons are definitely stylus-based (BTN_STYLUS,
  BTN_STYLUS2, etc.) so should go where the stylus is. Should look at what
  Win/OSX do here.
* bind/binding/unbind - this is like windowed mode in GIMP. do we still
  need this? who's actually using this instead of a full-screen app?
* tablet_manager is a bit meh, but the only alternative would be to have a
  wl_seat::get_tablets request and a wl_seat::tablet_added event.
  Possible, but doesn't look that much nicer, though it does away with the
  indirection. (read the diff below to understand what I mean here)
* if we stick with the tablet_manager, do we need a
  wl_tablet_manager::get_tablets request in case the client releases a
  tablet it needs again later? or do we expect to re-bind to
  wl_tablet_manager?
* fuzz/flat should be dropped, I just haven't yet.
* I'd really like to enshrine in the protocol that ABS_PRESSURE means just
  that and damn anyone else who wants to use 0x18 as an axis code. weston
  does this for wl_pointer::button, but it's not actually documented.
  what's the deal here?
* does this even make sense as wl_tablet, or should I try it first as an
  experimental weston-tablet interface that then (maybe) moves later to
  wayland proper?

That's it so far, again, any feedback appreciated. diff below.

Cheers,
   Peter

diff --git a/protocol/wayland.xml b/protocol/wayland.xml
index aeb0412..8d10746 100644
--- a/protocol/wayland.xml
+++ b/protocol/wayland.xml
@@ -1235,7 +1235,7 @@
     </request>
   </interface>

-  <interface name="wl_seat" version="3">
+  <interface name="wl_seat" version="4">
     <description summary="group of input devices">
       A seat is a group of keyboards, pointer and touch devices. This
       object is published as a global during start up, or when such a
@@ -1251,6 +1251,7 @@
       <entry name="pointer" value="1" summary="The seat has pointer devices"/>
       <entry name="keyboard" value="2" summary="The seat has one or more keyboards"/>
       <entry name="touch" value="4" summary="The seat has touch devices"/>
+      <entry name="tablets" value="8" summary="The seat has one or more graphics tablet devices" since="4"/>
     </enum>

     <event name="capabilities">
@@ -1306,6 +1307,19 @@
       <arg name="name" type="string"/>
     </event>

+    <!-- Version 4 additions -->
+    <request name="get_tablet_manager" since="4">
+      <description summary="return tablet manager object">
+        The ID provided will be initialized to the wl_tablet_manager
+        interface for this seat. This can then be used to retrieve the
+        objects representing the actual tablet devices.
+
+        This request only takes effect if the seat has the tablets
+        capability.
+      </description>
+      <arg name="id" type="new_id" interface="wl_tablet_manager"/>
+    </request>
+
   </interface>

   <interface name="wl_pointer" version="3">
@@ -1617,6