I think the question is why the client, the input method, and probably the compositor all have to do the decoding from keycodes to keysyms.

Contrast this with all the work being done in libinput to translate touchpad actions into a more useful shared form before anybody else looks at them. Why is that not done by each client, receiving raw events plus a touchpad-description file? Presumably because nobody thinks that would be a good idea.
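To illustrate what I mean, here is a minimal sketch of the consumer side of libinput (assuming the udev backend; error handling omitted). The point is that the consumer only ever sees accelerated, normalized deltas, never raw touchpad coordinates:

#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <libinput.h>
#include <libudev.h>

static int open_restricted(const char *path, int flags, void *user_data)
{
    int fd = open(path, flags);
    return fd < 0 ? -errno : fd;
}

static void close_restricted(int fd, void *user_data)
{
    close(fd);
}

static const struct libinput_interface iface = {
    .open_restricted = open_restricted,
    .close_restricted = close_restricted,
};

int main(void)
{
    struct udev *udev = udev_new();
    struct libinput *li = libinput_udev_create_context(&iface, NULL, udev);
    struct libinput_event *ev;

    libinput_udev_assign_seat(li, "seat0");

    for (;;) { /* a real consumer would poll() on libinput_get_fd(li) */
        libinput_dispatch(li);
        while ((ev = libinput_get_event(li)) != NULL) {
            if (libinput_event_get_type(ev) ==
                LIBINPUT_EVENT_POINTER_MOTION) {
                struct libinput_event_pointer *p =
                    libinput_event_get_pointer_event(ev);
                /* dx/dy are already accelerated and normalized by
                 * libinput; nobody downstream re-derives them from
                 * raw touchpad data. */
                printf("motion %.2f %.2f\n",
                       libinput_event_pointer_get_dx(p),
                       libinput_event_pointer_get_dy(p));
            }
            libinput_event_destroy(ev);
        }
    }
}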

I am pretty certain the shared part (basically what libxkbcommon does) could be moved into libinput, so that everybody gets keysyms. It would still produce an event even when translation fails, and the raw keycode would be carried in the event, so no information is ever lost and clients could OPTIONALLY redo the decoding themselves.
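Purely hypothetical, but such an event could carry something like this (every name below is invented for illustration; no such struct exists in libinput):

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical event layout -- invented for illustration only. */
struct fake_libinput_key_event {
    uint32_t keycode;  /* raw evdev keycode, always present, so no
                          information is ever lost */
    uint32_t keysym;   /* shared translation; XKB_KEY_NoSymbol when
                          the keymap cannot translate the keycode */
    uint32_t mods;     /* effective modifier state at event time */
    bool     pressed;  /* press or release */
};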

I think the current scheme is going to have horrendously annoying bugs when the client, compositor, and input method get out of sync and disagree about how the keyboard works and what shift state it is in. It also makes it pretty much impossible to run a remote Wayland connection to another system using a keyboard layout that is not installed on the local machine.

On 07/25/2014 06:06 AM, Pekka Paalanen wrote:
On Wed, 23 Jul 2014 09:46:16 +0700
Trung Ngo <[email protected]> wrote:

Hi guys,

In the text protocol, there is a `keysym` event (and a corresponding
`keysym` request in the input-method protocol). In the spec, it is used
to 'notify when a key event was sent.' If I understand correctly, the
whole point of this request/event pair is to fake a key press from
the input method. If so, wouldn't it make more sense to intercept the
keysym request at the compositor and send a key press event to the text
application instead of passing the keysym event to the text application
(no more keysym event in the text protocol)?

In the current design, the text application has to listen to the keysym
event (for fake keys) and implement the key handler (for 'normal' keys)
at the same time, potentially duplicating code and opening up the
possibility that some applications forget to implement the keysym event
handler.
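For concreteness, the duplication looks roughly like this. A sketch only: the interface name follows the later unstable text-input v1 naming, insert_keysym() is a made-up application helper, and the keysym event's state is assumed to mirror wl_keyboard key state:

#include <stdint.h>
#include <wayland-client.h>
#include <xkbcommon/xkbcommon.h>

struct zwp_text_input_v1; /* normally from the generated protocol header */

extern struct xkb_state *xkb_state;          /* built from wl_keyboard.keymap */
extern void insert_keysym(xkb_keysym_t sym); /* made-up application helper */

/* "Normal" path: wl_keyboard delivers a raw keycode that the client
 * must translate itself (evdev keycodes are offset by 8 in xkb). */
static void key_handler(void *data, struct wl_keyboard *kbd,
                        uint32_t serial, uint32_t time,
                        uint32_t key, uint32_t state)
{
    if (state == WL_KEYBOARD_KEY_STATE_PRESSED)
        insert_keysym(xkb_state_key_get_one_sym(xkb_state, key + 8));
}

/* "Fake" path: the input method sends a ready-made keysym, so no
 * keymap translation happens -- a second handler doing almost the
 * same job, which is the duplication being asked about. */
static void keysym_handler(void *data, struct zwp_text_input_v1 *ti,
                           uint32_t serial, uint32_t time, uint32_t sym,
                           uint32_t state, uint32_t modifiers)
{
    if (state == WL_KEYBOARD_KEY_STATE_PRESSED) /* assumed encoding */
        insert_keysym(sym);
}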

I'm no expert on input, but that is all deliberate.

"Normal keys" are direct user actions, a person pressing a key on a
keyboard. These originate from a real physical action, specifically a
key press/release. These use key codes, which in the clients are
interpreted through libxkbcommon to take into account the keymap and
whatnot. If a keymap does not provide some keysym, you cannot input it,
AFAIU.
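Concretely, every client repeats something like this (a rough sketch; error handling omitted), building its xkb state from the keymap fd the compositor sends:

#include <stddef.h>
#include <stdint.h>
#include <sys/mman.h>
#include <xkbcommon/xkbcommon.h>

/* Per-client decoding step. The fd/size pair comes from the
 * wl_keyboard.keymap event. */
static struct xkb_state *
state_from_keymap_fd(int fd, size_t size)
{
    struct xkb_context *ctx = xkb_context_new(XKB_CONTEXT_NO_FLAGS);
    char *map_str = mmap(NULL, size, PROT_READ, MAP_PRIVATE, fd, 0);
    struct xkb_keymap *keymap = xkb_keymap_new_from_string(ctx, map_str,
            XKB_KEYMAP_FORMAT_TEXT_V1, XKB_KEYMAP_COMPILE_NO_FLAGS);
    munmap(map_str, size);
    return xkb_state_new(keymap);
}

/* In the wl_keyboard.key handler; evdev keycodes are offset by 8 from
 * xkb keycodes. Modifier state must also be kept in sync by feeding
 * wl_keyboard.modifiers events into xkb_state_update_mask(). */
static xkb_keysym_t
keycode_to_keysym(struct xkb_state *state, uint32_t evdev_key)
{
    return xkb_state_key_get_one_sym(state, evdev_key + 8);
}

If the keymap does not provide a keysym for the pressed key, this lookup returns XKB_KEY_NoSymbol, which is why a symbol missing from the keymap cannot be input through this path.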

Input methods, however, use whatever (even complicated) UI lets the
user choose which symbol to enter. For instance, it can be a virtual
keyboard with all kinds of exotic symbols, poked with a finger. There
are no physical keys to press. Or it could be just the "compose key"
mechanism. Or a machine vision system interpreting sign language.

These are fundamentally very different kinds of input. We have a strict
separation in the protocol design between actual physical user actions
(e.g. poking a button) and "fake" events, and we intend to keep these
two separated all the way. For example, wl_pointer.motion is sent only
when the user moves the pointer, not when e.g. the window moves under
the pointer or the pointer gets warped programmatically.

The burden of implementation is in toolkits, not applications.


Thanks,
pq