Re: [PATCH 25/36] randr: fixup constrain to work with slave screens.

2012-07-03 Thread Chris Bagwell
On Mon, Jul 2, 2012 at 5:13 AM, Dave Airlie airl...@gmail.com wrote:
 From: Dave Airlie airl...@redhat.com

 Current code constrains the cursor to the crtcs on the master
 device, for slave outputs to work we have to include their crtcs
 in the constrain calculations.

 Signed-off-by: Dave Airlie airl...@redhat.com
 ---
  randr/rrcrtc.c |   57 +---
  1 file changed, 46 insertions(+), 11 deletions(-)

 diff --git a/randr/rrcrtc.c b/randr/rrcrtc.c
 index 29b02a9..e5fe059 100644
 --- a/randr/rrcrtc.c
 +++ b/randr/rrcrtc.c
  @@ -1544,18 +1544,10 @@ ProcRRGetCrtcTransform(ClientPtr client)
       return Success;
   }
 
  -void
  -RRConstrainCursorHarder(DeviceIntPtr pDev, ScreenPtr pScreen, int mode, int *x,
  -                        int *y)
  +static Bool check_all_screen_crtcs(ScreenPtr pScreen, int *x, int *y)
   {
       rrScrPriv(pScreen);
       int i;
  -
  -    /* intentional dead space - let it float */
  -    if (pScrPriv->discontiguous)
  -        return;
  -
  -    /* if we're moving inside a crtc, we're fine */
       for (i = 0; i < pScrPriv->numCrtcs; i++) {
           RRCrtcPtr crtc = pScrPriv->crtcs[i];
 
  @@ -1567,8 +1559,15 @@ RRConstrainCursorHarder(DeviceIntPtr pDev, ScreenPtr pScreen, int mode, int *x,
           crtc_bounds(crtc, &left, &right, &top, &bottom);
 
           if ((*x >= left) && (*x < right) && (*y >= top) && (*y < bottom))
  -            return;
  +            return TRUE;
       }
  +    return FALSE;
  +}
  +

I'm hoping I can take advantage of this section being reviewed, as it's
related to bug #39949.

This new check_all_screen_crtcs() is still calling the unmodified
crtc_bounds().  That function computes its bounds from
crtc->mode->mode.width/height.

If you are scaling the display to something larger than its native size,
or using panning, then that bounds check causes the cursor to get stuck
inside the smaller, original screen area.

There is a proposed patch in the bug report, but perhaps this patch
series allows a better interface?  Probably not, though, since I guess
it's concerned more with interfaces to the original screen sizes being
hot plugged.
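
For reference, crtc_bounds() is roughly the following (a paraphrase from
memory rather than the exact server code; the real function also swaps
width/height for 90/270 degree rotations).  It shows why a scaled or
panning setup extends past the constrained area:

    #include "randrstr.h"

    /* Paraphrase of randr/rrcrtc.c:crtc_bounds(), RR_Rotate_0 case only. */
    static void
    crtc_bounds(RRCrtcPtr crtc, int *left, int *right, int *top, int *bottom)
    {
        *left = crtc->x;
        *top = crtc->y;
        /* Bounds come from the mode size only.  With a scaling transform or
         * a panning area larger than the mode, the visible desktop is bigger
         * than this rectangle, so the cursor is clamped to the smaller,
         * un-scaled region. */
        *right = crtc->x + crtc->mode->mode.width;
        *bottom = crtc->y + crtc->mode->mode.height;
    }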

Chris


Re: xf86-input monitor filedescriptors

2012-02-28 Thread Chris Bagwell
On Tue, Feb 28, 2012 at 10:09 AM, David Herrmann dh.herrm...@googlemail.com
 I am writing an input driver for Nintendo Wii Remotes.
 See: http://github.com/dvdhrm/xf86-input-xwiimote



 The devices created for every WiiRemote include:

 Nintendo Wii Remote (core device)
 EV_KEY:
        KEY_LEFT,       /* WIIPROTO_KEY_LEFT */
        KEY_RIGHT,      /* WIIPROTO_KEY_RIGHT */
        KEY_UP,         /* WIIPROTO_KEY_UP */
        KEY_DOWN,       /* WIIPROTO_KEY_DOWN */
        KEY_NEXT,       /* WIIPROTO_KEY_PLUS */
        KEY_PREVIOUS,   /* WIIPROTO_KEY_MINUS */
        BTN_1,          /* WIIPROTO_KEY_ONE */
        BTN_2,          /* WIIPROTO_KEY_TWO */
        BTN_A,          /* WIIPROTO_KEY_A */
        BTN_B,          /* WIIPROTO_KEY_B */
        BTN_MODE,       /* WIIPROTO_KEY_HOME */
 Force-Feedback:
        FF_RUMBLE

Out of the box, does this interface get controlled by xf86-input-evdev?

You may still wish to write an xf86-input-wiimote, but have you
considered alternatives?  I think for this device you're mostly
interested in remapping the keys.

IR remotes have the same basic issue and push a lot of the problem into
the kernel via the EVIOCSKEYCODE ioctl().  Here is a sample user-land
application:

http://linuxtv.org/downloads/v4l-dvb-apis/Remote_controllers_table_change.html

You'll probably have to invent a scancode concept in your kernel
driver for this to work.  This old patch from Google may be of
interest for HID devices and EVIOCSKEYCODE.

http://www.mail-archive.com/linux-usb-devel@lists.sourceforge.net/msg48259.html
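
For illustration, remapping a single scancode from user land is roughly
the following (the scancode value is made up; this uses the older
two-int array form of EVIOCSKEYCODE):

    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/input.h>

    /* Remap one scancode to KEY_NEXT on the given event device. */
    static int remap_scancode(const char *devnode)
    {
        int map[2] = { 0x90001, KEY_NEXT };  /* { scancode, new keycode } */
        int fd = open(devnode, O_RDONLY);
        int rc;

        if (fd < 0)
            return -1;
        /* Only works if the kernel driver implements setkeycode(). */
        rc = ioctl(fd, EVIOCSKEYCODE, map);
        close(fd);
        return rc;
    }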

 Nintendo Wii Remote IR (IR device)
 If no user-space process has this device opened, the kernel disables
 the IR-cam on the device to save energy.
 EV_ABS:
        ABS_HAT0X
        ABS_HAT0Y
        ABS_HAT1X
        ABS_HAT1Y
        ABS_HAT2X
        ABS_HAT2Y
        ABS_HAT3X
        ABS_HAT3Y
 This tracks up to 4 IR-sources with the IR-cam reported as 2D absolute data.


I think your current X input driver is taking HAT0X/Y and posting them,
with little modification, on the more typical X/Y axes?

I've seen a common problem where hid-core combines 2 HID interfaces
into 1, so the device's first X/Y pair is reported as X/Y and the second
pair as Z/RX.  I've also seen people work around this in user land using
the uinput feature.  They would write a program that opens the
touchscreen's /dev/input/event?, listens for ABS_Z/RX events, and then
posts them to /dev/uinput as ABS_X/Y values.

Eventually, these converted events come back to user land as a new
/dev/input/event? interface, and xf86-input-evdev can then be used
unmodified.  You could do something similar by converting HAT0X/Y to
the X/Y axes.

I suspect this option also makes it easier to toggle between mouse mode
and letting the device be owned by some specific application.

Google uinput.h for examples.  Sorry, I couldn't quickly find the
touchscreen uinput code I've seen in the past.
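
As a very rough sketch of that uinput approach (error handling trimmed,
and the axis ranges are made-up numbers):

    #include <fcntl.h>
    #include <string.h>
    #include <unistd.h>
    #include <linux/input.h>
    #include <linux/uinput.h>

    /* Create a virtual device that will re-post ABS_HAT0X/Y as ABS_X/Y. */
    static int create_forwarder(void)
    {
        struct uinput_user_dev uidev;
        int ufd = open("/dev/uinput", O_WRONLY);

        ioctl(ufd, UI_SET_EVBIT, EV_ABS);
        ioctl(ufd, UI_SET_ABSBIT, ABS_X);
        ioctl(ufd, UI_SET_ABSBIT, ABS_Y);
        ioctl(ufd, UI_SET_EVBIT, EV_KEY);
        ioctl(ufd, UI_SET_KEYBIT, BTN_LEFT);

        memset(&uidev, 0, sizeof(uidev));
        strcpy(uidev.name, "wiimote-pointer");
        uidev.absmin[ABS_X] = 0;  uidev.absmax[ABS_X] = 1023; /* made up */
        uidev.absmin[ABS_Y] = 0;  uidev.absmax[ABS_Y] = 767;  /* made up */
        write(ufd, &uidev, sizeof(uidev));
        ioctl(ufd, UI_DEV_CREATE);
        return ufd;
    }

    /* In the read loop: translate HAT0X/Y events from the real device and
     * write everything else (including SYN reports) through unchanged. */
    static void forward(int ufd, const struct input_event *ev)
    {
        struct input_event out = *ev;

        if (ev->type == EV_ABS && ev->code == ABS_HAT0X)
            out.code = ABS_X;
        else if (ev->type == EV_ABS && ev->code == ABS_HAT0Y)
            out.code = ABS_Y;
        write(ufd, &out, sizeof(out));
    }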

Chris


Re: xf86-input monitor filedescriptors

2012-02-28 Thread Chris Bagwell
On Tue, Feb 28, 2012 at 12:58 PM, David Herrmann
dh.herrm...@googlemail.com wrote:
 On Tue, Feb 28, 2012 at 6:55 PM, Chris Bagwell ch...@cnpbagwell.com wrote:
 On Tue, Feb 28, 2012 at 10:09 AM, David Herrmann dh.herrm...@googlemail.com
 I am writing an input driver for Nintendo Wii Remotes.
 See: http://github.com/dvdhrm/xf86-input-xwiimote



 The devices created for every WiiRemote include:

 Nintendo Wii Remote (core device)
 EV_KEY:
        KEY_LEFT,       /* WIIPROTO_KEY_LEFT */
        KEY_RIGHT,      /* WIIPROTO_KEY_RIGHT */
        KEY_UP,         /* WIIPROTO_KEY_UP */
        KEY_DOWN,       /* WIIPROTO_KEY_DOWN */
        KEY_NEXT,       /* WIIPROTO_KEY_PLUS */
        KEY_PREVIOUS,   /* WIIPROTO_KEY_MINUS */
        BTN_1,          /* WIIPROTO_KEY_ONE */
        BTN_2,          /* WIIPROTO_KEY_TWO */
        BTN_A,          /* WIIPROTO_KEY_A */
        BTN_B,          /* WIIPROTO_KEY_B */
        BTN_MODE,       /* WIIPROTO_KEY_HOME */
 Force-Feedback:
        FF_RUMBLE

 Out of the box, does this interface get controlled by xf86-input-evdev?

 You may still wish to write a xf86-input-wiimote but have you
 considered alternatives?  I think for this device, your mostly
 interested in remapping the key's.

 IR Remote's have same basic issue and push a lot of the problem into
 kernel and the EVIOCSKEYCODE ioctl().  Here is a sample user land
 application:

 http://linuxtv.org/downloads/v4l-dvb-apis/Remote_controllers_table_change.html

 You'll probably have to invent scancode concept for your  kernel
 driver for this to work.  This old patch from google may be of
 interest for HID devices and EVIOCSKEYCODE.

 http://www.mail-archive.com/linux-usb-devel@lists.sourceforge.net/msg48259.html

 I am not getting your point here. Do you propose to use EVIOCSKEYCODE
 if I want to change the scancode/keycode mapping?

Yes.

 I never understood why this API is used. Consider one application
 reads the input device and I change the mapping just to remap the keys
 for X/evdev. The other application now breaks as the kernel sends
 other events (ABI breakage). Or am I getting something wrong here?
 This may work if X is the only user of the device but then I wonder
 why using the kernel API and not implementing it in X anyway.

I believe the intent is that only one application owns an input device
at a time, so the above wouldn't be a concern.

I'll only add here that when a kernel driver supports EVIOCSKEYCODE,
using /lib/udev/keymaps and /lib/udev/rules.d/95-keymaps to change the
codes to more reasonable values is very easy for X and console users.
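
For example, a keymap is just scancode/key-name pairs hooked up by a
udev rule; something along these lines (the scancodes and the rule match
here are hypothetical, not taken from the real device):

    # /lib/udev/keymaps/wiimote
    0x01 up
    0x02 down
    0x03 next
    0x04 previous

    # 95-keymap style rule to apply it
    ENV{ID_VENDOR}=="Nintendo*", RUN+="keymap $name wiimote"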


 Nintendo Wii Remote IR (IR device)
 If no user-space process has this device opened, the kernel disables
 the IR-cam on the device to save energy.
 EV_ABS:
        ABS_HAT0X
        ABS_HAT0Y
        ABS_HAT1X
        ABS_HAT1Y
        ABS_HAT2X
        ABS_HAT2Y
        ABS_HAT3X
        ABS_HAT3Y
 This tracks up to 4 IR-sources with the IR-cam reported as 2D absolute data.


 I think your current X input driver is taking HAT0X/Y and posting them
 with little modification on more typical X/Y axis?

 It uses all four HAT?X/Y values to compute one absolute value. It's a
 little more complex so rotations are also detected and so on but I
 think that's not important here.

 I've seen a common problem where hid-core combines 2 HID interfaces
 into 1 and devices first X/Y's are reported as X/Y and second X/Y's as
 Z/RX.  I've also seen people work around this in user land using the
 uinput feature.  They would create a program that opens
 /dev/input/event? of touchscreen and listen for ABS_Z/RX events and
 then post them to a /dev/uinput as ABS_X/Y values.

 As I said earlier, I've already written a daemon which listens for new
 Wii Remotes and opens an uinput device for each of them to emulate the
 desired behavior. It works currently only with buttons but that can be
 extended easily. However, I was concerned about the additional
 round-trips that this introduces as every event is sent back to the
 kernel which then sends it through the uinput interface.

Oh, sorry.  I missed that part.  So you already knew about what I was
suggesting.


 Eventually, these converted events came back to user land as a new
 /dev/input/event? interface and xf86-input-evdev can then be used
 unmodified.  You could do something similar by converting HAT0X/Y to
 X/Y axis.

 I suspect this option is easier to toggle between mouse mode and
 letting the device be owned by some specific application as well.

 Google uinput.h for examples.  Sorry, I couldn't quickly find the
 touchscreen uinput code I've seen in past.

 Chris

 Thanks for the reply. I am getting the feeling that you both recommend
 solving this problem in a separate process and feeding that input into
 the evdev driver. I never worked on the xserver before so I thought
 the recommended way was writing an xf86-input driver.

I think we were just trying to get a feel for what you needed to make
sure it wasn't already solved.  It's certainly valid to write new
xf86-input drivers for unique input

Re: [PATCH synaptics 2/2] The correct maximum values for pressure and finger width

2011-08-25 Thread Chris Bagwell
On Thu, Aug 25, 2011 at 5:22 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 On Thu, Aug 25, 2011 at 09:48:52PM +0600, Alexandr Shadchin wrote:
 Signed-off-by: Alexandr Shadchin alexandr.shadc...@gmail.com
 ---
  src/synaptics.c |    4 ++--
  1 files changed, 2 insertions(+), 2 deletions(-)

 diff --git a/src/synaptics.c b/src/synaptics.c
 index 3c08b18..6918330 100644
 --- a/src/synaptics.c
 +++ b/src/synaptics.c
  @@ -246,7 +246,7 @@ SanitizeDimensions(InputInfoPtr pInfo)
      if (priv->minp >= priv->maxp)
      {
       priv->minp = 0;
  -     priv->maxp = 256;
  +     priv->maxp = 255;
 
       xf86IDrvMsg(pInfo, X_PROBED,
                   "invalid pressure range.  defaulting to %d - %d\n",
  @@ -256,7 +256,7 @@ SanitizeDimensions(InputInfoPtr pInfo)
      if (priv->minw >= priv->maxw)
      {
       priv->minw = 0;
  -     priv->maxw = 16;
  +     priv->maxw = 15;
 
       xf86IDrvMsg(pInfo, X_PROBED,
                   "invalid finger width range.  defaulting to %d - %d\n",
 --
 1.7.6

 refresh my memory: why maxw = 15? maxp I understand but I don't know where
 this limitation to 15/16 comes from.

A real synaptics pad will only return a max W of 15; 16 is the total
number of values in that range.

Since last year the kernel has been reporting minw=0, maxw=15, minp=0
and maxp=255, so xf86-input-synaptics is already using these values with
new kernels anyway.  It should be pretty safe, since I've not heard
negative reports.

I had made a similar patch last year but never submitted it because it
caused new default values for palm and 2FG emulation and I could only
test on 1 laptop.

Also, I think there is a bug in related code, because the range of W is
16 values but the following computes 15.  With the above patch the
result becomes 14, and at such small values that is very noticeable in
the computed default values.  But as I mentioned, new kernels plus
xf86-input-synaptics are already using these values, and there have been
no complaints.

range = priv->maxw - priv->minw + 1;

Chris


Re: evdev: support for touchscreens not providing BTN_TOUCH

2011-05-20 Thread Chris Bagwell
On Fri, May 20, 2011 at 8:08 AM, Peter Korsgaard jac...@sunsite.dk wrote:
 Hi,

 I've recently tested a lumio crystaltouch touchscreen on Linux, and
 xf86-input-evdev unfortunately gets confused and handles it as a mouse.

 It provides the following events:

 Input driver version is 1.0.1
 Input device ID: bus 0x3 vendor 0x202e product 0x5 version 0x111
 Input device name: LUMIO Inc LUMIO CrystalTouch ver 1.1C
 Supported events:
  Event type 0 (Sync)
  Event type 1 (Key)
    Event code 272 (LeftBtn)
    Event code 273 (RightBtn)
    Event code 274 (MiddleBtn)
  Event type 2 (Relative)
    Event code 9 (Misc)
  Event type 3 (Absolute)
    Event code 0 (X)
      Value    650
      Min        0
      Max     4095
    Event code 1 (Y)
      Value   3221
      Min        0
      Max     4095
  Event type 4 (Misc)
    Event code 4 (ScanCode)
 Testing ... (interrupt to exit)
 Event: time 1305882024.934011, type 4 (Misc), code 4 (ScanCode), value 90001
 Event: time 1305882024.934017, type 1 (Key), code 272 (LeftBtn), value 1
 Event: time 1305882024.934029, type 3 (Absolute), code 0 (X), value 270
 Event: time 1305882024.934034, type 3 (Absolute), code 1 (Y), value 1513
 Event: time 1305882024.934039, type 2 (Relative), code 9 (Misc), value 1
 Event: time 1305882024.934043, -- Report Sync 
 Event: time 1305882024.943019, type 2 (Relative), code 9 (Misc), value 1
 Event: time 1305882024.943025, -- Report Sync 
 Event: time 1305882024.951998, type 3 (Absolute), code 0 (X), value 275
 Event: time 1305882024.952006, type 3 (Absolute), code 1 (Y), value 1519
 Event: time 1305882024.952010, type 2 (Relative), code 9 (Misc), value 1

 Which leads to evdev configuring it as a mouse as there's no BTN_TOUCH:

 [   563.001] (**) LUMIO Inc LUMIO CrystalTouch ver 1.1C: always reports core 
 events
 [   563.001] (**) LUMIO Inc LUMIO CrystalTouch ver 1.1C: Device: 
 /dev/input/event17
 [   563.020] (--) LUMIO Inc LUMIO CrystalTouch ver 1.1C: Found 3 mouse buttons
 [   563.020] (--) LUMIO Inc LUMIO CrystalTouch ver 1.1C: Found relative axes
 [   563.020] (--) LUMIO Inc LUMIO CrystalTouch ver 1.1C: Found absolute axes
 [   563.020] (--) LUMIO Inc LUMIO CrystalTouch ver 1.1C: Found x and y 
 absolute axes
 [   563.020] (II) LUMIO Inc LUMIO CrystalTouch ver 1.1C: Configuring as mouse
 [   563.020] (**) LUMIO Inc LUMIO CrystalTouch ver 1.1C: YAxisMapping: 
 buttons 4 and 5
 [   563.020] (**) LUMIO Inc LUMIO CrystalTouch ver 1.1C: EmulateWheelButton: 
 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
 [   563.020] (**) Option config_info 
 udev:/sys/devices/pci:00/:00:1d.0/usb2/2-1/2-1.1/2-1.1:1.2/input/input19/event17
 [   563.020] (II) XINPUT: Adding extended input device LUMIO Inc LUMIO 
 CrystalTouch ver 1.1C (type: MOUSE)

 So the absolute X/Y coordinates gets ignored and the REL_MISC event gets
 handled as a relative motion of +1 in X direction - Not quite what we
 want.


I don't think you can do much on the xf86-input-evdev side to solve this
issue.  The hardware designers attempted to default to something you
could limp along with until a custom driver takes over, but they made
some bad choices.

You can argue that xf86-input-evdev should default to TOUCHPAD or
TOUCHSCREEN any time it detects ABS_X and ABS_Y but no REL_X and
REL_Y.  That would get you a little further towards usability.  At least
the cursor would move around.
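
Something along these lines in the probe logic is what I mean; this is
only a sketch, not the actual evdev code (the bit-test macro is defined
here just to keep the example self-contained):

    #include <linux/input.h>

    #define LONG_BITS (sizeof(long) * 8)
    #define HAS_BIT(bit, bits) \
        (((bits)[(bit) / LONG_BITS] >> ((bit) % LONG_BITS)) & 1)

    /* Prefer a touch classification when the device reports absolute X/Y
     * but no relative X/Y, instead of falling back to "mouse". */
    static int looks_like_touch_device(const unsigned long *absbits,
                                       const unsigned long *relbits)
    {
        int has_abs_xy = HAS_BIT(ABS_X, absbits) && HAS_BIT(ABS_Y, absbits);
        int has_rel_xy = HAS_BIT(REL_X, relbits) && HAS_BIT(REL_Y, relbits);

        return has_abs_xy && !has_rel_xy;
    }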

But then the hardware sends a left button press when you touch the
screen, instead of a pressure or finger tip indication.  So you can
move the cursor around the screen, but you'll constantly be dragging
stuff around or mis-selecting something.  There is nothing
xf86-input-evdev can do short of adding an optional hack to treat the
button press as finger pressure.

So in this case it's really best to get the kernel side working right.

Also, it's half-heartedly trying to advertise itself as a touchpad, so
redirecting to xf86-input-synaptics would almost be an option, but that
driver refuses to work with any touchpad that doesn't also support
pressure or tip events; and for good reason.

Chris


Re: [PATCH synaptics 13/17] Don't autoprobe for devices when Option Device is set.

2011-04-02 Thread Chris Bagwell
On Sun, Mar 20, 2011 at 9:08 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 If only Option Device is set but no protocol, the code calls into
 AutoDevProbe. eventcomm (the only backend with an AutoDevProbe) then runs
 through all /dev/input/event devices and takes the first one it can find.

 If two touchpads are connected on a system, this may cause the same touchpad
 to be added twice and the other one not at all - even though the device path
 is specified. (This can only happen when the event device is not grabbed,
 otherwise the grabcheck prevents the touchpad from being added twice)

 Pass the device option into AutoDevProbe and check that device first. If it
 is a touchpad, finish with success. If it isn't, fail AutoDevProbe.

I think I've stared at this enough to understand most of it.  If
double adds were happening before, I think it's important not just that
it's a touchpad, but that it's a touchpad that hasn't previously been
grabbed (for the two-touchpad case).  If so, that might be nice to note
in a comment somewhere.

Reviewed-by: Chris Bagwell ch...@cnpbagwell.com


 Introduced in dce6006f6a851be4147e16731caa453dd0d1ec1c.

 Signed-off-by: Peter Hutterer peter.hutte...@who-t.net
 CC: Alexandr Shadchin alexandr.shadc...@gmail.com
 ---
  src/eventcomm.c |   17 -
  src/synaptics.c |    2 +-
  src/synproto.h  |    2 +-
  3 files changed, 18 insertions(+), 3 deletions(-)

 diff --git a/src/eventcomm.c b/src/eventcomm.c
 index 41dd669..d59efdc 100644
 --- a/src/eventcomm.c
 +++ b/src/eventcomm.c
 @@ -504,7 +504,7 @@ EventReadDevDimensions(InputInfoPtr pInfo)
  }

  static Bool
 -EventAutoDevProbe(InputInfoPtr pInfo)
 +EventAutoDevProbe(InputInfoPtr pInfo, const char *device)
  {
     /* We are trying to find the right eventX device or fall back to
        the psaux protocol and the given device from XF86Config */
 @@ -512,6 +512,21 @@ EventAutoDevProbe(InputInfoPtr pInfo)
     Bool touchpad_found = FALSE;
     struct dirent **namelist;

  +    if (device) {
  +       int fd = -1;
  +       SYSCALL(fd = open(device, O_RDONLY));
  +       if (fd >= 0)
  +       {
  +           touchpad_found = event_query_is_touchpad(fd, TRUE);
  +
  +           SYSCALL(close(fd));
  +           /* if a device is set and not a touchpad, we must return FALSE.
  +            * Otherwise, we'll add a device that wasn't requested for and
  +            * repeat f5687a6741a19ef3081e7fd83ac55f6df8bcd5c2. */
  +           return touchpad_found;
  +       }
  +    }
  +
       i = scandir(DEV_INPUT_EVENT, &namelist, EventDevOnly, alphasort);
       if (i < 0) {
                  xf86Msg(X_ERROR, "Couldn't open %s\n", DEV_INPUT_EVENT);
 diff --git a/src/synaptics.c b/src/synaptics.c
 index 1233917..102a701 100644
 --- a/src/synaptics.c
 +++ b/src/synaptics.c
  @@ -261,7 +261,7 @@ SetDeviceAndProtocol(InputInfoPtr pInfo)
      for (i = 0; protocols[i].name; i++) {
          if ((!device || !proto) &&
              protocols[i].proto_ops->AutoDevProbe &&
  -            protocols[i].proto_ops->AutoDevProbe(pInfo))
  +            protocols[i].proto_ops->AutoDevProbe(pInfo, device))
              break;
          else if (proto && !strcmp(proto, protocols[i].name))
              break;
 diff --git a/src/synproto.h b/src/synproto.h
 index 251dc84..75f90e4 100644
 --- a/src/synproto.h
 +++ b/src/synproto.h
 @@ -75,7 +75,7 @@ struct SynapticsProtocolOperations {
     Bool (*QueryHardware)(InputInfoPtr pInfo);
     Bool (*ReadHwState)(InputInfoPtr pInfo,
                        struct CommData *comm, struct SynapticsHwState *hwRet);
 -    Bool (*AutoDevProbe)(InputInfoPtr pInfo);
 +    Bool (*AutoDevProbe)(InputInfoPtr pInfo, const char *device);
     void (*ReadDevDimensions)(InputInfoPtr pInfo);
  };

 --
 1.7.4



Re: [PATCH xf86-input-synaptics resend] Revert Default to 2-finger emulation when HW supports it

2011-04-01 Thread Chris Bagwell
On Fri, Apr 1, 2011 at 2:21 PM, Chase Douglas
chase.doug...@canonical.com wrote:
 This changes the default behavior for trackpads that have only pressure
 information to emulate two finger actions. It's been reported that the
 default value is too low and/or that the pressure values may fluctuate
 with environmental factors (temperature, humidity, etc.). When the
 value is wrong, spurious right clicks and scroll events are triggered.

 Fixes: http://bugs.launchpad.net/bugs/742213

 This reverts commit ffa6dc2809734a6aaa690e9133d6761480603a68.

 Signed-off-by: Chase Douglas chase.doug...@canonical.com
 ---

Here is a bug report with some extra information on how it could be
improved greatly but probably not totally fixed.

https://bugs.freedesktop.org/show_bug.cgi?id=32538

A good chunk of users will no longer need to use 2-finger emulation
once they upgrade to 2.6.38 kernels with semi-mt support on synaptics.
In fact, this code gets disabled for semi-mt devices.  So my original
reason for trying to get it working out of the box for everyone is not
really there any more.

So I'm OK with reverting it.

Chris


Re: [PATCH synaptics 10/17] eventcomm: untangle state setting from printing device info

2011-03-26 Thread Chris Bagwell
Reviewed-by: Chris Bagwell ch...@cnpbagwell.com

On Sun, Mar 20, 2011 at 9:08 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 Signed-off-by: Peter Hutterer peter.hutte...@who-t.net
 ---
  src/eventcomm.c |   97 --
  1 files changed, 50 insertions(+), 47 deletions(-)

 diff --git a/src/eventcomm.c b/src/eventcomm.c
 index f92347a..50b6083 100644
 --- a/src/eventcomm.c
 +++ b/src/eventcomm.c
  @@ -237,22 +237,16 @@ event_query_axis_ranges(InputInfoPtr pInfo)
      struct input_absinfo abs = {0};
      unsigned long absbits[NBITS(ABS_MAX)] = {0};
      unsigned long keybits[NBITS(KEY_MAX)] = {0};
  -    char buf[256];
  +    char buf[256] = {0};
      int rc;
 
      /* The kernel's fuzziness concept seems a bit weird, but it can more or
       * less be applied as hysteresis directly, i.e. no factor here. */
  -    rc = event_get_abs(pInfo->fd, ABS_X, &priv->minx, &priv->maxx,
  -                       &priv->synpara.hyst_x, &priv->resx);
  -    if (rc == 0)
  -       xf86Msg(X_PROBED, "%s: x-axis range %d - %d\n", pInfo->name,
  -               priv->minx, priv->maxx);
  -
  -    rc = event_get_abs(pInfo->fd, ABS_Y, &priv->miny, &priv->maxy,
  -                       &priv->synpara.hyst_y, &priv->resy);
  -    if (rc == 0)
  -       xf86Msg(X_PROBED, "%s: y-axis range %d - %d\n", pInfo->name,
  -               priv->miny, priv->maxy);
  +    event_get_abs(pInfo->fd, ABS_X, &priv->minx, &priv->maxx,
  +                 &priv->synpara.hyst_x, &priv->resx);
  +
  +    event_get_abs(pInfo->fd, ABS_Y, &priv->miny, &priv->maxy,
  +                 &priv->synpara.hyst_y, &priv->resy);
 
      priv->has_pressure = FALSE;
      priv->has_width = FALSE;
  @@ -267,54 +261,63 @@ event_query_axis_ranges(InputInfoPtr pInfo)
                 strerror(errno));
 
      if (priv->has_pressure)
  -    {
  -       rc = event_get_abs(pInfo->fd, ABS_PRESSURE,
  -                          &priv->minp, &priv->maxp,
  -                          NULL, NULL);
  -       if (rc == 0)
  -           xf86Msg(X_PROBED, "%s: pressure range %d - %d\n", pInfo->name,
  -                   priv->minp, priv->maxp);
  -    } else
  -       xf86Msg(X_INFO,
  -               "%s: device does not report pressure, will use touch data.\n",
  -               pInfo->name);
  +       event_get_abs(pInfo->fd, ABS_PRESSURE, &priv->minp, &priv->maxp,
  +                     NULL, NULL);
 
      if (priv->has_width)
  -    {
  -       rc = event_get_abs(pInfo->fd, ABS_TOOL_WIDTH,
  -                          &priv->minw, &priv->maxw,
  -                          NULL, NULL);
  -       if (rc == 0)
  -           xf86Msg(X_PROBED, "%s: finger width range %d - %d\n", pInfo->name,
  -                   abs.minimum, abs.maximum);
  -    }
  +       event_get_abs(pInfo->fd, ABS_TOOL_WIDTH,
  +                     &priv->minw, &priv->maxw,
  +                     NULL, NULL);
 
      SYSCALL(rc = ioctl(pInfo->fd, EVIOCGBIT(EV_KEY, sizeof(keybits)), keybits));
      if (rc >= 0)
      {
  -       buf[0] = 0;
  -       if ((priv->has_left = (BitIsOn(keybits, BTN_LEFT) != 0)))
  -          strcat(buf, " left");
  -       if ((priv->has_right = (BitIsOn(keybits, BTN_RIGHT) != 0)))
  -          strcat(buf, " right");
  -       if ((priv->has_middle = (BitIsOn(keybits, BTN_MIDDLE) != 0)))
  -          strcat(buf, " middle");
  -       if ((priv->has_double = (BitIsOn(keybits, BTN_TOOL_DOUBLETAP) != 0)))
  -          strcat(buf, " double");
  -       if ((priv->has_triple = (BitIsOn(keybits, BTN_TOOL_TRIPLETAP) != 0)))
  -          strcat(buf, " triple");
  +       priv->has_left = (BitIsOn(keybits, BTN_LEFT) != 0);
  +       priv->has_right = (BitIsOn(keybits, BTN_RIGHT) != 0);
  +       priv->has_middle = (BitIsOn(keybits, BTN_MIDDLE) != 0);
  +       priv->has_double = (BitIsOn(keybits, BTN_TOOL_DOUBLETAP) != 0);
  +       priv->has_triple = (BitIsOn(keybits, BTN_TOOL_TRIPLETAP) != 0);
 
         if ((BitIsOn(keybits, BTN_0) != 0) ||
             (BitIsOn(keybits, BTN_1) != 0) ||
             (BitIsOn(keybits, BTN_2) != 0) ||
             (BitIsOn(keybits, BTN_3) != 0))
  -       {
             priv->has_scrollbuttons = 1;
  -           strcat(buf, " scroll-buttons");
  -       }
  -
  -       xf86Msg(X_PROBED, "%s: buttons:%s\n", pInfo->name, buf);
      }
  +
  +    /* Now print the device information */
  +    xf86Msg(X_PROBED, "%s: x-axis range %d - %d\n", pInfo->name,
  +           priv->minx, priv->maxx);
  +    xf86Msg(X_PROBED, "%s: y-axis range %d - %d\n", pInfo->name,
  +           priv->miny, priv->maxy);
  +    if (priv->has_pressure)
  +       xf86Msg(X_PROBED, "%s: pressure range %d - %d\n", pInfo->name,
  +               priv->minp, priv->maxp);
  +    else
  +       xf86Msg(X_INFO,
  +               "%s: device does not report pressure, will use touch data.\n",
  +               pInfo->name);
  +    if (priv->has_width)
  +       xf86Msg(X_PROBED, "%s: finger width range %d - %d\n", pInfo->name,
  +               abs.minimum, abs.maximum);
  +    else
  +       xf86Msg(X_INFO,
  +               "%s: device does not report finger width.\n", pInfo->name);
  +
  +    if (priv->has_left)
  +       strcat(buf, " left");
  +    if (priv->has_right

Re: [PATCH synaptics 08/17] eventcomm: streamline absinfo retrieval.

2011-03-26 Thread Chris Bagwell
Reviewed-by: Chris Bagwell ch...@cnpbagwell.com

On Sun, Mar 20, 2011 at 9:08 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 Signed-off-by: Peter Hutterer peter.hutte...@who-t.net
 ---
  src/eventcomm.c |  104 ---
  1 files changed, 61 insertions(+), 43 deletions(-)

 diff --git a/src/eventcomm.c b/src/eventcomm.c
 index 287f9de..3f06484 100644
 --- a/src/eventcomm.c
 +++ b/src/eventcomm.c
 @@ -188,6 +188,44 @@ event_query_model(int fd, enum TouchpadModel *model_out)
     return TRUE;
  }

 +/**
 + * Get absinfo information from the given file descriptor for the given
 + * ABS_FOO code and store the information in min, max, fuzz and res.
 + *
 + * @param fd File descriptor to an event device
 + * @param code Event code (e.g. ABS_X)
 + * @param[out] min Minimum axis range
 + * @param[out] max Maximum axis range
 + * @param[out] fuzz Fuzz of this axis. If NULL, fuzz is ignored.
 + * @param[out] res Axis resolution. If NULL or the current kernel does not
 + * support the resolution field, res is ignored
 + *
 + * @return Zero on success, or errno otherwise.
 + */
 +static int
 +event_get_abs(int fd, int code, int *min, int *max, int *fuzz, int *res)
 +{
 +    int rc;
 +    struct input_absinfo abs =  {0};
 +
  +    SYSCALL(rc = ioctl(fd, EVIOCGABS(code), &abs));
  +    if (rc < 0)
  +       return errno;
  +
  +    *min = abs.minimum;
  +    *max = abs.maximum;
  +    /* We don't trust a zero fuzz as it probably is just a lazy value */
  +    if (fuzz && abs.fuzz > 0)
  +       *fuzz = abs.fuzz;
  +#if LINUX_VERSION_CODE > KERNEL_VERSION(2,6,30)
  +    if (res)
  +       *res = abs.resolution;
  +#endif
  +
  +    return 0;
  +}
  +
  +
   /* Query device for axis ranges */
   static void
   event_query_axis_ranges(InputInfoPtr pInfo)
  @@ -199,41 +237,25 @@ event_query_axis_ranges(InputInfoPtr pInfo)
      char buf[256];
      int rc;
 
  -    SYSCALL(rc = ioctl(pInfo->fd, EVIOCGABS(ABS_X), &abs));
  -    if (rc >= 0)
  -    {
  +    /* The kernel's fuzziness concept seems a bit weird, but it can more or
  +     * less be applied as hysteresis directly, i.e. no factor here. */
  +    rc = event_get_abs(pInfo->fd, ABS_X, &priv->minx, &priv->maxx,
  +                       &priv->synpara.hyst_x, &priv->resx);
  +    if (rc == 0)
        xf86Msg(X_PROBED, "%s: x-axis range %d - %d\n", pInfo->name,
  -               abs.minimum, abs.maximum);
  -       priv->minx = abs.minimum;
  -       priv->maxx = abs.maximum;
  -       /* The kernel's fuzziness concept seems a bit weird, but it can more or
  -        * less be applied as hysteresis directly, i.e. no factor here. Though,
  -        * we don't trust a zero fuzz as it probably is just a lazy value. */
  -       if (abs.fuzz > 0)
  -           priv->synpara.hyst_x = abs.fuzz;
  -#if LINUX_VERSION_CODE > KERNEL_VERSION(2,6,30)
  -       priv->resx = abs.resolution;
  -#endif
  -    } else
  +               priv->minx, priv->maxx);
  +    else
        xf86Msg(X_ERROR, "%s: failed to query axis range (%s)\n", pInfo->name,
  -               strerror(errno));
  +               strerror(rc));
 
  -    SYSCALL(rc = ioctl(pInfo->fd, EVIOCGABS(ABS_Y), &abs));
  -    if (rc >= 0)
  -    {
  +    rc = event_get_abs(pInfo->fd, ABS_Y, &priv->miny, &priv->maxy,
  +                       &priv->synpara.hyst_y, &priv->resy);
  +    if (rc == 0)
        xf86Msg(X_PROBED, "%s: y-axis range %d - %d\n", pInfo->name,
  -               abs.minimum, abs.maximum);
  -       priv->miny = abs.minimum;
  -       priv->maxy = abs.maximum;
  -       /* don't trust a zero fuzz */
  -       if (abs.fuzz > 0)
  -           priv->synpara.hyst_y = abs.fuzz;
  -#if LINUX_VERSION_CODE > KERNEL_VERSION(2,6,30)
  -       priv->resy = abs.resolution;
  -#endif
  -    } else
  +               priv->miny, priv->maxy);
  +    else
        xf86Msg(X_ERROR, "%s: failed to query axis range (%s)\n", pInfo->name,
  -               strerror(errno));
  +               strerror(rc));
 
      priv->has_pressure = FALSE;
      priv->has_width = FALSE;
  @@ -249,14 +271,12 @@ event_query_axis_ranges(InputInfoPtr pInfo)
 
      if (priv->has_pressure)
      {
  -       SYSCALL(rc = ioctl(pInfo->fd, EVIOCGABS(ABS_PRESSURE), &abs));
  -       if (rc >= 0)
  -       {
  +       rc = event_get_abs(pInfo->fd, ABS_PRESSURE,
  +                          &priv->minp, &priv->maxp,
  +                          NULL, NULL);
  +       if (rc == 0)
            xf86Msg(X_PROBED, "%s: pressure range %d - %d\n", pInfo->name,
  -                   abs.minimum, abs.maximum);
  -           priv->minp = abs.minimum;
  -           priv->maxp = abs.maximum;
  -       }
  +                   priv->minp, priv->maxp);
      } else
         xf86Msg(X_INFO,
                 "%s: device does not report pressure, will use touch data.\n",
  @@ -264,14 +284,12 @@ event_query_axis_ranges(InputInfoPtr pInfo)
 
      if (priv->has_width)
      {
  -       SYSCALL(rc = ioctl(pInfo->fd, EVIOCGABS(ABS_TOOL_WIDTH), &abs));
  -       if (rc >= 0)
  -       {
  +       rc = event_get_abs(pInfo->fd, ABS_TOOL_WIDTH,
  +                          &priv->minw, &priv->maxw

Re: multitouch and synaptics clickpad questions

2011-03-06 Thread Chris Bagwell
On Sat, Mar 5, 2011 at 4:00 AM, Daniel Kurtz djku...@google.com wrote:
 Back in October Chase Douglas [1] kicked off a flurry of patches and
 discussions on this mailing list about adding Clickpad support, and/or some
 form of multitouch gesture processing (what little is possible with
 Synaptics Advanced Gesture Mode) to the xf86-input-synaptics driver (and/or
 the kernel).  The patches were commented upon, and rehashed a little... but
 now everything seems to have died down - and, as far as I can tell, nothing
 has yet been accepted into xf86-input-synaptics upstream.
 [1] Starting with
 this: http://lists.x.org/archives/xorg-devel/2010-October/013809.html
 As a result of this work, in late December Henrik Rydberg submitted a patch
 [2] to linux-input, based on the discussions above, which added semi-mt
 support to the synaptics kernel driver.  This patch, too, though, has not
 yet been accepted upstream.
 [2] https://patchwork.kernel.org/patch/426561/

 What is the status of the kernel driver patches?

Henrik's semi-mt patch has been submitted and I believe it will be in
kernel 2.6.38.  It is only indirectly related to clickpads, though.  It
allows 2 fingers' worth of data to be reported to userspace, which is
an improvement over having to emulate 2 fingers based on finger width.

Chase had some additional kernel patches to try to improve
clickpad-specific behaviour on the kernel side, but those were scrapped.
It is better done in user land, and semi-mt reports are a prerequisite
for that.

 What is the status of the effort to use them from user space?


 Are patches still being reviewed/tested/worked on for xf86-input-synaptics
 to use Synaptics AGM to improve clickpad performance?

I do not believe anyone is working on it now.  Some patches were
submitted to this list around January (I think) to ignore the bottom 20%
of the clickpad, allowing clickpad button presses without cursor
movement, and I think versions of that patch exist in multiple
distributions.  I believe that approach was basically rejected for
inclusion upstream.  It's not how Windows or Macs handle clickpads; it
is more of an easy hack.

No one is working on the ideal solution that I've heard of (see the
description below).

 Was a decision made to not bother with xf86-input-synaptics, and instead to
 focus only on mtdev  xf86-input-multitouch?

For clickpads, I believe it's just that nobody is working on it, but it
seems reasonable to add to xf86-input-synaptics.  It would help move
things along if someone donated a clickpad to the maintainer of
xf86-input-synaptics.

 Is semi-mt supported in mtdev?

No, but to be clear, semi-mt isn't about clickpads.  Semi-mt is a rough
way to report 2-finger coordinates to user land.  Even if mtdev adds
support for semi-mt, it will not automatically mean clickpads are
working.

What clickpads need is dedicated gesture logic that turns what would
normally be a 2-finger scroll gesture into a 1-finger movement
non-gesture when it sees the button clicked.  Some finger-tracking
logic is needed as well, to decide which of the 4 rectangle corners in a
semi-mt report is the one related to the movement.
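
As a rough sketch of the decision I have in mind (the names here are
made up; nothing like this exists in the driver yet):

    enum clickpad_action { ACTION_MOVE, ACTION_SCROLL };

    /* Hypothetical clickpad special case: while the physical button is
     * down, a second finger is assumed to be the clicking thumb, so keep
     * posting 1-finger motion instead of starting a 2-finger scroll. */
    static enum clickpad_action
    classify_clickpad_touch(int button_down, int finger_count)
    {
        if (finger_count == 2 && !button_down)
            return ACTION_SCROLL;   /* normal 2-finger scroll gesture */
        return ACTION_MOVE;         /* treat as 1-finger movement */
    }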

The whole XInput 2.1 effort is in active development, so it's hard to
answer where this clickpad logic should be implemented.  Today it would
be in xf86-input-synaptics, but I'm not sure whether it would need to
move to something like utouch in the future.  It's kind of a special
case and perhaps will always stay in xf86-input-synaptics.

Chris


Re: [PATCH synaptics 0/4] synaptics cleanup patches

2011-02-20 Thread Chris Bagwell
For the series:

Reviewed-by: Chris Bagwell ch...@cnpbagwell.com


On Wed, Feb 16, 2011 at 7:19 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:

 Just a few cleanup patches I had in my tree for a while. Shouldn't change
 the driver behaviour, at least I didn't notice anything yet.

 Cheers,
  Peter


Re: EVIOC mechanism for MT slots

2011-01-20 Thread Chris Bagwell
On Thu, Jan 20, 2011 at 7:45 PM, Ping Cheng pingli...@gmail.com wrote:
 Hi Dmitry,

 Rafi's request is a good use case for the input: mt: Add EVIOC
 mechanism for MT slots patchset that Henrik submitted last May. From
 the MT X driver experience we had in the last few months, retrieving
 all active contacts, especially in the case when different tool types
 are supported on the same logical port, is necessary to initialize the
 tools properly.

 Can you consider to merge the patchset into 2.6.38?

 Thank you.

 Ping

Agree.  Although X/Y will change often, the tracking ID is stable.  So
during application start-up it would be really useful to query any
pre-existing ABS_MT_TRACKING_ID, so that the user doesn't have to lift
the object/hand/whatever before the application starts working.
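
Purely as an illustration of how the proposed (and not yet merged)
interface could be used at start-up - select a slot with EVIOCSABS, then
read it back with EVIOCGABS, as described in the quoted thread below.
The details are obviously subject to whatever form the patchset finally
takes:

    #include <sys/ioctl.h>
    #include <linux/input.h>

    /* Illustrative only: does slot 0 already hold an active contact? */
    static int slot0_has_contact(int fd)
    {
        struct input_absinfo slot = { .value = 0 };  /* select slot 0 */
        struct input_absinfo tid;

        if (ioctl(fd, EVIOCSABS(ABS_MT_SLOT), &slot) < 0)
            return -1;
        if (ioctl(fd, EVIOCGABS(ABS_MT_TRACKING_ID), &tid) < 0)
            return -1;
        return tid.value >= 0;   /* a tracking id of -1 means empty */
    }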

Chris


  Original Message 
 Subject:        Re: [PATCH 0/2] input: mt: Add EVIOC mechanism for MT slots
 Date:   Thu, 27 May 2010 16:12:20 -0700
 From:   Dmitry Torokhov dmitry.torok...@gmail.com
 To:     Ping Cheng pingli...@gmail.com
 CC:     Henrik Rydberg rydb...@euromail.se, Andrew Morton
 a...@linux-foundation.org, linux-in...@vger.kernel.org,
 linux-ker...@vger.kernel.org, Mika Kuoppala mika.kuopp...@nokia.com,
 Peter Hutterer peter.hutte...@who-t.net, Benjamin Tissoires
 tisso...@cena.fr, Stephane Chatty cha...@enac.fr, Rafi Rubin
 r...@seas.upenn.edu, Michael Poole mdpo...@troilus.org


 On Thursday, May 27, 2010 03:59:37 pm Ping Cheng wrote:

     On Thu, May 27, 2010 at 12:03 AM, Dmitry Torokhov

     dmitry.torok...@gmail.com  wrote:
       On Wed, May 26, 2010 at 08:59:35AM -0700, Ping Cheng wrote:
       On Tue, May 25, 2010 at 1:23 PM, Dmitry Torokhov
     
       dmitry.torok...@gmail.com  wrote:
         On Tue, May 25, 2010 at 09:52:29PM +0200, Henrik Rydberg wrote:
         Dmitry Torokhov wrote:
           Hi Henrik,
         
           On Tue, May 25, 2010 at 01:52:57PM +0200, Henrik Rydberg wrote:
            These patches are in response to the discussion about input state
            retrieval.
          
            The current EVIOCGABS method does not work with MT slots.  These
            patches provides a mechanism where a slot is first selected via a
            call to EVIOCSABS, after which the corresponding MT events can be
            extracted with calls to EVIOCGABS.
          
            The symmetric operation, to set the MT state via EVIOCSABS, seems
            to violate input data integrity, and is therefore not implemented.
          
            This looks sane, however the question remains - is there any users
            for this data? Like I mentioned, I can see the need to fetch state
            of switches and ranges of absolute axis, and even non-multitouch
            ABS values (due to the fact that some input devices, like sliders,
            may stay in a certain position for long periods of time), but I
            expect multitouch data to be refreshed very quickly.
          
            Thanks.
       
         There were some voices addressing this issue, and the patches are
         here, available for whomever to pick up. Drop them if you wish, I
         will not send them anew.
       
         I'll save them in my queue but will hold off applying until I hear
         userspace folks requesting such functionality.
     
       Hi Dmitry,
     
       You do have a valid point - the (x,y) from a touch object would most
       likely change all the time. Even if the object itself is in a steady
       state on the digitizer, i.e., without any intentional movement, the
       electronic noise would most likely lead to some (x,y) changes. So, the
       chance that we need to retrieve (x,y) is rare.
     
       However, it is possibe that when X driver starts, an object was
       already on the digitizer. And the digitizer is of such a high quality
     
       :), it filtered all the noises so we can not locate the touch without
     
       a EVIOCGABS call.
     
       Plus, from a pure coding/development point of view, it is not a bad
       practice to provide the equivalent features for _MT_ support as we did
       for the existing input devices. At least, it doesn't hurt to make the
       support consistent across devices/tools (considering touch as a new
       input device/tool).
     
       Ping,
     
       I did not say that there was a problem with the patch, I agree with it.
       However if no one using this - why should we bother? Will _you_ utilize
       this functionality in Wacom X driver? If so let me know and I will 
 merge
       it.

     tbh, I can not say that I will need it in my X driver for sure. But I
     vote for it to be merged.


 Well, at this point I am in no users - no functionality mode, so I will
 only count votes of users :P

 --
 Dmitry

 On Thu, Jan 20, 2011 at 1:10 PM, Rafi Rubin r...@seas.upenn.edu wrote:

 We've come across a little problem with filtered 

Re: [PATCH] ClickPad support v4

2011-01-08 Thread Chris Bagwell
On Fri, Jan 7, 2011 at 8:19 PM, AIC a...@hi.t-com.hr wrote:
 Chris Bagwell ch...@... writes:

 Synaptics Touchpad, model: 1, fw: 7.4,
  id: 0x1e0b1, caps: 0xd04731/0xe4/0x5a0400

 Can you clarify what you mean about not detecting?  Do you mean not
 detected as clickpad?

 Yes, I have read somewhere they identify as Synaptics Clickpad model... I 
 was
 afraid I was missing out on something. Thanks.

 Can you confirm which version of xf86-input-synaptics your testing
 with?  It must be at least 1.3.0 or newer for EmulateTwoFingerMinW to
 be be useful for these type jumps.

 1.3.0 + Yan's patch. I have not yet checked git log of synaptics driver to see
 if any improvements/fixes have been merged recently.

 I forgot to specify this then affects any other click action - the most basic
 desktop action like copy/paste is almost impossible, dragging the window by 
 the
 titlebar as well. Both are actions where you keep the left button pressed 
 while
 you're dragging. I will try to include those in evtest as well.

 If you continue to have issues, I would appreciate if you can use
 evtest tool and record events to log file while your performing

 I will. Thank you.


Thanks for sending the log files to me.  I'll summarize what I saw for
the list's benefit.

Your sample gesture logs showed many cases where EmulateTwoFingerMinW=5
will help.

In addition, you've shown me examples of this clickpad reporting what
I'll call phantom touches.  These were readily visible when I was
playing with the enhanced gesture mode (semi-multi-touch mode) of a
synaptics pad, but I've not seen them before outside that mode.

Short description of phantom touches: if you consider your 2 finger
touches as 2 corners of a rectangle, the firmware will sometimes get
confused and report coordinates related to the other two corners of the
rectangle, which you're not touching.  The phantom touches are pretty
easy to see if you put or move your 2 fingers onto the same X plane.
I'm guessing you may be doing this during your button clicks.

Anyway, there is not much xf86-input-synaptics can do to deal with
this given the little information the kernel currently provides.  The
jumpy cursor patch will help, but it will not solve it totally.

There is a high chance kernel 2.6.38 will have a patch that enables the
enhanced gesture mode.  Once that happens, there will finally be
enough information that xf86-input-synaptics can be enhanced to deal
with these phantom touches and thus stop these particular cursor jumps.

Chris


Re: [PATCH] ClickPad support v4

2011-01-06 Thread Chris Bagwell
On Wed, Jan 5, 2011 at 9:20 PM, AIC a...@hi.t-com.hr wrote:
 Kevin O'Connor ke...@... writes:

 Setting AreaBottomEdge=3800 did make the mouse jitter on button
 press go away (at the cost of a smaller touchpad area).  (Of course,
 this would still need to work once left/right/middle button
 detection is implemented.)

 Hello,

 interesting dicussion. I own another HP laptop, the 4320s, with a
 ClickPad.  Situation is pretty bad on that one. Patch from Yan Li
 enables the right and middle click (thank you). But never solves the
 jump issue, nor does the Add-new-option-JumpyCursorThreshold-v5.patch

 If the synaptics driver in the kernel is supposed to recognize it as
 such, it does not in this case: Synaptics Touchpad, model: 1, fw: 7.4,
 id: 0x1e0b1, caps: 0xd04731/0xe4/0x5a0400

Can you clarify what you mean about not detecting?  Do you mean not
detected as clickpad?

Your 0x0c extended capabilities show bit 0x10 set, which the current
kernel uses as part of clickpad detection.  Also, if Yan's patch is of
use to you, that means the kernel is only reporting BTN_LEFT, which
also indicates it is detected as a clickpad by the kernel.


 I've been looking forward to Takashi's fixes, but barely none are in
 the kernel yet. I had a chance to use SuSE and the pad was OK, jumpy
 though, also the LED to disable it in the top left corner was working.


 I tried to tweak just about every setting, from palm detection,
 reducing the area etc. Nothing helped. If I move the thumb in the
 position for a click and a finger is already on the touchpad these
 mostly horizontal jumps occur. If I swipe the thumb into position it
 moves across the screen in a huge jump. It doesn't seem to be
 connected to horizontal scrolling in any way, I also disabled it.

Make sure palm detection is off, or at least that the palm width value
is higher than the value of EmulateTwoFingerMinW, as they compete with
each other.  EmulateTwoFingerMinW is the extremely important property
for the types of jumps you're describing here.  If you can't get that
working then the jumps will occur.

Can you confirm which version of xf86-input-synaptics you're testing
with?  It must be 1.3.0 or newer for EmulateTwoFingerMinW to be useful
for these types of jumps.


 I can also confirm that setting EmulateTwoFingerMinW=5

 The jump while clicking still occures as ever. With any value. :(

If you continue to have issues, I would appreciate it if you could use
the evtest tool, record events to a log file while performing the
sequences that cause jumps, and send the logs directly to me.  From
those I can verify whether it's a configuration issue with
xf86-input-synaptics or something that just hasn't been accounted for
yet in xf86-input-synaptics.

Thanks,
Chris


Re: [PATCH] ClickPad support v4

2010-12-28 Thread Chris Bagwell
On Tue, Dec 28, 2010 at 8:54 PM, Yan Li yan.i...@intel.com wrote:
 On Tue, 2010-12-28 at 02:18 +0800, Chris Bagwell wrote:
 On Sat, Dec 18, 2010 at 9:50 PM, Matt Rogers ma...@kde.org wrote:
  This is the use case that I prefer, and at least for me, the one causing 
  the
  most issues. The fact that I can't use it in this way right now drives me
  nuts. :)

 I've been thinking about this one and it looks like we will need a
 gesture delay in xf86-input-wacom soon.  If two finger touch is
 detected with in X ms of initial touch then allow current 2 finger
 scroll logic.  If its more than X ms apart (at least on clickpads)
 then assume its a click-and-drag and disable detection of 2 finger
 scroll.

 Sorry, why wacom? Shouldn't it be xf86-input-synaptics?

Oops, I meant xf86-input-synaptics.  I often work on xf86-input-wacom,
so I must have had it on my mind.


 Can you give a little info on issue your seeing?  Is it because it
 enters 2-finger scroll mode or is it cursor jump?  If cursor jump, can
 you confirm your using xf86-input-synaptics 1.3.0 or later (which has
 something to address cursor jump for known cases)?

 I have tested a vanilla xf86-input-synaptics 1.3.0 driver and it doesn't
 prevent the jumpy issue because this ClickPad used in Lenovo IdeaPad
 S10-3t doesn't support detecting of 2-finger nor finger width. So I
 suspect the touchpad just sends out X/Y of either 1st or 2nd finger
 randomly, thus caused the jumpy cursor.

 I'll try to catch the event by using evtest later. But the symptom is
 just like you put two fingers onto any touchpad that can't support
 2-finger nor finger width.

I think you answered my bigger question in your other email.  I'm not
so surprised that the clickpad reports coordinates that jump between the
2 fingers, but I am surprised if that same hardware doesn't report
either finger width or finger count to the application so that it can
account for the transitions.

If your clickpad doesn't report either of those two things, then I'm
more interested in figuring out why it doesn't do that than in seeing
evtest output.

BTW: if your clickpad works like the HP Mini's, then setting
EmulateTwoFingerMinW to 5 should help you out... but it sounds like it
doesn't.

Thanks,
Chris


Re: [PATCH] ClickPad support v4

2010-12-28 Thread Chris Bagwell
On Tue, Dec 28, 2010 at 9:17 PM, Yan Li yan.i...@intel.com wrote:
 On Tue, 2010-12-28 at 05:04 +0800, Chris Bagwell wrote:
 Since xf86-input-synaptics 1.3.0 handles (X,Y) jumps during finger
 transitions, if clickpad users set their EmulateTwoFingerMinW to 5 I
 suspect they will see a lot less big cursor jumps... but they will
 still see small cursor movement click operations.  Users could make
 use of AreaBottomEdge to shrink size of touchpad to work around the
 smaller movement.

 Thanks for the detailed analysis. They match what I'm thinking about.

 Are you sure after limiting AreaBottomEdge the driver can still detect
 X/Y from the buttom area, which is needed for deciding whether it's a
 left or right click? I'm reworking my ClickPad patch to remove the
 border control logic.


Well, the current Area*Edge handling has at least one issue.  The main
issue is that it uses brute force to reset state, which will sometimes
confuse 2-finger usage.  Somehow numFingers needs to be maintained
better.

While that's being done, I'm guessing we can account for clickpads by
keeping the X/Y values around.

Chris


Re: [PATCH] ClickPad support v4

2010-12-27 Thread Chris Bagwell
On Sat, Dec 18, 2010 at 9:50 PM, Matt Rogers ma...@kde.org wrote:
 On Sat, Dec 18, 2010 at 11:38 AM, Chris Bagwell ch...@cnpbagwell.com

 Thank for detailed reply.  Do you mind helping me understand how
 touchpad is being used when jumps occur?  Is use case:

 * Move cursor to area you want to click with 1 finger.  Pick up 1
 finger.  Click in button area with 1 finger.

 or

 * Move cursor to area you want to cick with 1 finger.  Leave 1 finger
 on pad.  Click in button area with 2nd finger.


 This is the use case that I prefer, and at least for me, the one causing the
 most issues. The fact that I can't use it in this way right now drives me
 nuts. :)

I've been thinking about this one, and it looks like we will need a
gesture delay in xf86-input-wacom soon.  If a two-finger touch is
detected within X ms of the initial touch, then allow the current
2-finger scroll logic.  If it's more than X ms apart (at least on
clickpads), then assume it's a click-and-drag and disable detection of
2-finger scroll.

Can you give a little info on the issue you're seeing?  Is it because it
enters 2-finger scroll mode, or is it a cursor jump?  If a cursor jump,
can you confirm you're using xf86-input-synaptics 1.3.0 or later (which
has something to address cursor jumps for known cases)?

Once the delay concept is there, it also helps the 1-finger use case.
For clickpads, we can enable Area*Edge filtering for X ms only, to
debounce button clicks, which then allows the whole pad to still be used
as long as the touch lasts longer than X ms and no button is currently
pressed.

Chris


Re: [PATCH] ClickPad support v4

2010-12-27 Thread Chris Bagwell
On Sat, Dec 25, 2010 at 12:48 PM, Kevin O'Connor ke...@koconnor.net wrote:

 Hi Chris,

 I noticed this message on the xorg-devel mailing list:

 Thank for detailed reply.  Do you mind helping me understand how
 touchpad is being used when jumps occur?  Is use case:

 * Move cursor to area you want to click with 1 finger.  Pick up 1
 finger.  Click in button area with 1 finger.

 or

 * Move cursor to area you want to cick with 1 finger.  Leave 1 finger
 on pad.  Click in button area with 2nd finger.

 First case would move cursor a little bit because of how sensitive
 touchpads are and lots of X/Y can be sent between time of touch and
 button press completed.  But it should not be large jump.  We may need
 to add a gesture style delay when touching in button area so we can
 tell difference between button press and real movement.

 I have an hp-mini notebook with a clickpad device.  The problem I most
 run into is the first issue - when applying pressure to the pad to
 active the click button the cursor will move slightly making it
 difficult to click small buttons.  However, I rarely attempt to click
 with a second finger.

 It would be nice to be able to use the full touchpad area for
 movements, but if I had to trade between jumpy button presses or
 smaller touchpad area, I'd choose having a smaller touchpad area.

 Second case seems more likely to cause a jump but a) synpatics
 hardware continues to report 1st finger's X/Y during double touch and
 b) we have code to expect a jump just in case during that second touch
 based on either doubletap or finger width.  Are you working with
 non-synaptics clickpads?

 Is there any chance you could reproduce basic sequence when jumps
 occur but using evtest and send me the output?

 This is one of those problems really buggy me for some reason and I'd
 like to help resolve it.

 I'm not familiar with xorg development, but I am happy to help.  If
 you're still interested in this info I'll see if I can get evtest
 running.

Thank you, Kevin, for sending me the log files.  It is a small sample
of gestures, but I think it gives a pretty accurate picture.  I'll
summarize here for everyone.

This HP Mini clickpad behaves differently than my non-clickpad does and
isn't accounted for in xf86-input-synaptics.

* There are between 5-10 sync reports with X/Y updates both before and
after the BTN_LEFT report comes.  We currently ignore the first 3
reports after touch, but that's not quite enough.  We do nothing upon
BTN_LEFT release.  We will need either a time-based or sample-count
based hold-off to account for those, so people don't get relatively
small but annoying cursor movements while clicking.

* The clickpad seems to only report finger widths of 4 during 1 touch
and 5 during 2 touches.  My touchpad reports 4 for 1 touch but gives a
much wider range for 2 touches.  Since the default for 2-finger
emulation is a W of 6, that logic never kicks in.

* The (X,Y) finger tracking is not what the synaptics hardware docs
say.  My touchpad aligns with their specs, which say (X,Y) always tracks
the first finger touch.  The clickpad seems to switch tracking to
whatever makes most sense, and seems to do the sane thing of only
changing at transitions of the width 4/5 values.

Since xf86-input-synaptics 1.3.0 handles (X,Y) jumps during finger
transitions, if clickpad users set their EmulateTwoFingerMinW to 5 I
suspect they will see far fewer big cursor jumps... but they will
still see small cursor movements during click operations.  Users could
make use of AreaBottomEdge to shrink the usable size of the touchpad to
work around the smaller movement.
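
For anyone who wants to experiment, both knobs can be set at runtime
with synclient, using the values discussed in this thread (adjust
AreaBottomEdge to your pad's coordinate range):

    synclient EmulateTwoFingerMinW=5
    synclient AreaBottomEdge=3800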

Chris


Re: [PATCH evdev 3/3] Add use_proximity bit for BTN_TOOL handling.

2010-12-20 Thread Chris Bagwell
I ack on patch concept.  Its easy to see this invalid data on
synaptics hardware and needs to be accounted for.

I'm not up on current proximity support in evdev but patch makes sense
overall.  Just two comments to consider.

1) Will BTN_TOUCH always be sent when you need it to?  I'm wondering
if an if() is needed at BTN_TOOL_FINGER to prevent in_proximity from
being set in first place.  The below is needed to re-turn it back on.

2) Touchscreens have the same concept.  Until BTN_TOUCH, it would be
safest to ignore the reported X/Y values.  Of course, the majority of
touchscreens do not send BTN_TOOL_FINGER right now, but I think that will
change.
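
For the touchscreen case, the kind of filtering I mean is roughly this
(sketch only; "touching" and handle_abs() are made-up names, not evdev code):

    switch (ev->type) {
    case EV_KEY:
        if (ev->code == BTN_TOUCH)
            touching = ev->value;
        break;
    case EV_ABS:
        if (!touching && (ev->code == ABS_X || ev->code == ABS_Y))
            break;              /* drop hover/garbage coordinates */
        handle_abs(ev);
        break;
    }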

Chris

On Mon, Dec 20, 2010 at 7:21 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 Touchpads send garbage data between BTN_TOOL_FINGER and BTN_TOUCH. This
 leads to cursor movement towards invalid positions (bottom left corner,
 usually).

 Add a new flag use_proximity as a delimiter for BTN_TOUCH handling. If
 unset, the actual proximity bits are ignored, no proximity events are sent
 and BTN_TOUCH is used for the tool handling.

 Example event stream for synaptics:

 Event: time 1292893041.002731, -- Report Sync 
 Event: time 1292893041.015807, type 1 (Key), code 330 (Touch), value 0
 Event: time 1292893041.015812, type 3 (Absolute), code 0 (X), value 4283
 Event: time 1292893041.015813, type 3 (Absolute), code 1 (Y), value 4860
 Event: time 1292893041.015815, type 3 (Absolute), code 24 (Pressure), value 23
 Event: time 1292893041.015817, type 3 (Absolute), code 28 (Tool Width), value 
 5
 Event: time 1292893041.027537, -- Report Sync 
 Event: time 1292893041.038854, type 3 (Absolute), code 0 (X), value 1
 Event: time 1292893041.038857, type 3 (Absolute), code 1 (Y), value 5855
 Event: time 1292893041.038859, type 3 (Absolute), code 24 (Pressure), value 1
 Event: time 1292893041.038861, type 3 (Absolute), code 28 (Tool Width), value 
 5
 Event: time 1292893041.038864, -- Report Sync 
 Event: time 1292893041.062432, type 3 (Absolute), code 24 (Pressure), value 0
 Event: time 1292893041.062435, type 3 (Absolute), code 28 (Tool Width), value  0
 Event: time 1292893041.062437, type 1 (Key), code 325 (ToolFinger), value 0
 Event: time 1292893041.062438, -- Report Sync 

 Reported-by: Dave Airlie airl...@redhat.com
 Signed-off-by: Peter Hutterer peter.hutte...@who-t.net
 ---
  src/evdev.c |   13 -
  src/evdev.h |    1 +
  2 files changed, 13 insertions(+), 1 deletions(-)

 diff --git a/src/evdev.c b/src/evdev.c
 index b6591ce..50847a8 100644
 --- a/src/evdev.c
 +++ b/src/evdev.c
 @@ -486,6 +486,9 @@ EvdevProcessProximityEvent(InputInfoPtr pInfo, struct 
 input_event *ev)
  {
     EvdevPtr pEvdev = pInfo-private;

 +    if (!pEvdev-use_proximity)
 +        return;
 +
     pEvdev-prox_queued = 1;

     EvdevQueueProximityEvent(pInfo, ev-value);
 @@ -679,7 +682,10 @@ EvdevProcessKeyEvent(InputInfoPtr pInfo, struct 
 input_event *ev)

     switch (ev-code) {
         case BTN_TOUCH:
 -            pEvdev->in_proximity = value ? ev->code : 0;
 +            /* For devices that have but don't use proximity, use
 +             * BTN_TOUCH as the proximity notifier */
 +            if (!pEvdev->use_proximity)
 +                pEvdev->in_proximity = value ? ev->code : 0;
              if (!(pEvdev->flags & (EVDEV_TOUCHSCREEN | EVDEV_TABLET)))
                 break;
             /* Treat BTN_TOUCH from devices that only have BTN_TOUCH as
 @@ -1346,6 +1352,9 @@ EvdevAddAbsClass(DeviceIntPtr device)

      for (i = 0; i < ArrayLength(proximity_bits); i++)
      {
 +        if (!pEvdev->use_proximity)
 +            break;
 +
          if (TestBit(proximity_bits[i], pEvdev->key_bitmask))
         {
             InitProximityClassDeviceStruct(device);
 @@ -2039,6 +2048,7 @@ EvdevProbe(InputInfoPtr pInfo)
        if (pEvdev->flags & EVDEV_TOUCHPAD) {
            xf86Msg(X_INFO, "%s: Configuring as touchpad\n", pInfo->name);
            pInfo->type_name = XI_TOUCHPAD;
 +           pEvdev->use_proximity = 0;
        } else if (pEvdev->flags & EVDEV_TABLET) {
            xf86Msg(X_INFO, "%s: Configuring as tablet\n", pInfo->name);
            pInfo->type_name = XI_TABLET;
 @@ -2205,6 +2215,7 @@ EvdevPreInit(InputDriverPtr drv, InputInfoPtr pInfo, 
 int flags)
      * proximity will still report events.
      */
     pEvdev-in_proximity = 1;
 +    pEvdev-use_proximity = 1;

     /* Grabbing the event device stops in-kernel event forwarding. In other
        words, it disables rfkill and the Macintosh mouse button emulation.
 diff --git a/src/evdev.h b/src/evdev.h
 index b04f961..f640fdd 100644
 --- a/src/evdev.h
 +++ b/src/evdev.h
 @@ -126,6 +126,7 @@ typedef struct {

     int flags;
     int in_proximity;           /* device in proximity */
 +    int use_proximity;          /* using the proximity bit? */
     int num_buttons;            /* number of buttons */
     BOOL swap_axes;
     BOOL invert_x;
 --
 

Re: [PATCH evdev 3/3] Add use_proximity bit for BTN_TOOL handling.

2010-12-20 Thread Chris Bagwell
On Mon, Dec 20, 2010 at 9:28 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 On Mon, 20 Dec 2010 20:38:30 -0600, Chris Bagwell ch...@cnpbagwell.com 
 wrote:
 I ack on patch concept.  Its easy to see this invalid data on
 synaptics hardware and needs to be accounted for.

 I'm not up on current proximity support in evdev but patch makes sense
 overall.  Just two comments to consider.

 1) Will BTN_TOUCH always be sent when you need it to?  I'm wondering
 if an if() is needed at BTN_TOOL_FINGER to prevent in_proximity from
 being set in first place.  The below is needed to re-turn it back on.

 not sure I fully understood the question, but:
 in_proximity is set to 1 by default intentionally (there's a comment in
 the code but it's cut off by the context). it's so that proximity is
 always set for those devices that don't report proximity. this way we
 ensure that data coming from the device is posted.

What I meant is that the touchpad starts out at BTN_TOOL_FINGER=0 and
BTN_TOUCH=0.  Let's say the user is hovering slightly, so X/Y values are
sent but there isn't enough pressure to send BTN_TOUCH=1.

Since BTN_TOUCH=0 when the driver started, you'll never get an event to
trigger the in_proximity=0 setting, right?  Maybe this is only an issue at
device start-up.

If I recall correctly, most of these invalid events occur when going from
touch to no touch.  So what I'm worrying about is probably not an
issue in the wild.
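
If the start-up case ever does matter, one hypothetical way to cover it
would be to query the initial BTN_TOUCH state once at init time, e.g.:

    /* Sketch only -- not part of the patch.  Returns non-zero if BTN_TOUCH
     * is currently down, using the EVIOCGKEY state snapshot. */
    #include <linux/input.h>
    #include <string.h>
    #include <sys/ioctl.h>

    static int btn_touch_down(int fd)
    {
        unsigned long keys[KEY_MAX / (8 * sizeof(unsigned long)) + 1];

        memset(keys, 0, sizeof(keys));
        if (ioctl(fd, EVIOCGKEY(sizeof(keys)), keys) < 0)
            return 0;
        return !!(keys[BTN_TOUCH / (8 * sizeof(unsigned long))] &
                  (1UL << (BTN_TOUCH % (8 * sizeof(unsigned long)))));
    }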

Anyways:

Reviewed-by: Chris Bagwell ch...@cnpbagwell.com
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH] ClickPad support v4

2010-12-18 Thread Chris Bagwell
On Sat, Dec 18, 2010 at 7:17 AM, Yan Li yan.i...@intel.com wrote:
 On Sat, 2010-12-18 at 03:18 +0800, Chris Bagwell wrote:
 OK, I've re-reviewed patch and I've decided I understand what its
 trying to do now.  Most my original comments still apply but I've
 added new ones.

 First, I need to confirm intent of patch is this:

 * Create a rectangle defined by {Top|Bottom|Left|Right}Edge that
 excludes button area in attempt to cause cursor not to move when in
 that area.

 This was indeed the intent of Iwai's patch, on which my v4 was based.
 However, based on my recent testing on several different models of
 touchpad, I think that was not the best solution.  The design goal
 of ClickPad is to remove the physical buttons so that the space they used
 can be saved and the touchpad can be enlarged on small netbooks with
 very limited surface space.  Therefore it was wrong on the software side
 to limit the area a user can touch, because this went against the
 original idea of using a clickable touchpad.  With this patch, the
 touchable area was limited to a very small region, which is not a good user
 experience.

 I've tested the official driver from Synaptics in Windows, and it
 doesn't restrict the touchable area, which means the whole pad is
 touchable and clickable.  The reason we chose the current solution
 was actually the jumpy cursor -- when the user is touching the pad to
 move the cursor and at the same time uses another finger to click the
 lower clickable area, the cursor jumps unexpectedly.  The old
 solution used in this patch was to limit the touchable area and ignore
 abs events sent from the button area.  But since then I have shifted my
 focus from this old solution to fixing the jumpy cursor problem instead.

 I've carefully examined the jumpy cursor problem found on the Lenovo S10-3t,
 whose touchpad supports neither two-finger detection nor finger width.  I
 finally found that Alberto Milone's patch from bug #21614 is the best
 solution, and I've ported it to the latest HEAD:
 https://bugs.freedesktop.org/attachment.cgi?id=40902

 So I suggest we rework this ClickPad patch, keep only the click
 interpretation part, remove the area limit, and try to fix the jumpy
 cursor problem (I'm using the JumpyCursorThreshold patch v5 linked
 above in MeeGo, and so far the feedback is very good).


Thanks for the detailed reply.  Do you mind helping me understand how the
touchpad is being used when jumps occur?  Is the use case:

* Move the cursor to the area you want to click with 1 finger.  Pick up
that finger.  Click in the button area with 1 finger.

or

* Move the cursor to the area you want to click with 1 finger.  Leave that
finger on the pad.  Click in the button area with a 2nd finger.

The first case would move the cursor a little bit because of how sensitive
touchpads are and how many X/Y events can be sent between the time of touch
and the completed button press.  But it should not be a large jump.  We may
need to add a gesture-style delay when touching in the button area so we can
tell the difference between a button press and real movement.

The second case seems more likely to cause a jump, but a) synaptics
hardware continues to report the 1st finger's X/Y during a double touch,
and b) we have code to expect a jump during that second touch, based on
either doubletap or finger width, just in case.  Are you working with
non-synaptics clickpads?

Is there any chance you could reproduce the basic sequence when jumps
occur using evtest and send me the output?

This is one of those problems that really bugs me for some reason, and I'd
like to help resolve it.

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH] ClickPad support v4

2010-12-17 Thread Chris Bagwell
OK, I've re-reviewed the patch, and I've decided I understand what it's
trying to do now.  Most of my original comments still apply, but I've
added new ones.

First, I need to confirm that the intent of the patch is this:

* Create a rectangle defined by {Top|Bottom|Left|Right}Edge that
excludes button area in attempt to cause cursor not to move when in
that area.

* Allow edge scrolling, circular scrolling, and tap gestures (but not
cursor movement) to work even when outside this smaller area.

Can you confirm this is the intent?  Once I know, I can provide better
feedback and maybe even re-send updated patches.

We have Area*Edge defined to reduce the touchpad area for movement and
gestures, but it sounds like clickpads may need a new setting to
define an area that stops only movement but still allows gestures?

The definition of {Top|Bottom|Left|Right}Edge is a little fuzzy to me.
Maybe it's OK for movement-only control.  Peter, what do you think?

More below.

On Wed, Dec 8, 2010 at 1:55 AM, Yan Li yan.i...@intel.com wrote:
 This patch adds the support for Synaptics Clickpad devices.
 It requires the change in Linux kernel synaptics input driver, found in
    https://patchwork.kernel.org/patch/92435/
 The kernel patch is already included in 2.6.34 and later releases.

 When the kernel driver sets only the left-button bit evbit and no
 multi-finger is possible, Clickpad mode is activated.  In this mode,
 the bottom touch area is used as button emulations.  Clicking at the
 bottom-left, bottom-center and bottom-right zone corresponds to a left,
 center and right click.

 v2-v3: Fix the mis-detection of Clickpad device with double-tap feature
        (e.g. MacBook)
        Fix one forgotten spacing issue Peter suggested

 v3-v4: Ported to HEAD by Yan Li for MeeGo, also added ClickPad
        description to man page.

 Signed-off-by: Takashi Iwai tiwai at suse.de
 Signed-off-by: Yan Li yan.i...@intel.com
 ---
  man/synaptics.man  |    8 ++
  src/eventcomm.c    |    7 +
  src/synaptics.c    |   72 
 +++-
  src/synapticsstr.h |    2 +
  4 files changed, 88 insertions(+), 1 deletions(-)

 diff --git a/man/synaptics.man b/man/synaptics.man
 index 3f1ca9d..25f1115 100644
 --- a/man/synaptics.man
 +++ b/man/synaptics.man
 @@ -56,6 +56,14 @@ Pressure dependent motion speed.
  .IP \(bu 4
  Run-time configuration using shared memory. This means you can change
  parameter settings without restarting the X server.
 +.IP \(bu 4
 +Synaptics ClickPad support: ClickPad is a new kind of device from
 +Synaptics that has no visible physical keys. Instead, the whole board
 +is clickable and the device sends out BTN_MIDDLE only. It's the
 +driver's duty to judge whether the click is a left or right one
 +according to finger location. If the driver detects that the touchpad
 +has only one button, the ClickPad mode will be activated and handles
 +the action correctly.
  .LP
  Note that depending on the touchpad firmware, some of these features
  might be available even without using the synaptics driver. Note also
 diff --git a/src/eventcomm.c b/src/eventcomm.c
 index faa66ab..7da5a40 100644
 --- a/src/eventcomm.c
 +++ b/src/eventcomm.c
 @@ -269,6 +269,13 @@ event_query_axis_ranges(LocalDevicePtr local)
        }

        xf86Msg(X_PROBED, %s: buttons:%s\n, local-name, buf);
 +
 +       /* clickpad device reports only the single left button mask */
 +       if (priv->has_left && !priv->has_right && !priv->has_middle && 
 !priv->has_double) {
 +               priv->is_clickpad = TRUE;
 +               xf86Msg(X_INFO, "%s: is Clickpad device\n", local->name);
 +       }
 +
     }
  }

 diff --git a/src/synaptics.c b/src/synaptics.c
 index 53c3685..2e5f8ae 100644
 --- a/src/synaptics.c
 +++ b/src/synaptics.c
 @@ -506,6 +506,18 @@ static void set_default_parameters(LocalDevicePtr local)
         vertResolution = priv-resy;
     }

 +    /* Clickpad mode -- bottom area is used as buttons */
 +    if (priv-is_clickpad) {
 +        int button_bottom;
 +    /* Clickpad devices usually the button area at the bottom, and
 +     * its size seems ca. 20% of the touchpad height no matter how
 +     * large the pad is.
 +     */
 +    button_bottom = priv->maxy - (abs(priv->maxy - priv->miny) * 20) / 100;
 +    if (button_bottom < b && button_bottom >= t)
 +        b = button_bottom;
 +    }
 +
     /* set the parameters */
     pars-left_edge = xf86SetIntOption(opts, LeftEdge, l);
     pars-right_edge = xf86SetIntOption(opts, RightEdge, r);
 @@ -2153,6 +2165,59 @@ handle_clickfinger(SynapticsParameters *para, struct 
 SynapticsHwState *hw)
     }
  }

 +/* clickpad event handling */
 +static void
 +HandleClickpad(LocalDevicePtr local, struct SynapticsHwState *hw, edge_type 
 edge)
 +{
 +    SynapticsPrivate *priv = (SynapticsPrivate *) (local-private);
 +    SynapticsParameters *para = priv-synpara;
 +
 +    if (edge & BOTTOM_EDGE) {
 +   /* button area */
 +   int width = priv-maxx - priv-minx;
 +   int left_button_x, right_button_x;
 +
 +  

Re: [PATCH] ClickPad support v4

2010-12-09 Thread Chris Bagwell
On Wed, Dec 8, 2010 at 1:55 AM, Yan Li yan.i...@intel.com wrote:
 This patch adds the support for Synaptics Clickpad devices.
 It requires the change in Linux kernel synaptics input driver, found in
    https://patchwork.kernel.org/patch/92435/
 The kernel patch is already included in 2.6.34 and later releases.

 When the kernel driver sets only the left-button bit evbit and no
 multi-finger is possible, Clickpad mode is activated.  In this mode,
 the bottom touch area is used as button emulations.  Clicking at the
 bottom-left, bottom-center and bottom-right zone corresponds to a left,
 center and right click.

 v2-v3: Fix the mis-detection of Clickpad device with double-tap feature
        (e.g. MacBook)
        Fix one forgotten spacing issue Peter suggested

 v3-v4: Ported to HEAD by Yan Li for MeeGo, also added ClickPad
        description to man page.

 Signed-off-by: Takashi Iwai tiwai at suse.de
 Signed-off-by: Yan Li yan.i...@intel.com
 ---
  man/synaptics.man  |    8 ++
  src/eventcomm.c    |    7 +
  src/synaptics.c    |   72 
 +++-
  src/synapticsstr.h |    2 +
  4 files changed, 88 insertions(+), 1 deletions(-)

 diff --git a/man/synaptics.man b/man/synaptics.man
 index 3f1ca9d..25f1115 100644
 --- a/man/synaptics.man
 +++ b/man/synaptics.man
 @@ -56,6 +56,14 @@ Pressure dependent motion speed.
  .IP \(bu 4
  Run-time configuration using shared memory. This means you can change
  parameter settings without restarting the X server.
 +.IP \(bu 4
 +Synaptics ClickPad support: ClickPad is a new kind of device from
 +Synaptics that has no visible physical keys. Instead, the whole board
 +is clickable and the device sends out BTN_MIDDLE only. It's the
 +driver's duty to judge whether the click is a left or right one
 +according to finger location. If the driver detects that the touchpad
 +has only one button, the ClickPad mode will be activated and handles
 +the action correctly.

I'm not sure a man page entry is needed.  There is nothing configurable for
the user.  The single button is more internal knowledge.
Now, if we expose a ClickPad property, then that's a different story.

  .LP
  Note that depending on the touchpad firmware, some of these features
  might be available even without using the synaptics driver. Note also
 diff --git a/src/eventcomm.c b/src/eventcomm.c
 index faa66ab..7da5a40 100644
 --- a/src/eventcomm.c
 +++ b/src/eventcomm.c
 @@ -269,6 +269,13 @@ event_query_axis_ranges(LocalDevicePtr local)
        }

        xf86Msg(X_PROBED, %s: buttons:%s\n, local-name, buf);
 +
 +       /* clickpad device reports only the single left button mask */
 +       if (priv->has_left && !priv->has_right && !priv->has_middle && 
 !priv->has_double) {
 +               priv->is_clickpad = TRUE;
 +               xf86Msg(X_INFO, "%s: is Clickpad device\n", local->name);
 +       }
 +

The part about !priv->has_double should probably be removed.  For
synaptics at least, there is a good chance these clickpads will start
reporting has_double with some new kernel patches.

There is talk of an ioctl() to query whether a touchpad is a clickpad soon
as well, but I think single-button detection is OK short term.

     }
  }

 diff --git a/src/synaptics.c b/src/synaptics.c
 index 53c3685..2e5f8ae 100644
 --- a/src/synaptics.c
 +++ b/src/synaptics.c
 @@ -506,6 +506,18 @@ static void set_default_parameters(LocalDevicePtr local)
         vertResolution = priv-resy;
     }

 +    /* Clickpad mode -- bottom area is used as buttons */
 +    if (priv-is_clickpad) {
 +        int button_bottom;
 +    /* Clickpad devices usually the button area at the bottom, and
 +     * its size seems ca. 20% of the touchpad height no matter how
 +     * large the pad is.
 +     */
 +    button_bottom = priv->maxy - (abs(priv->maxy - priv->miny) * 20) / 100;
 +    if (button_bottom < b && button_bottom >= t)
 +        b = button_bottom;
 +    }
 +
     /* set the parameters */
     pars-left_edge = xf86SetIntOption(opts, LeftEdge, l);
     pars-right_edge = xf86SetIntOption(opts, RightEdge, r);
 @@ -2153,6 +2165,59 @@ handle_clickfinger(SynapticsParameters *para, struct 
 SynapticsHwState *hw)
     }
  }

 +/* clickpad event handling */
 +static void
 +HandleClickpad(LocalDevicePtr local, struct SynapticsHwState *hw, edge_type 
 edge)
 +{
 +    SynapticsPrivate *priv = (SynapticsPrivate *) (local-private);
 +    SynapticsParameters *para = priv-synpara;
 +
 +    if (edge & BOTTOM_EDGE) {
 +   /* button area */
 +   int width = priv-maxx - priv-minx;
 +   int left_button_x, right_button_x;
 +
 +   /* left and right clickpad button ranges;
 +    * the gap between them is interpreted as a middle-button click
 +    */
 +   left_button_x = width * 2 / 5 + priv-minx;
 +   right_button_x = width * 3 / 5 + priv-minx;
 +
 +   /* clickpad reports only one button, and we need
 +    * to fake left/right buttons depending on the touch position
 +    */
 +   if (hw-left) { /* clicked? */
 +       hw-left = 0;
 +       if 

Re: [RFC] Multi-Touch (MT) support - arbitration or not

2010-11-10 Thread Chris Bagwell
On Tue, Nov 9, 2010 at 10:46 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 On Mon, Nov 08, 2010 at 10:14:56PM -0600, Chris Bagwell wrote:

 I've copied #2 below and added my own text in [] to be sure and
 clarify text in context of case #2.

 2.     Report first finger touch as ABS_X/Y events [on touch input
 device] when pen is not in
 prox.  Arbitrating single touch data [on touch input device] when pen
 is in prox. Pen data is
 reported as ABS_X/Y events [on pen input device]. Both ABS_X/Y for pen
 or the first finger
 and ABS_MT_* for MT data are reported [each MT send].  [MT are even
 sent on touch device even though only 1 in proximity tool possible so
 that client can combine both inputs' events and see same behaviour as
 if it was a single input device.]

 Assuming a split device - why do any filtering at all?
 Report ABS_X/Y for pen on the pen device, touch as MT events on the touch
 device _and_ first finger touch as ABS_X/Y on the touch device. If userspace
 can somehow couple the two devices then it's easy enough to filter touch
 events when necessary.

Who is the audience for these ABS_X/Y events?  We are not sending them for
MT-aware applications' benefit.  As a matter of fact, they slightly
complicate MT-aware apps, because those apps have to filter them in addition
to filtering MT events.

The real audience for ABS_X/Y is either older apps or simple apps
(ones that chose to keep it simple and not support MT events).  This
class of apps can't really be expected to bind two devices logically
and mask.  So without filtering ABS_X/Y on the kernel side, we've
basically made Bamboo drivers unusable with a range of apps, which
probably means xf86-input-{evdev|synaptics} (I can't justify adding
binding logic to those two).

I think we also have some high-level agreement that we should combine
input devices in the kernel long term, with minor technical decisions still
to be made.  So that makes any logical binding code in xf86-input-*
transitional only.

So masking in the kernel for this special pen+touch two-input case is
simple, keeps Bamboo/Ntrig usable with the existing
xf86-input-{evdev|synaptics|wacom}, and keeps userland from writing
that transitional logic.


 Don't report pen as the only MT set on the pen device, that's just
 confusing.

Yeah, agreed.  It's mainly worthwhile only if it solves some large
applications' binding issues, but that's theoretical right now.  No real
need to do it.


 Does this answer your question?

Yes.  Thanks.

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [RFC] Multi-Touch (MT) support - arbitration or not

2010-11-09 Thread Chris Bagwell
On Tue, Nov 9, 2010 at 12:59 AM, Dmitry Torokhov
dmitry.torok...@gmail.com wrote:
 On Tue, Nov 09, 2010 at 01:31:49PM +1000, Peter Hutterer wrote:

 That said, it also goes counter the whole multi-touch approach - allowing
 more than one device on a single physical device.


 So maybe we should teach wacom to handle all devices as a single input device 
 even
 in cases when they use several USB interfaces? We should be able to
 detect related interfaces by examining intf->intf_assoc (I hope) and
 using usb_driver_claim_interface() to claim them.

Thanks for the tips.  I may try it just to prove it's possible.

Here is extra info on the resolution/dimension issue that also needs solving
when combining.  Taken from the current logic on the touch input of the
Wacom Bamboo:

input_mt_create_slots(input_dev, 2);
input_set_abs_params(input_dev, ABS_MT_POSITION_X, 0, features->x_max,
features->x_fuzz, 0);

Combining the 2 inputs means the MT slot count increases from 2 to 3 (2
touches and 1 stylus).

Today, Pen has x_max=17420, x_fuzz=4 and a resolution of 2540.  Also
today, Touch has x_max=15360, x_fuzz=128 and an unknown resolution (we are
scaling up touch x_max in the driver and I haven't calculated its effect
on resolution).

I believe that a normalized value of x_max would show that a greater area of
the tablet can be used for touch than for pen.

To handle this difference, we can scale the reported values in the driver
such that x_max is the same for all slots.  I'm not clear on the fuzz logic,
so I don't know what 4 vs 128 does.
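
The scaling I have in mind is nothing more than this (sketch only, with the
numbers quoted above hard-coded for illustration):

    /* Rescale a touch X coordinate into the pen's range so both tools can
     * share one x_max.  17420/15360 are the pen/touch maxima from above. */
    #define PEN_X_MAX    17420
    #define TOUCH_X_MAX  15360

    static int touch_x_to_pen_range(int touch_x)
    {
        return (int)((long)touch_x * PEN_X_MAX / TOUCH_X_MAX);
    }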

Or we can maybe update the MT interface so you can have per-slot values
for clients to query (or the MT logic automatically scales for the driver,
as another option).  Then slots 0-1 are reserved for touch and slot 2 is
reserved for the pen.

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [RFC] Multi-Touch (MT) support - arbitration or not

2010-11-08 Thread Chris Bagwell
On Mon, Nov 8, 2010 at 2:08 AM, Benjamin Tissoires tisso...@cena.fr wrote:
 On 08/11/2010 04:51, Peter Hutterer wrote:

 fwiw, I'm not sure arbitrate is the right word here, filtering seems
 easier to understand in this context. I guess arbitrate would apply more
 if we emit the events across multiple devices like in the bamboo case.
 that's mostly bikeshedding though, my points below apply regardless of
 what
 word we choose :)

 note that we also have two different approaches - single kernel device or
 multiple kernel devices and depending on the approach the device uses the
 options below have different advantages and disadvantages.

 the tablets I've dealt with so far exposed a single event device, so
 that's
 what I'm focusing on in this email.

 On Fri, Nov 05, 2010 at 11:47:28AM -0700, Ping Cheng wrote:

 Recent changes and discussion about MT support at LKML, UDS, and
 xorg-devel encouraged me to migrate Wacom MT devices to the slot-based
 MT protocol (introduced in kernel 2.6.36). Since Wacom supports both
 digitizer and touch devices, I need to decide how to report touch data
 when the pen is in proximity.

 My goal is to understand how X server would like the MT data to be
 reported from the kernel. I hope to keep kernel and X server driver MT
 support in sync so we can avoid unnecessary confusion or extra work in
 the userland.

 The existing solution for single touch events is to arbitrate touch
 when pen is in prox. This is based on the assumption that we do not
 want to have two cursors competing on the screen.

 With the introduction of MT, the touch data are most likely translated
 into something other than pointer events. So, reporting both pen and
 touch data makes sense now. However, I want to assure a smooth
 tansition from single touch to MT for end users so they still get the
 single touch behavior as they used to be. I gathered the following
 approaches:

 1.     Arbitrate all touch data in the kernel.

 This is the simplest solution for device driver developers. But I do
 not feel it is end user and userland client friendly.

 I'm strongly opposed to this. kernel filtering of these devices is hard to
 circumvent and there _will_ be use-cases where we need more than one tool
 to
 work simultaneously. right now we're worrying about pen + touch, but what
 stops tablets from becoming large enough to be used by 2+ users with 2+
 pens simultaneously?

 from a purely event-stream focused viewpoint: why do we even care whether
 something is a pen or a touch? both are just tools and how these should be
 used is mostly up to the clients anyway.  IMO, the whole point of
 MT_TOOL_TYPE is that we don't have to assume use-cases for the tools but
 just forward the information to someone who knows how to deal with this.

 2.     Report first finger touch as ABS_X/Y events when pen is not in
 prox.  Arbitrating single touch data when pen is in prox. Pen data is
 reported as ABS_X/Y events. Both ABS_X/Y for pen or the first finger
 and ABS_MT_* for MT data are reported.

 This approach reduces the overhead in dealing with two cursors in
 userland.

 3.    Report first finger touch as ABS_X/Y events when pen is not in
 prox;
        Report pen data as ABS_X/Y events when there is no finger touch;
        Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN
 events when both pen and touch data are received. No ABS_X/Y are
 reported when pen and tocuh or multi-touch data are received.

 I feel this one makes sense to userland since pen can be considered as
 another touch.

 4.    Report first finger touch as ABS_X/Y events when pen is not in
 prox;
        Report pen data as ABS_X/Y events when there is no finger touch;
        Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN
 events when both pen and touch data are received. ABS_X/Y are also
 reported for pen when both pen and tocuh data are received.

 I'd vote for this one. It provides all the data necessary for MT clients
 (and all the data the device can support) but has a reasonable
 single-touch
 strategy. Given that wacom tablets are still primarily pen-centric
 tablets,
 the emphasis on pen overriding touch makes sense to me.

 Hi,

 I'd also vote for this.

 I don't think that the kernel should make any assumption on the final
 application. The data are available, so we have to pass them.

 1. I read that people worry about sending false events (touch) while using
 the pen. But in my mind, this is a _design_ problem of the final
 application. I think the final application will have to filter these events:
 for instance, what happens if the user is too lazy to remove his pen (or
 just want to keep the hover on the application) out of the proximity range
 and want to move its digital sheet of paper in his (her) design application?
 The final application will have to choose whether using or not the touch
 features (depending on the pressure for instance...).

 The solution 4. (*technical solution*) addresses the 

Re: [PATCH x11proto] Add XF86XK_TouchpadOn/Off

2010-11-08 Thread Chris Bagwell
On Sun, Nov 7, 2010 at 11:24 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 Patch is fine with me. any NAKs?

 The udev patch to standardise the behaviour is already in, see
 http://git.kernel.org/?p=linux/hotplug/udev.git;a=commit;h=a1ca5f60e0770299c5c5f21bd371f5823802412b
 It requires an update to xkeyboard-config as well, synchronised with udev to
 maintain consistency. That's possible once this change is in.


And just to stress, at this point we need a commit made somewhere.
Might as well make this one.

The udev submission above has changed the existing F22 value to F21, so at a
minimum, to align with it, we need to update the value below from
xkeyboard-config's inet symbols (with a synchronized udev release) to prevent
breakage of the existing feature.

key <FK22>   {  [ XF86TouchpadToggle ]   };
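
In other words, assuming the udev side settles on F21 as above, the inet
entry would presumably need to become:

key <FK21>   {  [ XF86TouchpadToggle ]   };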

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH x11proto] Add XF86XK_TouchpadOn/Off

2010-11-08 Thread Chris Bagwell
On Mon, Nov 8, 2010 at 4:37 PM, Bastien Nocera had...@hadess.net wrote:
 On Mon, 2010-11-08 at 16:32 -0600, Chris Bagwell wrote:
 On Sun, Nov 7, 2010 at 11:24 PM, Peter Hutterer
 peter.hutte...@who-t.net wrote:
  Patch is fine with me. any NAKs?
 
  The udev patch to standardise the behaviour is already in, see
  http://git.kernel.org/?p=linux/hotplug/udev.git;a=commit;h=a1ca5f60e0770299c5c5f21bd371f5823802412b
  It requires an update to xkeyboard-config as well, synchronised with udev 
  to
  maintain consistency. That's possible once this change is in.
 

 And just to stress, at this point we need a commit made somewhere.
 Might as well make this one.

 udev submission above has changed existing F22 values to F21 so at
 minimum to align with them we need to update below value from
 xkeyboard-config's inet + synchronized udev release to prevent
 breakage of existing feature.

 Except that it never actually worked properly, so you wouldn't be
 breaking that much.


Yep.  I'm not trying to stress the broken part but the mismatch part.

I was motivated by your work, and I'm working to get eee PCs working.
In their ACPI driver (eeepc-laptop or eeepc-wmi) they map a hotkey
meant for touchpad toggle to F13 (mapped before udev).  It's been this
way for quite a while, so it's not too useful out of the box.

I want to get that aligned to F21 or F22 or whatever people agree on,
so it can benefit from the nice work you guys did in this area.

It makes sense for me to wait until both udev and xkeyboard-config
match before proceeding though.

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [RFC] Multi-Touch (MT) support - arbitration or not

2010-11-08 Thread Chris Bagwell
On Mon, Nov 8, 2010 at 9:31 PM, Peter Hutterer peter.hutte...@who-t.net wrote:
 On Mon, Nov 08, 2010 at 03:54:51PM -0600, Chris Bagwell wrote:

 I think we may be mixing some topics and so I'd like to try to
 re-frame the discussion.

 There are two different cases and they may have different answers
 because of it.

 Case 1) 1 input device can support multiple tools that are in
 proximity at same time.

 I believe this is currently a theoretical example (no driver exists like 
 this).

 if you consider touch to be just another tool, we already have devices that
 support proximity of multiple tools. This isn't theoretical anymore.

Yes, I totally agree there.  I meant more an MT driver with both pen
and touch, or really any case where one tool in proximity can invalidate
the meaning of other tools in proximity.

Two paragraphs down I describe today's single-input-device MT behaviour.


 In RFC example, this input devices has a pen and 2 finger touches.
 They all share ABS_X/Y/PRESSURE values.  The single touch (ST) input
 filtering breaks being able to support this case and what multitouch
 events (MT) were added for.

 To date, when converting drivers over to MT events the guideline is
 *always* send MT events (because what app wants to randomly switch
 between MT event processing and ST event processing for same
 X/Y/PRESSURE?) and send something sane for ST events to be backwards
 compatible with older apps.

 I think everyone is happy in this thread to always send pen+touch MT
 events and let X drivers or similar filter/arbitrate out unwanted
 touch events as needed.

 The ideal sane behavior for touch ST events has been leaning towards
 tracking 1st touch and continue sending 1st touch during multi-touch
 but there is some debate because tracking can be expensive in kernel.
 In case of pen+touch, the sane may change to prefer pen over touch and
 prefer first touch when 2 touches exist.

 Or sane can mean let the ST values go crazy during multi-touch and
 hope user can use GUI enough after new kernel install to get a
 MT-aware X driver.

 Its easy to implement preferring pen then preferring 1st touch so I
 suggest doing that.  This is for backwards compatibility only
 (un-modified xf86-input-wacom/synaptics/evdev/etc).  The future is MT
 events, in which case the ST events are meaningless and we are hiding
 nothing to applications that look at MT events.

 Case 2) 2 input devices can support multiple tools in proximity at same time.

 I believe it was Rafi that brought up point that dual pen+touch
 interfaces will have different properties.  Touch will be lower
 resolution then Pen and maybe different fuzz factor.  Also, on tablets
 I would think pretty easy to have different dimensions (one tool works
 over larger area of tablet).  This is easy to expose to user when 2
 input devices.

 Yes and no. We're talking about kernel level here and I don't think this
 should be done at this level. The current behaviour of the X driver is to
 split multiple tools up into multiple X devices, so the points above can
 easily be achieved in userspace.


 Combining into single input to user would be nice but at least when
 dimensions are different, we probably do not want to remove that
 visibility to user and so must keep 2 input devices.

 If we run into issues with different axis ranges/resolutions for multiple
 MT_SLOT devices, this should be addressed in the kernel as well.

Yes, that seems a fair statement.

 I feel uncomfortable about splitting up a physical device into multiple
 devices, it takes information away that cannot easily be re-created in the
 userspace. Even with a method of associating multiple event devices to the
 same physical device, the parsing of simultaneous events is harder because
 you're essentially deserialising event streams. In userspace, you have to
 re-serialize based on parallel inputs.

 That said, it also goes counter the whole multi-touch approach - allowing
 more than one device on a single physical device.

Hmm, does this sum up your opinion?  You are a strong proponent of
having all related tools sent over a single input device so you get the
natural context of events.  When you do it this way, today's sample MT
implementation for touchpads just works for pen+touch as well.  That
behaviour can basically be summed up as: send MT events for all
tools and let clients figure it out, and for the older ST events do something
sane to help older apps.

So I get and do agree with that part, but you've not clearly stated whether
you're also saying something like "refuse to support split-input
solutions and fix the kernel instead of defining a behaviour for
this case".  If we are forced to support split inputs, I suspect you're
basically OK with behaviour #2, because it's effectively emulating
single-input behaviour as best it can and we are just picking what
sane means in this odd case.

I've copied #2 below and added my own text in [] to clarify the text in
the context of case #2.

2. Report first

Re: [PATCH x11proto] Add XF86XK_TouchpadOn/Off

2010-11-08 Thread Chris Bagwell
On Mon, Nov 8, 2010 at 9:42 PM, Bastien Nocera had...@hadess.net wrote:
 On Mon, 2010-11-08 at 21:17 -0600, Chris Bagwell wrote:
 On Mon, Nov 8, 2010 at 4:37 PM, Bastien Nocera had...@hadess.net wrote:
  On Mon, 2010-11-08 at 16:32 -0600, Chris Bagwell wrote:
  On Sun, Nov 7, 2010 at 11:24 PM, Peter Hutterer
  peter.hutte...@who-t.net wrote:
   Patch is fine with me. any NAKs?
  
   The udev patch to standardise the behaviour is already in, see
   http://git.kernel.org/?p=linux/hotplug/udev.git;a=commit;h=a1ca5f60e0770299c5c5f21bd371f5823802412b
   It requires an update to xkeyboard-config as well, synchronised with 
   udev to
   maintain consistency. That's possible once this change is in.
  
 
  And just to stress, at this point we need a commit made somewhere.
  Might as well make this one.
 
  udev submission above has changed existing F22 values to F21 so at
  minimum to align with them we need to update below value from
  xkeyboard-config's inet + synchronized udev release to prevent
  breakage of existing feature.
 
  Except that it never actually worked properly, so you wouldn't be
  breaking that much.
 

 Yep.  I'm not trying to stress the broken part but the mismatch part.

 I was motivated by your work and I'm working to get eee pc's working.
 In their ACPI driver (eeepc-laptop or eeepc-wmi) they map a hotkey
 meant for touchpad toggle to F13 (mapped before udev).  Its been this
 way for quite a while and so not to useful out of the box.

 I want to get that aligned to F21 or F22 or what ever people agree to
 so it can benefit from the nice work you guys did in this area.

 Then no. You want to use the keys I'm trying to get added in the kernel:
 http://thread.gmane.org/gmane.linux.kernel.input/16320

 Then map from those keys to the fXX function keys in udev. My guess
 from:
  151         { KE_KEY, 0x37, { KEY_F13 } }, /* Disable Touchpad */
  152         { KE_KEY, 0x38, { KEY_F14 } },
 is that one disables the key in hardware, and the other enables it,
 right?

 Then it would be F22 and F23 respectively. So a popup is shown, but
 gnome-settings-daemon doesn't actually change the state of the driver in
 software.


It's a real toggle indication on eee PCs, and the HW doesn't disable
anything.  If I edit the kernel today (Fedora 14) to send F22 instead of
F13, then I get a nice popup and enable/disable of the touchpad through
software.  I guess if I understood udev better I could have done the above
instead of recompiling the kernel.

It looks like a done deal to get the KEY_TOUCHPAD_TOGGLE patch into input.h,
so technically I need to make both a kernel update and then a udev
update to map KEY_TOUCHPAD_TOGGLE to F2x.

Either way, it sounds like udev should change for eee PCs.  I'll need
your or someone's help pointing me towards a udev mailing list when
that time comes.

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [RFC] Multi-Touch (MT) support - arbitration or not

2010-11-06 Thread Chris Bagwell
On Fri, Nov 5, 2010 at 1:47 PM, Ping Cheng pingli...@gmail.com wrote:
 Recent changes and discussion about MT support at LKML, UDS, and
 xorg-devel encouraged me to migrate Wacom MT devices to the slot-based
 MT protocol (introduced in kernel 2.6.36). Since Wacom supports both
 digitizer and touch devices, I need to decide how to report touch data
 when the pen is in proximity.

 My goal is to understand how X server would like the MT data to be
 reported from the kernel. I hope to keep kernel and X server driver MT
 support in sync so we can avoid unnecessary confusion or extra work in
 the userland.

 The existing solution for single touch events is to arbitrate touch
 when pen is in prox. This is based on the assumption that we do not
 want to have two cursors competing on the screen.

 With the introduction of MT, the touch data are most likely translated
 into something other than pointer events. So, reporting both pen and
 touch data makes sense now. However, I want to assure a smooth
 tansition from single touch to MT for end users so they still get the
 single touch behavior as they used to be. I gathered the following
 approaches:

 1.     Arbitrate all touch data in the kernel.

 This is the simplest solution for device driver developers. But I do
 not feel it is end user and userland client friendly.

 2.     Report first finger touch as ABS_X/Y events when pen is not in
 prox.  Arbitrating single touch data when pen is in prox. Pen data is
 reported as ABS_X/Y events. Both ABS_X/Y for pen or the first finger
 and ABS_MT_* for MT data are reported.

 This approach reduces the overhead in dealing with two cursors in userland.

 3.    Report first finger touch as ABS_X/Y events when pen is not in prox;
       Report pen data as ABS_X/Y events when there is no finger touch;
       Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN
 events when both pen and touch data are received. No ABS_X/Y are
 reported when pen and tocuh or multi-touch data are received.

 I feel this one makes sense to userland since pen can be considered as
 another touch.

 4.    Report first finger touch as ABS_X/Y events when pen is not in prox;
       Report pen data as ABS_X/Y events when there is no finger touch;
       Report touch data as MT_TOOL_TOUCH and pen data as MT_TOOL_PEN
 events when both pen and touch data are received. ABS_X/Y are also
 reported for pen when both pen and tocuh data are received.

 This one makes sense to userland too. It eases the backward
 compatibility support for those clients that don't support MT at all.

 Which approach do you like? Or do you have other suggestions share?

 Ping

Here is my input on the topic.  To summarize, I'm leaning towards
your option #1 right now.

First, I think we need to decide whether non-MT-aware apps/X drivers are to
be supported, or whether we require them to be fixed to work reliably with
devices such as the Wacom Bamboo that split the device into two inputs yet
need coordination across inputs.

I prefer that non-MT-aware ones are supported, so we should continue the
current arbitration of the ST events in the kernel.  Sending
un-arbitrated MT events to userland is, I think, mostly OK.
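
To be explicit, the kernel-side arbitration I mean has roughly this shape
(sketch only, not the actual wacom code; pen_in_prox, x and y are assumed to
exist in the report path):

    #include <linux/input.h>

    /* Legacy single-touch axes are suppressed while the pen is in
     * proximity; MT events always go out and userland decides. */
    static void report_touch(struct input_dev *dev, bool pen_in_prox,
                             int x, int y)
    {
            if (!pen_in_prox) {
                    input_report_abs(dev, ABS_X, x);
                    input_report_abs(dev, ABS_Y, y);
            }
            input_report_abs(dev, ABS_MT_POSITION_X, x);
            input_report_abs(dev, ABS_MT_POSITION_Y, y);
            input_mt_sync(dev);
    }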

Next, you bring up an interesting point on the tablet side.  Currently, it
doesn't send MT events because it only tracks 1 stylus tool at a time.
Should it send MT events anyway for a single tool, to help userland,
because its events are meant to be associated with another input's MT
events?  That's a good question for the MT people to consider.

Thinking about that issue, though, makes me think of an option #5 to
consider.  We could combine these 2 input devices into a single input
device and send all events as MT events.  I assume userland has an easier
time arbitrating within a single input device.  I'm not sure what it
would take to combine these in the kernel though... and there is some
value in 2 inputs, since just the touchpad can currently be re-routed
to xf86-input-synaptics.

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH 10/18] Add multi-touch support

2010-10-14 Thread Chris Bagwell
On Wed, Oct 13, 2010 at 1:12 AM, Takashi Iwai ti...@suse.de wrote:
 At Wed, 13 Oct 2010 14:25:16 +1000,
 Peter Hutterer wrote:

 On Fri, Oct 08, 2010 at 07:22:34PM +0200, Takashi Iwai wrote:

   +    if (priv->model != MODEL_SYNAPTICS)
   +   return;
   +    SYSCALL(rc = ioctl(local->fd, EVIOCGBIT(EV_ABS, sizeof(absbits)), 
   absbits));
   +    if (rc >= 0 && TEST_BIT(ABS_MT_POSITION_X, absbits)) {
   +   priv->can_multi_touch = TRUE;
   +   xf86Msg(X_INFO, "%s: supports multi-touch finger detection\n", 
   local->name);
  +    }
  +}
  +

Peter mentioned the intent was to keep this generic from the start, but he
didn't specifically call out that return statement.  Nothing jumped
out at me as very synaptics-specific in the patches (except maybe clickpad
detection), so hopefully we can simply delete that return statement.



   static void
   event_query_clickpad(LocalDevicePtr local)
   {
  @@ -175,7 +202,7 @@ event_query_clickpad(LocalDevicePtr local)
 
       /* clickpad device reports only the single left button mask */
       if (priv->has_left && !priv->has_right && !priv->has_middle &&
   -   !priv->has_double &&
   +   (!priv->has_double || priv->can_multi_touch) &&
       priv->model == MODEL_SYNAPTICS) {
       priv->is_clickpad = TRUE;
      /* enable right/middle button caps; otherwise gnome-settings-daemon
  @@ -383,21 +410,27 @@ EventReadHwState(InputInfoPtr pInfo,
      switch (ev.type) {
      case EV_SYN:
          switch (ev.code) {
  +       case SYN_MT_REPORT:
  +           hw-multi_touch_count++;
  +           break;
          case SYN_REPORT:
              if (comm-oneFinger)
  -               hw-numFingers = 1;
  +               hw-numFingers = hw-multi_touch_count ? 
  hw-multi_touch_count : 1;
              else if (comm-twoFingers)
                  hw-numFingers = 2;
              else if (comm-threeFingers)
                  hw-numFingers = 3;
              else
                  hw-numFingers = 0;
  +           hw-multi_touch = hw-multi_touch_count;
  +           hw-multi_touch_count = 0;
              /* if the coord is out of range, we filter it out */
            if (priv->is_clickpad && hw->z > 0 && (hw->x < minx || hw->x > 
   maxx || hw->y < miny || hw->y > maxy))
                      return FALSE;
              *hwRet = *hw;
              return TRUE;
          }
  +       break;
      case EV_KEY:
          v = (ev.value ? TRUE : FALSE);
          switch (ev.code) {
  @@ -458,13 +491,25 @@ EventReadHwState(InputInfoPtr pInfo,
      case EV_ABS:
          switch (ev.code) {
          case ABS_X:
  -           hw-x = ev.value;
  +       case ABS_MT_POSITION_X:
  +           if (hw-multi_touch_count)
  +               hw-multi_touch_x = ev.value;
  +           else
  +               hw-x = ev.value;

 if I read this correctly this patch doesn't add multi-touch support but only
 two-finger support, otherwise you'd be overwriting the value after the
 second finger.

 The odd thing is that there is no third finger tracking.  Synaptics devices
 track up to two finger points although it can detect number of fingers up to
 three.

In another email I asked if it should be OK for BTN_TOOL_DOUBLETAP=0
while sending 2 MT sets.  This is kind of the opposite:
BTN_TOOL_TRIPLETAP=1 but sending only 2 MT sets.  It sounds like we may
need to be relaxed and allow these kinds of mismatches to occur.

I'll have to think a little bit to see if that also means
priv->numFingers should be calculated based on multi_touch_count.


 So, in my code, I only put the primary and the rest fingers for simplicity
 (so that hw-{x,y,z} notions remain).

Hmm, maybe I'm misreading the above code when combined with the kernel
patches.  The kernel-side patches were always sending the at-rest
finger as the first MT set and the moving finger second.  The above
seems to give preference to the at-rest values, which seems odd to
me.  But surely I'm missing something, since I believe you've stated
click-and-drag is working.

Also, I'm sure you've already thought about the issue of when we add
ABS_X/ABS_Y reports back in MT mode.  Unless we are able to make
ABS_X/ABS_Y always the preferred values on the kernel side, we'll need
some flag above to say we are in MT mode and to ignore the ST values.

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH 12/18] Add pinch gesture support

2010-10-14 Thread Chris Bagwell
On Fri, Oct 8, 2010 at 12:22 PM, Takashi Iwai ti...@suse.de wrote:
 + static void
 +handle_multi_touch_pinch(SynapticsPrivate *priv, struct SynapticsHwState *hw,
 +                        int *zoom_in, int *zoom_out)
 +{
 +    SynapticsParameters *para = priv-synpara;
 +    int width = abs(priv-maxx - priv-minx);
 +    int dist, dist_diff, abs_diff;
 +
 +    *zoom_in = *zoom_out = 0;
 +
 +    if (hw-multi_touch = 1 || hw-numFingers  2 ||
 +       (priv-multi_touch_mode  MULTI_TOUCH_MODE_GESTURE 
 +        priv-multi_touch_mode != MULTI_TOUCH_MODE_START))
 +       return; /* no multi-touch or in other mode */
 +    if (para-multi_touch_pinch_dist = 0 ||
 +       para-multi_touch_pinch_start = 0)
 +       return; /* pinch disabled */
 +

When I've implemented gesture logic in the past, I've run into issues
with 2-finger scrolling and pinch zoom-out looking similar, especially
if the user is scrolling in a V pattern and so the fingers move apart as
they go.

I bring this up because I didn't notice anything in the patch to prevent
the older ST 2-finger scrolling logic from kicking in.  Isn't it
needed?
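
The way I've distinguished them before is roughly the following (sketch
only, thresholds made up; nothing like this is in the patch):

    #include <stdlib.h>

    struct finger { int x, y; };

    enum gesture { GESTURE_NONE, GESTURE_SCROLL, GESTURE_PINCH };

    /* Compare how much the finger separation changes against how much the
     * midpoint of the two fingers moves; pinch is separation-dominated,
     * scroll is translation-dominated. */
    static enum gesture classify(struct finger prev[2], struct finger cur[2])
    {
        int sep_prev = abs(prev[0].x - prev[1].x) + abs(prev[0].y - prev[1].y);
        int sep_cur  = abs(cur[0].x  - cur[1].x)  + abs(cur[0].y  - cur[1].y);
        int mid_dx   = ((cur[0].x + cur[1].x) - (prev[0].x + prev[1].x)) / 2;
        int mid_dy   = ((cur[0].y + cur[1].y) - (prev[0].y + prev[1].y)) / 2;
        int spread   = abs(sep_cur - sep_prev);
        int motion   = abs(mid_dx) + abs(mid_dy);

        if (spread > 2 * motion && spread > 30)
            return GESTURE_PINCH;
        if (motion > 2 * spread && motion > 30)
            return GESTURE_SCROLL;
        return GESTURE_NONE;
    }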

Also, should that numFingers check be restricted to exactly 2 fingers, to
prevent confusion with the 3-finger gesture in a later patch?  Especially a
3-finger gesture that is a V pattern as well.

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH 07/18] Allow touching in clickpad button area

2010-10-13 Thread Chris Bagwell
On Fri, Oct 8, 2010 at 12:22 PM, Takashi Iwai ti...@suse.de wrote:
 +    prop_touch_button_area = InitAtom(local-dev, 
 SYNAPTICS_PROP_TOUCH_BUTTON_AREA, 32, 1, para-touch_button_area);

[..]

 -    /* Clickpad mode -- bottom area is used as buttons */
 -    if (priv-is_clickpad) {
 -       int button_bottom;
 -       /* Clickpad devices usually the button area at the bottom, and
 -        * its size seems ca. 20% of the touchpad height no matter how
 -        * large the pad is.
 -        */
  -       button_bottom = priv->maxy - (abs(priv->maxy - priv->miny) * 20) / 
  100;
  -       if (button_bottom < b && button_bottom >= t)
  -           b = button_bottom;
 -    }
 -

Maybe it's worth moving this comment to the property setting instead of
deleting it.  It is in the man page somewhat, though.

Is the 20% statement true for all versions of the clickpad?  On the kernel
driver thread you mentioned a hinge version at the top of the clickpad, which
sounded like the button area grows to the majority of the touchpad.  Is 20%
still a good rule for filtering those, or is the user expected to modify this
property?

Chase has brought up the idea of filtering out these related ST reports on
the kernel side.  I proposed a slight alternative on the kernel side, but it
requires MT mode to be enabled.  I wonder which side (kernel or app)
is the best location for this logic?

Note: even if the kernel side filters ST events, this patch is still
useful in the MT event processing context.

Also, I'm not sure of the best thread to ask this question, so I'll ask here.
When we are sending MT events, should there be any strict rules on how
BTN_TOOL_DOUBLETAP behaves?  For example, if the kernel filtered out
ABS_X/ABS_Y events during a clickpad button press, should it also set
BTN_TOOL_DOUBLETAP to 0?  Even if it's sending MT reports for both
fingers?

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH evdev v2] Add proximity support.

2010-10-12 Thread Chris Bagwell
Reviewed-by: Chris Bagwell ch...@cnpbagwell.com

I'm not up to speed with the queuing in the evdev driver.  Mostly I was
concentrating on EvdevProcessProximityState(), and it looked like it covers
all cases.

Chris

On Tue, Oct 12, 2010 at 8:44 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 When one of the tools comes into proximity, queue up a proximity event and
 send it accordingly.

 Includes special handling for tablets that do not send axes with tools
 (#29645)

 Some tablets send axis values, then EV_SYN, and in the next event the
 BTN_TOOL_PEN/BTN_TOUCH, etc. For these tablets, the cursor doesn't move as
 coordinates while not in proximity are ignored.

 Buffer coordinates received while out-of-proximity and if we get a proximity
 event without other coordinates, re-use the last ones received.

 X.Org Bug 29645 http://bugs.freedesktop.org/show_bug.cgi?id=29645

 Signed-off-by: Peter Hutterer peter.hutte...@who-t.net
 ---
 As Benjamin pointed out, patch 2/3 of the previous set was incorrect by
 itself (rebase -i got the better of me). Because I can't be bothered to
 untangle the two, fix up a commit just to undo half of it in the next one
 anyway, I just squashed the two together.

  src/evdev.c |  118 
 ++-
  src/evdev.h |    5 ++-
  2 files changed, 121 insertions(+), 2 deletions(-)

 diff --git a/src/evdev.c b/src/evdev.c
 index 9e1fb10..0ef7170 100644
 --- a/src/evdev.c
 +++ b/src/evdev.c
 @@ -322,7 +322,18 @@ EvdevQueueButtonEvent(InputInfoPtr pInfo, int button, 
 int value)
         pQueue-key = button;
         pQueue-val = value;
     }
 +}

 +void
 +EvdevQueueProximityEvent(InputInfoPtr pInfo, int value)
 +{
 +    EventQueuePtr pQueue;
 +    if ((pQueue = EvdevNextInQueue(pInfo)))
 +    {
 +        pQueue-type = EV_QUEUE_PROXIMITY;
 +        pQueue-key = 0;
 +        pQueue-val = value;
 +    }
  }

  /**
 @@ -459,6 +470,70 @@ EvdevProcessValuators(InputInfoPtr pInfo, int 
 v[MAX_VALUATORS], int *num_v,
     }
  }

 +static void
 +EvdevProcessProximityEvent(InputInfoPtr pInfo, struct input_event *ev)
 +{
 +    EvdevPtr pEvdev = pInfo-private;
 +
 +    pEvdev-prox = 1;
 +
 +    EvdevQueueProximityEvent(pInfo, ev-value);
 +}
 +
 +/**
 + * Proximity handling is rather weird because of tablet-specific issues.
 + * Some tablets, notably Wacoms, send a 0/0 coordinate in the same EV_SYN as
 + * the out-of-proximity notify. We need to ignore those, hence we only
 + * actually post valuator events when we're in proximity.
 + *
 + * Other tablets send the x/y coordinates, then EV_SYN, then the proximity
 + * event. For those, we need to remember x/y to post it when the proximity
 + * comes.
 + *
 + * If we're not in proximity and we get valuator events, remember that, they
 + * won't be posted though. If we move into proximity without valuators, use
 + * the last ones we got and let the rest of the code post them.
 + */
 +static int
 +EvdevProcessProximityState(InputInfoPtr pInfo)
 +{
 +    EvdevPtr pEvdev = pInfo-private;
 +    int prox_state = 0;
 +    int i;
 +
 +    /* no proximity change in the queue */
 +    if (!pEvdev-prox)
 +    {
  +        if (pEvdev->abs && !pEvdev->proximity)
  +            pEvdev->abs_prox = pEvdev->abs;
 +        return 0;
 +    }
 +
  +    for (i = 0; pEvdev->prox && i < pEvdev->num_queue; i++)
 +    {
 +        if (pEvdev-queue[i].type == EV_QUEUE_PROXIMITY)
 +        {
 +            prox_state = pEvdev-queue[i].val;
 +            break;
 +        }
 +    }
 +
  +    if ((prox_state && !pEvdev->proximity) ||
  +        (!prox_state && pEvdev->proximity))
 +    {
 +        /* We're about to go into/out of proximity but have no abs events
 +         * within the EV_SYN. Use the last coordinates we have. */
  +        if (!pEvdev->abs && pEvdev->abs_prox)
 +        {
 +            pEvdev-abs = pEvdev-abs_prox;
 +            pEvdev-abs_prox = 0;
 +        }
 +    }
 +
 +    pEvdev-proximity = prox_state;
 +    return 1;
 +}
 +
  /**
  * Take a button input event and process it accordingly.
  */
 @@ -583,6 +658,7 @@ EvdevProcessKeyEvent(InputInfoPtr pInfo, struct 
 input_event *ev)
             return;

     switch (ev-code) {
 +        /* keep this list in sync with InitProximityClassDeviceStruct */
         case BTN_TOOL_PEN:
         case BTN_TOOL_RUBBER:
         case BTN_TOOL_BRUSH:
 @@ -591,7 +667,7 @@ EvdevProcessKeyEvent(InputInfoPtr pInfo, struct 
 input_event *ev)
         case BTN_TOOL_FINGER:
         case BTN_TOOL_MOUSE:
         case BTN_TOOL_LENS:
 -            pEvdev-proximity = value ? ev-code : 0;
 +            EvdevProcessProximityEvent(pInfo, ev);
             break;

         case BTN_TOUCH:
 @@ -645,6 +721,27 @@ EvdevPostAbsoluteMotionEvents(InputInfoPtr pInfo, int 
 num_v, int first_v,
     }
  }

 +static void
 +EvdevPostProximityEvents(InputInfoPtr pInfo, int which, int num_v, int 
 first_v,
 +                                  int v[MAX_VALUATORS])
 +{
 +    int i;
 +    EvdevPtr pEvdev = pInfo-private;
 +
 +    for (i = 0

Re: [PATCH 0/3] Input: synaptics - multitouch and multifinger support

2010-10-11 Thread Chris Bagwell
On Mon, Oct 11, 2010 at 2:48 AM, Henrik Rydberg rydb...@euromail.se wrote:
 On 10/11/2010 09:35 AM, Takashi Iwai wrote:
 [...]

 In anyway, feel free to add my sign-off there since I already posted
 my own one as a reference.

 But, I have an open issue with Chase's patch.  Maybe it's rather a
 question to Henrik:

 Shouldn't all points be reported as ABS_MT_* events?  So far, only the
 second touch-point is reported via ABS_MT_* while the first point is
 still reported as ABS_[X|Y].

 I corrected this in my patch I posted, but I wasn't sure, too.


 I have issues with all submitted patches, but did not give explicit reasons
 since there were overlapping submissions. Perhaps Chase and yourself can work
 out how you want to submit the new patches? And yes, all points should be
 reported as ABS_MT events.

 Thanks,
 Henrik


And is it also safe to say that we need to continue reporting
ABS_X/ABS_Y *and* that those values need to always track the 1st finger
touch for backwards compatibility?

It was brought up in the thread but not stated as a strong requirement.

BTW, there are patches from the last couple of months to xf86-input-synaptics
that allow it to ignore jumps in the ABS_X/ABS_Y values when a
multi-touch transition occurs (either adding or removing fingers via
BTN_TOOL_*TAP).  So one new-ish option is for ABS_X/ABS_Y to not track
the 1st finger but to become the average of the 2 fingers.
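
For reference, the backwards-compatible reporting asked about above would
look roughly like this on the kernel side (a minimal sketch; contact[] and
num_contacts are placeholders, not real driver fields): every contact goes
out as ABS_MT_* events while the legacy ABS_X/ABS_Y stay pinned to the 1st
finger.

    int i;

    for (i = 0; i < num_contacts; i++) {
        input_report_abs(dev, ABS_MT_POSITION_X, contact[i].x);
        input_report_abs(dev, ABS_MT_POSITION_Y, contact[i].y);
        input_mt_sync(dev);                    /* ends this contact's data */
    }
    input_report_abs(dev, ABS_X, contact[0].x);  /* single-touch view: finger 1 */
    input_report_abs(dev, ABS_Y, contact[0].y);
    input_sync(dev);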

Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH 3/3] Input: synaptics - remove touches over button click area

2010-10-11 Thread Chris Bagwell
On Fri, Oct 8, 2010 at 9:58 AM, Chase Douglas
chase.doug...@canonical.com wrote:
 Now that we have proper multitouch support, we can handle integrated
 buttons better. If we know the top of the buttons on the touchpad, we
 can ignore any touches that occur within the touchpad area while a
 button is clicked. It may be possible to get the button area by querying
 the device, but for now allow the user to manually set it.

 A note on why this works: the Synaptics touchpads have pseudo touch
 tracking. When two touches are on the touchpad, an MT touch packet with
 just the X, Y, and pressure values is sent before a normal Synaptics
 touch packet. When one touch is obviously in motion and the other is
 stationary, the touchpad controller sends the touch in motion in the
 normal packet and the stationary touch in the MT packet. Single touch
 emulation is provided by the normal packet, so an action like clicking
 a button and dragging with another finger still works as expected.

 Tested on a Dell Mini 1012 with synaptics_multitouch=1 and
 synaptics_button_thresh=4100.

 Signed-off-by: Chase Douglas chase.doug...@canonical.com
 ---
  drivers/input/mouse/synaptics.c |   16 +++-
  1 files changed, 15 insertions(+), 1 deletions(-)

 diff --git a/drivers/input/mouse/synaptics.c b/drivers/input/mouse/synaptics.c
 index 7289d88..e67778d 100644
 --- a/drivers/input/mouse/synaptics.c
 +++ b/drivers/input/mouse/synaptics.c
 @@ -49,6 +49,12 @@ module_param(synaptics_multitouch, bool, 0644);
  MODULE_PARM_DESC(synaptics_multitouch,
                 Enable multitouch mode on Synaptics devices);

 +static int synaptics_button_thresh = YMIN_NOMINAL + YMAX_NOMINAL;
 +module_param(synaptics_button_thresh, int, 0644);
 +MODULE_PARM_DESC(synaptics_button_thres,
 +                Y coordinate threshold of integrated buttons on Synaptics 
 +                devices);
 +
  /*
  *     Stuff we need even when we do not want native Synaptics support
  /
 @@ -463,6 +469,10 @@ static void synaptics_parse_hw_state(unsigned char 
 buf[], struct synaptics_data
        }
  }

 +#define TOUCH_OVER_BUTTON(hw) (((hw).left || (hw).middle || (hw).right)  \
 +                              (YMAX_NOMINAL + YMIN_NOMINAL - (hw).y  \
 +                               synaptics_button_thresh))
 +
  /*
  *  called for each full received packet from the touchpad
  */
 @@ -477,7 +487,7 @@ static void synaptics_process_packet(struct psmouse 
 *psmouse)
        synaptics_parse_hw_state(psmouse-packet, priv, hw);

        if (SYN_MULTITOUCH(priv, hw)) {
 -               if (hw.z  0) {
 +               if (hw.z  0  !TOUCH_OVER_BUTTON(hw)) {
                        input_report_abs(dev, ABS_MT_POSITION_X, hw.x);
                        input_report_abs(dev, ABS_MT_POSITION_Y,
                                         YMAX_NOMINAL + YMIN_NOMINAL - hw.y);
 @@ -509,6 +519,10 @@ static void synaptics_process_packet(struct psmouse 
 *psmouse)
                return;
        }

 +       /* If touch occurs over depressed button, ignore it */
 +       if (TOUCH_OVER_BUTTON(hw))
 +               hw.z = 0;
 +
        if (hw.z  0) {
                priv-num_fingers++;
                finger_width = 5;
 --
 1.7.1



I'm convinced now that clickpad-style touchpads can't work without
multi-touch and something like the logic in xf86-input-multitouch.  So now
I'd like to consider how the MT-enabled touchpad interface can best work
with non-multitouch-aware applications, since that is what users will
need to deal with on fresh installs for a while.  I believe the above
approach of setting hw.z to zero would cause havoc for
non-multitouch-aware applications.

I see three main choices:

1) Do not report any button presses when in the click area and always
report ABS_X/ABS_Y based on the first finger touch.  Something like the
xf86-input-synaptics RBCornerButton feature would then be responsible for
button presses and can support left/middle/right concepts easily.

The downside is that a mis-configured box will not be able to use the GUI
since no button presses will work.  Also, there is no clear way to
auto-enable RBCornerButton-like features in user land in the same way as
is being done in some patches that treat single-button touchpads as
clickpads.

2.1) Send BTN_LEFT when in the click area, with ABS_X/ABS_Y tracking the
1st finger during a single touch and the 2nd finger during multi-touch.
xf86-input-synaptics needs a change to detect left/middle/right based on
the ABS_X/ABS_Y values right at the report of BTN_LEFT for clickpads (see
the sketch after this list).  Touching the drag finger before the click
finger breaks click-and-drag.

2.2) Send BTN_LEFT when in the click area, with ABS_X/ABS_Y tracking the
1st finger during a single touch and the mid-point of the 2 fingers during
multi-touch.  Touching the drag finger before the click finger breaks
click-and-drag and left/middle/right detection.

2.3) Send BTN_LEFT when in click area and 
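
For option 2.1 above, a minimal sketch of how the left/middle/right
detection could look on the xf86-input-synaptics side (function name, edge
parameters and button numbering are hypothetical, illustration only):

    static int clickpad_button_from_x(int x, int left_edge, int right_edge)
    {
        /* Split the click area into thirds at the moment BTN_LEFT is seen. */
        int third = (right_edge - left_edge) / 3;

        if (x < left_edge + third)
            return 1;                /* left */
        if (x < left_edge + 2 * third)
            return 2;                /* middle */
        return 3;                    /* right */
    }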

Re: [PATCH 08/18] Avoid unexpected jumps

2010-10-11 Thread Chris Bagwell
On Fri, Oct 8, 2010 at 12:22 PM, Takashi Iwai ti...@suse.de wrote:
 Limit the movement size for avoiding the unexpected pointer jumps.


Hi Takashi,

This is the type of patch I was concerned about.  Using only 1/5 of the
touchpad width as the cutoff for discarding packets as invalid seems like
a pretty low threshold.

If we can restrict changes in the meaning of ABS_X/ABS_Y to touch
transitions, then we can make use of the following commit to prevent
unwanted jumps.

Basically, it's: if (prevFinger != currentFinger) count_packet_finger = 0;.

http://cgit.freedesktop.org/xorg/driver/xf86-input-synaptics/commit/?id=a6ca4d2523904b7ce49edc29ba408979bdf0d45e
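
Spelled out a little, the reset that commit introduces amounts to roughly
this inside the delta computation (prev_finger_count is an illustrative
name here; count_packet_finger and hw->numFingers are the driver's existing
names):

    /* Throw away the delta history whenever the finger count changes, so no
     * delta is computed across a point where ABS_X/ABS_Y may have switched
     * which finger it is describing. */
    if (priv->prev_finger_count != hw->numFingers) {
        priv->count_packet_finger = 0;
        priv->prev_finger_count = hw->numFingers;
    }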

Chris

 Signed-off-by: Takashi Iwai ti...@suse.de
 ---
  src/synaptics.c    |   10 ++
  src/synapticsstr.h |    1 +
  2 files changed, 11 insertions(+), 0 deletions(-)

 diff --git a/src/synaptics.c b/src/synaptics.c
 index 3ba918a..bd52730 100644
 --- a/src/synaptics.c
 +++ b/src/synaptics.c
 @@ -467,6 +467,8 @@ static void set_default_parameters(InputInfoPtr pInfo)
     edgeMotionMaxSpeed = diag * .080;
     accelFactor = 200.0 / diag; /* trial-and-error */

 +    priv-move_ptr_threshold = width / 5;
 +
     range = priv-maxp - priv-minp;

     /* scaling based on defaults and a pressure of 256 */
 @@ -1949,6 +1951,14 @@ ComputeDeltas(SynapticsPrivate *priv, const struct 
 SynapticsHwState *hw,
            break;
        }
     }
 +
 +    if (moving_state  priv-count_packet_finger  0 
 +       priv-move_ptr_threshold  0 ) {
 +       int d = move_distance(HIST(0).x - hw-x, HIST(0).y - hw-y);
 +       if (d  priv-move_ptr_threshold)
 +           priv-count_packet_finger = 0; /* to avoid unexpected jumps */
 +    }
 +
     if (inside_area  moving_state  !priv-palm 
        !priv-vert_scroll_edge_on  !priv-horiz_scroll_edge_on 
        !priv-vert_scroll_twofinger_on  !priv-horiz_scroll_twofinger_on 
 diff --git a/src/synapticsstr.h b/src/synapticsstr.h
 index 44140f2..44925e5 100644
 --- a/src/synapticsstr.h
 +++ b/src/synapticsstr.h
 @@ -245,6 +245,7 @@ typedef struct _SynapticsPrivateRec
     unsigned int clickpad_threshold;
     int clickpad_dx, clickpad_dy;
     struct SynapticsHwState prev_hw;   /* previous h/w state (for clickpad) */
 +    int move_ptr_threshold;
     int prop_change_pending;
     Bool led_touch_state;
     Bool led_tapped;
 --
 1.7.3.1

 ___
 xorg-devel@lists.x.org: X.Org development
 Archives: http://lists.x.org/archives/xorg-devel
 Info: http://lists.x.org/mailman/listinfo/xorg-devel

___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH 3/3] Input: synaptics - remove touches over button click area

2010-10-11 Thread Chris Bagwell
On Mon, Oct 11, 2010 at 12:10 PM, Takashi Iwai ti...@suse.de wrote:
 At Mon, 11 Oct 2010 11:24:04 -0500,
 Chris Bagwell wrote:

 On Fri, Oct 8, 2010 at 9:58 AM, Chase Douglas
 chase.doug...@canonical.com wrote:
  Now that we have proper multitouch support, we can handle integrated
  buttons better. If we know the top of the buttons on the touchpad, we
  can ignore any touches that occur within the touchpad area while a
  button is clicked. It may be possible to get the button area by querying
  the device, but for now allow the user to manually set it.
 
  A note on why this works: the Synaptics touchpads have pseudo touch
  tracking. When two touches are on the touchpad, an MT touch packet with
  just the X, Y, and pressure values is sent before a normal Synaptics
  touch packet. When one touch is obviously in motion and the other is
  stationary, the touchpad controller sends the touch in motion in the
  normal packet and the stationary touch in the MT packet. Single touch
  emulation is provided by the normal packet, so an action like clicking
  a button and dragging with another finger still works as expected.
 
  Tested on a Dell Mini 1012 with synaptics_multitouch=1 and
  synaptics_button_thresh=4100.
 
  Signed-off-by: Chase Douglas chase.doug...@canonical.com
  ---
   drivers/input/mouse/synaptics.c |   16 +++-
   1 files changed, 15 insertions(+), 1 deletions(-)
 
  diff --git a/drivers/input/mouse/synaptics.c 
  b/drivers/input/mouse/synaptics.c
  index 7289d88..e67778d 100644
  --- a/drivers/input/mouse/synaptics.c
  +++ b/drivers/input/mouse/synaptics.c
  @@ -49,6 +49,12 @@ module_param(synaptics_multitouch, bool, 0644);
   MODULE_PARM_DESC(synaptics_multitouch,
                  Enable multitouch mode on Synaptics devices);
 
  +static int synaptics_button_thresh = YMIN_NOMINAL + YMAX_NOMINAL;
  +module_param(synaptics_button_thresh, int, 0644);
  +MODULE_PARM_DESC(synaptics_button_thres,
  +                Y coordinate threshold of integrated buttons on 
  Synaptics 
  +                devices);
  +
   /*
   *     Stuff we need even when we do not want native Synaptics support
   /
  @@ -463,6 +469,10 @@ static void synaptics_parse_hw_state(unsigned char 
  buf[], struct synaptics_data
         }
   }
 
  +#define TOUCH_OVER_BUTTON(hw) (((hw).left || (hw).middle || (hw).right) 
   \
  +                              (YMAX_NOMINAL + YMIN_NOMINAL - (hw).y  \
  +                               synaptics_button_thresh))
  +
   /*
   *  called for each full received packet from the touchpad
   */
  @@ -477,7 +487,7 @@ static void synaptics_process_packet(struct psmouse 
  *psmouse)
         synaptics_parse_hw_state(psmouse-packet, priv, hw);
 
         if (SYN_MULTITOUCH(priv, hw)) {
  -               if (hw.z  0) {
  +               if (hw.z  0  !TOUCH_OVER_BUTTON(hw)) {
                         input_report_abs(dev, ABS_MT_POSITION_X, hw.x);
                         input_report_abs(dev, ABS_MT_POSITION_Y,
                                          YMAX_NOMINAL + YMIN_NOMINAL - 
  hw.y);
  @@ -509,6 +519,10 @@ static void synaptics_process_packet(struct psmouse 
  *psmouse)
                 return;
         }
 
  +       /* If touch occurs over depressed button, ignore it */
  +       if (TOUCH_OVER_BUTTON(hw))
  +               hw.z = 0;
  +
         if (hw.z  0) {
                 priv-num_fingers++;
                 finger_width = 5;
  --
  1.7.1
 
 

 I'm convinced now that clickpad style touchpads can't work without
 multi-touch and something like logic in xf86-input-multitouch.

 Actually Clickpad works without multi-touch patch.  With my patches to
 synaptics, it worked in some level.  There are many restrictions (e.g.
 pushing the button first then drag), though.


True, but if I understand the Synaptics hardware MT behavior (it sends
the actively moving finger in the higher-resolution packet regardless of
which finger touched first), then your patch will result in a jumpy cursor
on the X side, and that side would need patches that attempt to guess
which data is invalid and discard it.  I've worked on a few similar
patches to various xf86-input-* drivers and generally they've failed to
tell the difference between invalid packets and fast user movements.

The main point of my 3 options was to address the jumpy cursor in
xf86-input-* drivers that are not MT aware.  I think ABS_X/ABS_Y should
only be allowed to change meaning at points in time that are detectable,
so user space can account for it; specifically, the best such point is the
transition of BTN_TOOL_DOUBLETAP.

Assuming it's easy enough to support exact rules for when ABS_X/ABS_Y
changes meaning on the kernel side (which I think it probably is), I think
we should do it so that applications don't have to become MT-aware as the
official solution for jumpy cursors.
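
As a rough sketch of such a kernel-side rule (hypothetical pseudo-driver
code, not a real patch; st_contact, prev_num_fingers, contact_x/contact_y
and pick_primary_contact() are all made-up names): the single-touch
position keeps following one contact and only re-latches onto a different
one in the same report where BTN_TOOL_DOUBLETAP changes.

    /* Only switch which contact ABS_X/ABS_Y follows when the finger count
     * changes, i.e. at a transition user space can see via BTN_TOOL_*TAP. */
    if (num_fingers != priv->prev_num_fingers) {
        priv->st_contact = pick_primary_contact(priv);
        priv->prev_num_fingers = num_fingers;
    }

    input_report_abs(dev, ABS_X, priv->contact_x[priv->st_contact]);
    input_report_abs(dev, ABS_Y, priv->contact_y[priv->st_contact]);
    input_report_key(dev, BTN_TOOL_DOUBLETAP, num_fingers == 2);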

Chris
___
xorg-devel@lists.x.org

Re: [PATCH evdev 3/3] Add handling for tablets that do not send axes with tools (#29645)

2010-10-11 Thread Chris Bagwell
Reviewed-by: Chris Bagwell ch...@cnpbagwell.com

Hmmm, I guess xf86-input-wacom would need similar logic to handle the
same non-Wacom tablets.

But then again, with this series of changes xf86-input-evdev may be the
better choice for single-tool tablets.

Chris

On Sun, Oct 10, 2010 at 6:33 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 Some tablets send axis values, then EV_SYN, and in the next event the
 BTN_TOOL_PEN/BTN_TOUCH, etc. For these tablets, the cursor doesn't move as
 coordinates while not in proximity are ignored.

 Buffer coordinates received while out-of-proximity and if we get a proximity
 event without other coordinates, re-use the last ones received.

 X.Org Bug 29645 http://bugs.freedesktop.org/show_bug.cgi?id=29645

 Signed-off-by: Peter Hutterer peter.hutte...@who-t.net
 ---
  src/evdev.c |   56 
  src/evdev.h |    1 +
  2 files changed, 57 insertions(+), 0 deletions(-)

 diff --git a/src/evdev.c b/src/evdev.c
 index 634c174..0ef7170 100644
 --- a/src/evdev.c
 +++ b/src/evdev.c
 @@ -481,6 +481,60 @@ EvdevProcessProximityEvent(InputInfoPtr pInfo, struct 
 input_event *ev)
  }

  /**
 + * Proximity handling is rather weird because of tablet-specific issues.
 + * Some tablets, notably Wacoms, send a 0/0 coordinate in the same EV_SYN as
 + * the out-of-proximity notify. We need to ignore those, hence we only
 + * actually post valuator events when we're in proximity.
 + *
 + * Other tablets send the x/y coordinates, then EV_SYN, then the proximity
 + * event. For those, we need to remember x/y to post it when the proximity
 + * comes.
 + *
 + * If we're not in proximity and we get valuator events, remember that, they
 + * won't be posted though. If we move into proximity without valuators, use
 + * the last ones we got and let the rest of the code post them.
 + */
 +static int
 +EvdevProcessProximityState(InputInfoPtr pInfo)
 +{
 +    EvdevPtr pEvdev = pInfo-private;
 +    int prox_state = 0;
 +    int i;
 +
 +    /* no proximity change in the queue */
 +    if (!pEvdev-prox)
 +    {
 +        if (pEvdev-abs  !pEvdev-proximity)
 +            pEvdev-abs_prox = pEvdev-abs;
 +        return 0;
 +    }
 +
 +    for (i = 0; pEvdev-prox  i  pEvdev-num_queue; i++)
 +    {
 +        if (pEvdev-queue[i].type == EV_QUEUE_PROXIMITY)
 +        {
 +            prox_state = pEvdev-queue[i].val;
 +            break;
 +        }
 +    }
 +
 +    if ((prox_state  !pEvdev-proximity) ||
 +        (!prox_state  pEvdev-proximity))
 +    {
 +        /* We're about to go into/out of proximity but have no abs events
 +         * within the EV_SYN. Use the last coordinates we have. */
 +        if (!pEvdev-abs  pEvdev-abs_prox)
 +        {
 +            pEvdev-abs = pEvdev-abs_prox;
 +            pEvdev-abs_prox = 0;
 +        }
 +    }
 +
 +    pEvdev-proximity = prox_state;
 +    return 1;
 +}
 +
 +/**
  * Take a button input event and process it accordingly.
  */
  static void
 @@ -732,6 +786,8 @@ EvdevProcessSyncEvent(InputInfoPtr pInfo, struct 
 input_event *ev)
     int v[MAX_VALUATORS] = {};
     EvdevPtr pEvdev = pInfo-private;

 +    EvdevProcessProximityState(pInfo);
 +
     EvdevProcessValuators(pInfo, v, num_v, first_v);

     EvdevPostProximityEvents(pInfo, TRUE, num_v, first_v, v);
 diff --git a/src/evdev.h b/src/evdev.h
 index 08f3c13..af93d41 100644
 --- a/src/evdev.h
 +++ b/src/evdev.h
 @@ -133,6 +133,7 @@ typedef struct {

     int delta[REL_CNT];
     unsigned int abs, rel, prox;
 +    unsigned int abs_prox;  /* valuators posted while out of prox? */

     /* XKB stuff has to be per-device rather than per-driver */
  #if GET_ABI_MAJOR(ABI_XINPUT_VERSION)  5
 --
 1.7.2.3

 ___
 xorg-devel@lists.x.org: X.Org development
 Archives: http://lists.x.org/archives/xorg-devel
 Info: http://lists.x.org/mailman/listinfo/xorg-devel

___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH evdev 1/3] Rename evdev-tool to evdev-proximity.

2010-10-11 Thread Chris Bagwell
Reviewed-by: Chris Bagwell ch...@cnpbagwell.com

On Sun, Oct 10, 2010 at 6:33 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 evdev doesn't care about the actual tool used, only that it is used as an
 indicator for proximity. Rename the field accordingly to make the code more
 obvious to read.

 Signed-off-by: Peter Hutterer peter.hutte...@who-t.net
 ---
  src/evdev.c |   18 +-
  src/evdev.h |    2 +-
  2 files changed, 10 insertions(+), 10 deletions(-)

 diff --git a/src/evdev.c b/src/evdev.c
 index 1720f96..9e1fb10 100644
 --- a/src/evdev.c
 +++ b/src/evdev.c
 @@ -363,7 +363,7 @@ EvdevProcessValuators(InputInfoPtr pInfo, int 
 v[MAX_VALUATORS], int *num_v,

     /* convert to relative motion for touchpads */
     if (pEvdev-abs  (pEvdev-flags  EVDEV_RELATIVE_MODE)) {
 -        if (pEvdev-tool) { /* meaning, touch is active */
 +        if (pEvdev-proximity) {
             if (pEvdev-old_vals[0] != -1)
                 pEvdev-delta[REL_X] = pEvdev-vals[0] - pEvdev-old_vals[0];
             if (pEvdev-old_vals[1] != -1)
 @@ -414,11 +414,11 @@ EvdevProcessValuators(InputInfoPtr pInfo, int 
 v[MAX_VALUATORS], int *num_v,
      * pressed.  On wacom tablets, this means that the pen is in
      * proximity of the tablet.  After the pen is removed, BTN_TOOL_PEN is
      * released, and a (0, 0) absolute event is generated.  Checking
 -     * pEvdev-tool here, lets us ignore that event.  pEvdev is
 +     * pEvdev-proximity here lets us ignore that event.  pEvdev is
      * initialized to 1 so devices that doesn't use this scheme still
      * just works.
      */
 -    else if (pEvdev-abs  pEvdev-tool) {
 +    else if (pEvdev-abs  pEvdev-proximity) {
         memcpy(v, pEvdev-vals, sizeof(int) * pEvdev-num_vals);

         if (pEvdev-swap_axes) {
 @@ -591,7 +591,7 @@ EvdevProcessKeyEvent(InputInfoPtr pInfo, struct 
 input_event *ev)
         case BTN_TOOL_FINGER:
         case BTN_TOOL_MOUSE:
         case BTN_TOOL_LENS:
 -            pEvdev-tool = value ? ev-code : 0;
 +            pEvdev-proximity = value ? ev-code : 0;
             break;

         case BTN_TOUCH:
 @@ -636,11 +636,11 @@ EvdevPostAbsoluteMotionEvents(InputInfoPtr pInfo, int 
 num_v, int first_v,
      * pressed.  On wacom tablets, this means that the pen is in
      * proximity of the tablet.  After the pen is removed, BTN_TOOL_PEN is
      * released, and a (0, 0) absolute event is generated.  Checking
 -     * pEvdev-tool here, lets us ignore that event.  pEvdev-tool is
 +     * pEvdev-proximity here lets us ignore that event. pEvdev-proximity is
      * initialized to 1 so devices that don't use this scheme still
      * just work.
      */
 -    if (pEvdev-abs  pEvdev-tool) {
 +    if (pEvdev-abs  pEvdev-proximity) {
         xf86PostMotionEventP(pInfo-dev, TRUE, first_v, num_v, v + first_v);
     }
  }
 @@ -662,7 +662,7 @@ static void EvdevPostQueuedEvents(InputInfoPtr pInfo, int 
 num_v, int first_v,
             break;
         case EV_QUEUE_BTN:
  #if GET_ABI_MAJOR(ABI_XINPUT_VERSION) = 11
 -            if (pEvdev-abs  pEvdev-tool) {
 +            if (pEvdev-abs  pEvdev-proximity) {
                 xf86PostButtonEventP(pInfo-dev, 1, pEvdev-queue[i].key,
                                      pEvdev-queue[i].val, first_v, num_v,
                                      v + first_v);
 @@ -2112,10 +2112,10 @@ EvdevPreInit(InputDriverPtr drv, InputInfoPtr pInfo, 
 int flags)
         goto error;

     /*
 -     * We initialize pEvdev-tool to 1 so that device that doesn't use
 +     * We initialize pEvdev-proximity to 1 so that device that doesn't use
      * proximity will still report events.
      */
 -    pEvdev-tool = 1;
 +    pEvdev-proximity = 1;

     /* Grabbing the event device stops in-kernel event forwarding. In other
        words, it disables rfkill and the Macintosh mouse button emulation.
 diff --git a/src/evdev.h b/src/evdev.h
 index ce7f5f8..b382670 100644
 --- a/src/evdev.h
 +++ b/src/evdev.h
 @@ -124,7 +124,7 @@ typedef struct {
     int old_vals[MAX_VALUATORS]; /* Translate absolute inputs to relative */

     int flags;
 -    int tool;
 +    int proximity;
     int num_buttons;            /* number of buttons */
     BOOL swap_axes;
     BOOL invert_x;
 --
 1.7.2.3

 ___
 xorg-devel@lists.x.org: X.Org development
 Archives: http://lists.x.org/archives/xorg-devel
 Info: http://lists.x.org/mailman/listinfo/xorg-devel

___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH 1/3] Input: synaptics - add multitouch support

2010-10-10 Thread Chris Bagwell

On 10/08/2010 09:57 AM, Chase Douglas wrote:

Newer Synaptics devices support multitouch. It appears there is no touch
tracking, so the non-slotted protocol is used.

Multitouch mode is disabled by default because it makes click-and-drag
on touchpads with integrated buttons even more difficult than it already
is. Maybe if/when the X synaptics input module works around this issue
we can enable it by default.


I don't have access to a clickpad and I'm trying to understand its
unique issues better.  Can you give a little more information on how the
X synaptics driver behaves differently with MT enabled compared to how it
behaves with MT disabled?


On non-clickpads, the X/Y will always track close to the first finger
touch.  If clickpads continue this behaviour in non-MT mode then I'd
assume click-and-drag will only work if you touch the drag finger before
the click finger.  If you click first, then at best I'd expect extremely
slow movement, since it tracks close to but not exactly at the first finger.


Does MT mode change the behaviour?  Your patch #3 description sounds like
the non-MT packet always tracks the moving finger, so it is constantly
swapping which finger it means.  Offhand, I'd think that helps the
click-and-drag issue although it creates others.


As an example of what issues it creates, I'd expect xf86-input-synaptics to
go crazy with cursor jumps when its 2-finger gestures are turned off and
you randomly touch an extra finger to the touchpad, since the meaning of
ABS_X/ABS_Y changes without any warning to it (and it doesn't understand
MT right now).


I agree with the other comments that we want to avoid options as much as 
possible.




Credit goes to Tobyn Bertram for reverse engineering the protocol.

Reported-by: Tobyn Bertram
Signed-off-by: Chase Douglaschase.doug...@canonical.com
---
  drivers/input/mouse/synaptics.c |   78 +++
  drivers/input/mouse/synaptics.h |3 +
  2 files changed, 73 insertions(+), 8 deletions(-)

diff --git a/drivers/input/mouse/synaptics.c b/drivers/input/mouse/synaptics.c
index 96b70a4..990598f 100644
--- a/drivers/input/mouse/synaptics.c
+++ b/drivers/input/mouse/synaptics.c
@@ -44,6 +44,10 @@
  #define YMIN_NOMINAL 1408
  #define YMAX_NOMINAL 4448

+static bool synaptics_multitouch;
+module_param(synaptics_multitouch, bool, 0644);
+MODULE_PARM_DESC(synaptics_multitouch,
+Enable multitouch mode on Synaptics devices);

  /*
   *Stuff we need even when we do not want native Synaptics support
@@ -279,6 +283,22 @@ static void synaptics_set_rate(struct psmouse *psmouse, 
unsigned int rate)
synaptics_mode_cmd(psmouse, priv-mode);
  }

+static void synaptics_set_multitouch_mode(struct psmouse *psmouse)
+{
+   static unsigned char param = 0xc8;
+   struct synaptics_data *priv = psmouse-private;
+
+   if (!SYN_CAP_MULTITOUCH(priv-ext_cap_0c) || !synaptics_multitouch)
+   return;
+   if (psmouse_sliced_command(psmouse, SYN_QUE_MODEL))
+   return;
+   if (ps2_command(psmouse-ps2dev,param, PSMOUSE_CMD_SETRATE))
+   return;
+
+   priv-multitouch = 1;
+   printk(KERN_INFO Synaptics: Multitouch mode enabled\n);
+}
+
  /*
   *Synaptics pass-through PS/2 port support
   /
@@ -362,18 +382,30 @@ static void synaptics_parse_hw_state(unsigned char buf[], 
struct synaptics_data
memset(hw, 0, sizeof(struct synaptics_hw_state));

if (SYN_MODEL_NEWABS(priv-model_id)) {
-   hw-x = (((buf[3]  0x10)  8) |
-((buf[1]  0x0f)  8) |
-buf[4]);
-   hw-y = (((buf[3]  0x20)  7) |
-((buf[1]  0xf0)  4) |
-buf[5]);
-
-   hw-z = buf[2];
hw-w = (((buf[0]  0x30)  2) |
 ((buf[0]  0x04)  1) |
 ((buf[3]  0x04)  2));

+   if (SYN_MULTITOUCH(priv, hw)) {
+   /* Multitouch data is half normal resolution */
+   hw-x = (((buf[4]  0x0f)  8) |
+buf[1])  1;
+   hw-y = (((buf[4]  0xf0)  4) |
+buf[2])  1;
+
+   hw-z = ((buf[3]  0x30) |
+(buf[5]  0x0f))  1;
+   } else {
+   hw-x = (((buf[3]  0x10)  8) |
+((buf[1]  0x0f)  8) |
+buf[4]);
+   hw-y = (((buf[3]  0x20)  7) |
+((buf[1]  0xf0)  4) |
+buf[5]);
+
+   hw-z = buf[2];
+   }
+
hw-left  = (buf[0]  0x01) ? 1 : 0;


Re: [PATCH 2/3] Input: synaptics - add multitouch multifinger support

2010-10-10 Thread Chris Bagwell

On 10/08/2010 09:57 AM, Chase Douglas wrote:

Newer multitouch Synaptics trackpads do not advertise multifinger
support. Now that we have multitouch support, we can use the number of
touches to report multifinger functionality.

In conjunction with the X synaptics input module, this enables
functionality such as two finger scrolling.

Signed-off-by: Chase Douglaschase.doug...@canonical.com
---
  drivers/input/mouse/synaptics.c |   24 +---
  drivers/input/mouse/synaptics.h |1 +
  2 files changed, 14 insertions(+), 11 deletions(-)

diff --git a/drivers/input/mouse/synaptics.c b/drivers/input/mouse/synaptics.c
index 990598f..7289d88 100644
--- a/drivers/input/mouse/synaptics.c
+++ b/drivers/input/mouse/synaptics.c
@@ -471,7 +471,6 @@ static void synaptics_process_packet(struct psmouse 
*psmouse)
struct input_dev *dev = psmouse-dev;
struct synaptics_data *priv = psmouse-private;
struct synaptics_hw_state hw;
-   int num_fingers;
int finger_width;
int i;

@@ -483,6 +482,7 @@ static void synaptics_process_packet(struct psmouse 
*psmouse)
input_report_abs(dev, ABS_MT_POSITION_Y,
 YMAX_NOMINAL + YMIN_NOMINAL - hw.y);
input_report_abs(dev, ABS_MT_PRESSURE, hw.z);
+   priv-num_fingers++;
}

input_mt_sync(dev);
@@ -510,13 +510,13 @@ static void synaptics_process_packet(struct psmouse 
*psmouse)
}

if (hw.z  0) {
-   num_fingers = 1;
+   priv-num_fingers++;


In this area of code, it's not as obvious that you're relying on MT
packets always coming before standard packets.  I think it's worth a
comment here or below on why you're initialising priv->num_fingers at the
bottom of processing instead of at the top.


It will also help explain to the reader why the mt_sync events work out as
expected.
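
Something along these lines is roughly what such a comment could say (the
wording is only a suggestion):

    /*
     * num_fingers is accumulated across the whole report instead of being
     * reset at the top of this function: the MT packet of a report is parsed
     * before the standard packet, so the touch it carries has already been
     * counted by the time the standard packet is processed.  It is only
     * reset to 0 after the events have been posted.
     */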


finger_width = 5;
if (SYN_CAP_EXTENDED(priv-capabilities)) {
switch (hw.w) {
case 0 ... 1:
if (SYN_CAP_MULTIFINGER(priv-capabilities))
-   num_fingers = hw.w + 2;
+   priv-num_fingers = hw.w + 2;
break;
case 2:
if (SYN_MODEL_PEN(priv-model_id))
@@ -528,10 +528,8 @@ static void synaptics_process_packet(struct psmouse 
*psmouse)
break;
}
}
-   } else {
-   num_fingers = 0;
+   } else
finger_width = 0;
-   }

/* Post events
 * BTN_TOUCH has to be first as mousedev relies on it when doing
@@ -555,15 +553,19 @@ static void synaptics_process_packet(struct psmouse 
*psmouse)
if (SYN_CAP_PALMDETECT(priv-capabilities))
input_report_abs(dev, ABS_TOOL_WIDTH, finger_width);

-   input_report_key(dev, BTN_TOOL_FINGER, num_fingers == 1);
+   input_report_key(dev, BTN_TOOL_FINGER, priv-num_fingers == 1);
input_report_key(dev, BTN_LEFT, hw.left);
input_report_key(dev, BTN_RIGHT, hw.right);

-   if (SYN_CAP_MULTIFINGER(priv-capabilities)) {
-   input_report_key(dev, BTN_TOOL_DOUBLETAP, num_fingers == 2);
-   input_report_key(dev, BTN_TOOL_TRIPLETAP, num_fingers == 3);
+   if (SYN_CAP_MULTIFINGER(priv-capabilities) || priv-multitouch) {
+   input_report_key(dev, BTN_TOOL_DOUBLETAP,
+priv-num_fingers == 2);
+   input_report_key(dev, BTN_TOOL_TRIPLETAP,
+priv-num_fingers == 3);
}

+   priv-num_fingers = 0;
+
if (SYN_CAP_MIDDLE_BUTTON(priv-capabilities))
input_report_key(dev, BTN_MIDDLE, hw.middle);

@@ -674,7 +676,7 @@ static void set_input_params(struct input_dev *dev, struct 
synaptics_data *priv)
__set_bit(BTN_LEFT, dev-keybit);
__set_bit(BTN_RIGHT, dev-keybit);

-   if (SYN_CAP_MULTIFINGER(priv-capabilities)) {
+   if (SYN_CAP_MULTIFINGER(priv-capabilities) || priv-multitouch) {
__set_bit(BTN_TOOL_DOUBLETAP, dev-keybit);
__set_bit(BTN_TOOL_TRIPLETAP, dev-keybit);
}
diff --git a/drivers/input/mouse/synaptics.h b/drivers/input/mouse/synaptics.h
index 5126c9c..0989b8d 100644
--- a/drivers/input/mouse/synaptics.h
+++ b/drivers/input/mouse/synaptics.h
@@ -113,6 +113,7 @@ struct synaptics_data {
unsigned char mode; /* current mode byte */
int scroll;
int multitouch; /* Whether device provides MT */
+   unsigned int num_fingers;   /* Number of fingers touching */
  };

  void synaptics_module_init(void);


___

Re: [PATCH 3/3] Input: synaptics - remove touches over button click area

2010-10-10 Thread Chris Bagwell

On 10/08/2010 09:58 AM, Chase Douglas wrote:

Now that we have proper multitouch support, we can handle integrated
buttons better. If we know the top of the buttons on the touchpad, we
can ignore any touches that occur within the touchpad area while a
button is clicked. It may be possible to get the button area by querying
the device, but for now allow the user to manually set it.

A note on why this works: the Synaptics touchpads have pseudo touch
tracking. When two touches are on the touchpad, an MT touch packet with
just the X, Y, and pressure values is sent before a normal Synaptics
touch packet. When one touch is obviously in motion and the other is
stationary, the touchpad controller sends the touch in motion in the
normal packet and the stationary touch in the MT packet. Single touch
emulation is provided by the normal packet, so an action like clicking
a button and dragging with another finger still works as expected.

Tested on a Dell Mini 1012 with synaptics_multitouch=1 and
synaptics_button_thresh=4100.



Even if we did not submit the MT logic, I'd go a totally different
direction and move clickpad button-press support fully to
xf86-input-synaptics, and I'd remove the logic from the kernel side that
maps the HW's middle button to the left button.  It seems to just be
limping along with single-button support anyway.


I haven't had time to review Takashi's just-sent xf86-input-synaptics
patches yet, but they seem to be along this line of thinking as well.


Chris
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: Synaptics style multi-touch and cursor jumps

2010-08-20 Thread Chris Bagwell
On Wed, Aug 18, 2010 at 10:03 PM, Chris Bagwell ch...@cnpbagwell.com wrote:
 Suggested xf86-input-synaptics behaviour change:

 It's not a terrible problem but I think it could stand to be improved.  I
 propose the following change to help further get rid of jumps:

 Start tracking the previous finger count.  Modify ComputeDelta() to be
 even more aggressive in resetting delta history on finger transitions.

 * When previousNumFingers  numFingers then require 4 new samples to
 debounce theoretical jumps in #2 on old Synaptics hw and allow
 flexibility of newer hardware as well.  Low priority and debatable.

 * When previousNumFingers  numFingers then require 4 new samples to
 debounce the real jump in #1 from real Synaptics hw and presumably most
 other HW.  High priority since I can see this easily, and it's why I
 started looking into the basic issue.

 I've tested this change and it seems to fix my use case #1.  Some
 additional issue still exists that allows jumps when I touch two
 extremes of the touchpad and lift only the 1st finger.  Probably
 unrelated to the above, I'm guessing.

 Chris


I've found the source of the cursor jump that still exists after the
patch.  My hw needs 2-finger emulation based on width.  It seems that when
I touch the extremes, internally it probably looks like 2 half fingers
adding up to 1 whole finger.  So the code can do no filtering based on
multi-touch concepts in this one case.

I'll keep an eye out and see if the jump occurs in real usage, but will
ignore it otherwise.  I'd prefer not to revisit discarding data based on
delta sizes.
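
A minimal sketch of the debounce being proposed in the quoted email above
(the prev_num_fingers field and the 4-sample constant are illustrative
only, not the driver's actual code):

    #define FINGER_TRANSITION_DEBOUNCE 4   /* samples to skip after a change */

    /* Reset the motion history whenever the reported finger count changes,
     * then refuse to move the pointer until a few fresh samples arrive. */
    if (priv->prev_num_fingers != hw->numFingers) {
        priv->count_packet_finger = 0;
        priv->prev_num_fingers = hw->numFingers;
    }
    if (priv->count_packet_finger < FINGER_TRANSITION_DEBOUNCE) {
        priv->count_packet_finger++;
        dx = dy = 0;                 /* too close to a transition */
    }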
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel


Re: [PATCH synaptics 1/2] Added friction physics so coasting can stop on its own.

2010-08-19 Thread Chris Bagwell
On Thu, Aug 19, 2010 at 8:13 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 From: Patrick Curran pjcur...@wisc.edu

 When you are coasting (but not corner coasting) you might want the
 scrolling to slow down and stop on its own.  This also lets you
 start coasting while using a two finger scroll.

 Signed-off-by: Patrick Curran pjcur...@wisc.edu
 Reviewed-by: Peter Hutterer peter.hutte...@who-t.net
 Tested-by: Peter Hutterer peter.hutte...@who-t.net
 Signed-off-by: Peter Hutterer peter.hutte...@who-t.net
 ---
  include/synaptics-properties.h |    2 +-
  man/synaptics.man              |   11 +--
  src/properties.c               |   13 +++--
  src/synaptics.c                |   34 +++---
  src/synapticsstr.h             |    1 +
  tools/synclient.c              |    1 +
  6 files changed, 46 insertions(+), 16 deletions(-)

 diff --git a/include/synaptics-properties.h b/include/synaptics-properties.h
 index cf330d8..9c6a2ee 100644
 --- a/include/synaptics-properties.h
 +++ b/include/synaptics-properties.h
 @@ -130,7 +130,7 @@
  /* 32 bit, 2 values, width, z */
  #define SYNAPTICS_PROP_PALM_DIMENSIONS Synaptics Palm Dimensions

 -/* FLOAT */
 +/* FLOAT, 2 values, speed, friction */
  #define SYNAPTICS_PROP_COASTING_SPEED Synaptics Coasting Speed

  /* 32 bit, 2 values, min, max */
 diff --git a/man/synaptics.man b/man/synaptics.man
 index 590a380..b268a23 100644
 --- a/man/synaptics.man
 +++ b/man/synaptics.man
 @@ -397,10 +397,17 @@ Minimum finger pressure at which touch is considered a 
 palm. Property:
  Synaptics Palm Dimensions
  .TP
  .BI Option \*qCoastingSpeed\*q \*q float \*q
 -Coasting threshold scrolling speed.
 +Your finger needs to produce this many scrolls per second in order to start
 +coasting.  The default is 20 which should prevent you from starting coasting
 +unintentionally.
  .
  0 disables coasting. Property: Synaptics Coasting Speed
  .TP
 +.BI Option \*qCoastingFriction\*q \*q float \*q
 +Number of scrolls per second per second to decrease the coasting speed.  
 Default

   ^ typo?

My only comment.
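
For what it's worth, the friction behaviour the patch describes boils down
to something like this per motion interval (a rough illustration with
made-up names, not the driver's actual code):

    /* Decay the coasting speed (scrolls/sec) by friction (scrolls/sec^2). */
    coast_speed -= coasting_friction * dtime;
    if (coast_speed <= 0.0)
        coast_speed = 0.0;           /* coasting has come to a stop */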
___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel