On Tuesday July 1, [EMAIL PROTECTED] wrote:
> Hi,
> 
> I was trying to teach mousedev to bind to the new synaptics driver.  This
> may be useful for gpm and other programs that don't have a native
> event-processing module written yet.  Unfortunately the absolute-to-relative
> conversion in mousedev only suits tablets (digitizers) and not touchpads,
> because:
> 
> - contact with a touchpad is not continuous; when I take my finger off the
>   touchpad and then touch it again somewhere else I do not expect the cursor
>   to jump anywhere, I only expect the cursor to move when I move my finger
>   while pressing the touchpad.

Yep, that's a bug.  mousedev needs to know when there is or isn't a
finger and ignore absolute differences when there is no finger.
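Something along these lines, roughly (just a sketch of the idea, not actual
mousedev code; the struct and names are made up for illustration):

	/* Sketch: turn absolute touchpad samples into relative motion,
	 * suppressing the jump that the first sample of a new touch
	 * would otherwise produce.  Names are illustrative only. */
	struct abs_to_rel {
		int touching;		/* finger currently down? */
		int last_x, last_y;	/* previous absolute position */
	};

	static void abs_sample(struct abs_to_rel *s, int touch,
			       int x, int y, int *dx, int *dy)
	{
		*dx = *dy = 0;

		if (!touch) {		/* finger lifted: forget position */
			s->touching = 0;
			return;
		}
		if (s->touching) {
			*dx = x - s->last_x;
			*dy = -(y - s->last_y);	/* synaptics Y axis is inverted */
		}
		/* the first sample of a new touch generates no motion */
		s->touching = 1;
		s->last_x = x;
		s->last_y = y;
	}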

> - the synaptics driver has its Y axis reversed from what mousedev expects.
> 
> I tried to modify mousedev to account for the differences between touchpads
> in absolute mode and digitizers in absolute mode, but all my solutions
> required ugly flags - brrr... So what if we:
> 
> 1. Modify mousedev so that if an input device announces that it generates
>    both relative and absolute events, mousedev will discard all absolute-axis
>    events and rely on the device-supplied relative events.

Nah.  I have an ALPS dualpoint which generates ABSolute events for the
touchpad part and RElative events for the pointstick part.  I want
them both.

> 2. Add absolute->relative conversion code to the touchpad drivers themselves,
>    as the drivers should know best how to do that.  If the conversions turn
>    out to be similar across different touchpads, a common module could be
>    made.
> 
> What do you think?

I think that mousedev should be just clever enough to mostly work and
no cleverer.  Anything more interesting should be done in user-space.

For example, there is the idea of treating one edge of a touchpad
like a scroll wheel.  I don't think this should be done in the
kernel.
I have just started working on a program to do this.  It opens
/dev/input/event1 (in my case) to read events from the touchpad.  It
uses an ioctl to "grab" the device so mousedev doesn't see any events
directly.
Then it opens /dev/input/uinput and registers itself as a mouse-like
device.  It copies events from one to the other with some translation.
It will eventually do all the magic abs->rel translations that I want
and notice corner-taps and such.  At the moment it only does some
simple mapping of button events so that I can use one of my two 'left'
buttons as a 'middle' button.
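
In outline, the translator looks something like this (very much a sketch:
error handling is omitted, the device path is specific to my machine, and
the button code I remap is only an example):

	/* Grab the touchpad's event device and re-inject translated
	 * events through uinput as a new mouse-like device. */
	#include <fcntl.h>
	#include <unistd.h>
	#include <sys/ioctl.h>
	#include <linux/input.h>
	#include <linux/uinput.h>

	int main(void)
	{
		struct uinput_user_dev dev = { .name = "touchpad-translator" };
		struct input_event ev;
		int in  = open("/dev/input/event1", O_RDONLY);
		int out = open("/dev/input/uinput", O_WRONLY);

		ioctl(in, EVIOCGRAB, 1);		/* hide events from mousedev */

		ioctl(out, UI_SET_EVBIT, EV_KEY);	/* describe a mouse-like device */
		ioctl(out, UI_SET_EVBIT, EV_REL);
		ioctl(out, UI_SET_RELBIT, REL_X);
		ioctl(out, UI_SET_RELBIT, REL_Y);
		ioctl(out, UI_SET_KEYBIT, BTN_LEFT);
		ioctl(out, UI_SET_KEYBIT, BTN_MIDDLE);
		ioctl(out, UI_SET_KEYBIT, BTN_RIGHT);
		write(out, &dev, sizeof(dev));
		ioctl(out, UI_DEV_CREATE, 0);

		while (read(in, &ev, sizeof(ev)) == sizeof(ev)) {
			/* example translation: treat the second 'left'
			 * button (BTN_0 here, just a guess) as 'middle' */
			if (ev.type == EV_KEY && ev.code == BTN_0)
				ev.code = BTN_MIDDLE;
			/* abs->rel conversion and corner-tap detection
			 * would slot in here as well */
			write(out, &ev, sizeof(ev));
		}
		return 0;
	}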

I actually think that it should be possible to do everything in
user-space.  I find it quite awkward that I *cannot* get a clear
channel from a user-space program to the PS/2 aux port like I could in
2.4 via /dev/psaux.  I would really like to be able to open some
device, "GRAB" it, and then have complete control of the PS/2 AUX
port.  Then I could similarly open /dev/input/uinput and do exactly
what translations I wanted.

NeilBrown

