On 03/01/2010 03:14 PM, ext Daniel Stone wrote:
And from the NSTouch (OS X) class documentation:
     Touches do not have a corresponding screen location. The first touch
     of a touch collection is latched to the view underlying the cursor
     using the same hit detection as mouse events.  Additional touches on
     the same device are also latched to the same view as any other
     touching touches.  A touch remains latched to its view until the
     touch has either ended or is cancelled.

Very quickly, since I forgot to mention it in my original reply:

Bear in mind that this documentation is for Mac OS X, where current Apple hardware only supports multi-touch via track-pads. There is no touch-screen hardware running Mac OS X today; Apple's touch-screen devices are the iPhone, iPod touch, and the upcoming iPad. The iPhone and iPod touch use a different API than the above, UITouch instead of NSTouch, which actually does report on-screen locations (iirc).

The multi-foci discussions really only apply to multi-touch-capable touch screens, not to laptop track-pads or to external multi-touch-capable tablets like some (all?) of the Wacom Bamboo tablets.

--
Bradley T. Hughes (Nokia-D-Qt/Oslo), bradley.hughes at nokia.com
Sandakervn. 116, P.O. Box 4332 Nydalen, 0402 Oslo, Norway
_______________________________________________
xorg-devel mailing list
xorg-devel at lists.x.org
http://lists.x.org/mailman/listinfo/xorg-devel