On 2/7/2011 4:06 AM, Serge Lages wrote:
Hi all,

I am currently working on getting multi-touch working on Linux with MPX
(XInput2), and I would like to use this new system for my events. But with
XInput (and also with Windows 7), I am receiving all the events separately, so
what's the best approach to feed the touchesBegan/touchesMoved/touchesEnded
methods? Will I need to store each input state internally in the
GraphicsWindow class and set all the touch states each time?

Hi Serge -- I was never able to figure out how to use touchBegan / touchMoved / touchEnded. From the EventQueue.cpp source, each of these functions creates a GUIEventAdapter containing just a single TouchPoint, so as far as I can tell they can't be used when you want to send one event that carries multiple TouchPoints.

Instead, I created my own GUIEventAdapter, added each TouchPoint that my code detected, and called EventQueue::addEvent().

But, more to your question...

My project uses a Kinect to detect hands (more generally, "interactors"). My code correlates the interactors detected in the current frame with the interactors detected in the previous frame (by doing distance computations). So, much like you, I needed to store and track the interactors in my own internal data structure. Then I created a list of TouchPoints with the appropriate phases, and added them to a single GUIEventAdapter as described above, using EventQueue::addEvent rather than the multitouch convenience routines.
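In case it's useful, here is roughly what that frame-to-frame correlation can look like, sketched in plain C++ with the interactor and phase types stubbed out (none of this is my actual code; the names, the greedy nearest-match strategy, and the distance threshold are all illustrative):

```cpp
#include <cmath>
#include <vector>

// Touch phases, mirroring osgGA::GUIEventAdapter::TouchPhase.
enum Phase { BEGAN, MOVED, ENDED };

struct Interactor { unsigned int id; float x, y; Phase phase; };

// Match each interactor detected this frame to the nearest unmatched one
// from the previous frame (within maxDist). Matched interactors keep their
// id and become MOVED; unmatched new ones get a fresh id and BEGAN;
// previous-frame interactors that vanished emit a final ENDED.
std::vector<Interactor> correlate(const std::vector<Interactor>& prev,
                                  std::vector<Interactor> curr,
                                  float maxDist, unsigned int& nextId)
{
    std::vector<bool> prevMatched(prev.size(), false);
    for (std::size_t j = 0; j < curr.size(); ++j) {
        int best = -1;
        float bestDist = maxDist;
        for (std::size_t i = 0; i < prev.size(); ++i) {
            if (prevMatched[i]) continue;
            float d = std::hypot(curr[j].x - prev[i].x, curr[j].y - prev[i].y);
            if (d < bestDist) { bestDist = d; best = (int)i; }
        }
        if (best >= 0) {
            prevMatched[best] = true;
            curr[j].id = prev[best].id;
            curr[j].phase = MOVED;
        } else {
            curr[j].id = nextId++;
            curr[j].phase = BEGAN;
        }
    }
    // Previous interactors with no match this frame get one last ENDED point.
    for (std::size_t i = 0; i < prev.size(); ++i) {
        if (!prevMatched[i] && prev[i].phase != ENDED) {
            Interactor gone = prev[i];
            gone.phase = ENDED;
            curr.push_back(gone);
        }
    }
    return curr;
}
```

The result of correlate() is exactly the list you would then walk to call addTouchPoint() once per interactor on a single GUIEventAdapter.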

Hope that helps,
   -Paul
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
