On 2/3/2011 1:34 AM, Stephan Huber wrote:
> (sorry for the previous truncated mail, hit the send button by mistake)
> please have a look at the thread on osg-submissions where I explain the
> details and concepts of multi-touch to Robert:
> http://forum.openscenegraph.org/viewtopic.php?t=7137
Sorry I missed that. (I don't generally search the submissions list for
design/usage discussion, though I know that type of discussion often occurs
there, and have been guilty of it myself!)
This quote from you was the key piece of information I needed:
> In addition to the TouchData structure, the helper methods in
> EventQueue simulate, for the first touch point, a mouse
> PUSH/DRAG/RELEASE with button 1. So, if you are only using one finger,
> you'll get the same events as when using a one-button mouse (without
> the MOVE events and with a populated TouchData object attached to the
> GUIEventAdapter).
So the apparent confusion over event type and phase is simply to facilitate
automatic generation of mouse-like events. OK.
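For concreteness, here's a minimal sketch of what I understand the consumer
side to look like: a GUIEventHandler that receives the synthesized button-1
events and inspects the attached per-point touch state. The accessor names
(isMultiTouchEvent(), getTouchData(), getNumTouchPoints(), get()) are my
reading of the GUIEventAdapter header -- correct me if I have them wrong:

// Sketch only -- my understanding of the consumer side. A single-finger
// gesture should arrive here as PUSH/DRAG/RELEASE with button 1 and a
// TouchData object attached.
#include <osgGA/GUIEventHandler>
#include <iostream>

class TouchLogger : public osgGA::GUIEventHandler
{
public:
    virtual bool handle(const osgGA::GUIEventAdapter& ea,
                        osgGA::GUIActionAdapter&)
    {
        switch (ea.getEventType())
        {
            case osgGA::GUIEventAdapter::PUSH:
            case osgGA::GUIEventAdapter::DRAG:
            case osgGA::GUIEventAdapter::RELEASE:
                // Touch-generated mouse events carry per-point state.
                if (ea.isMultiTouchEvent())
                {
                    const osgGA::GUIEventAdapter::TouchData* td = ea.getTouchData();
                    for (unsigned int i = 0; i < td->getNumTouchPoints(); ++i)
                    {
                        const osgGA::GUIEventAdapter::TouchData::TouchPoint tp = td->get(i);
                        std::cout << "touch " << tp.id << " phase " << tp.phase
                                  << " at (" << tp.x << ", " << tp.y << ")\n";
                    }
                }
                return false; // don't consume; other handlers may want it
            default:
                return false;
        }
    }
};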
Question: Suppose we have this situation (sketched in code after the list below):
- Touch point 1 starts (we call touchBegan)
- Touch point 2 starts while point 1 moves (we call touchMoved, with different
phases for each point)
- Touch point 1 ends but touch point 2 continues to move (we call touchEnded,
again with different phases for the two points).
- Touch point 2 continues to move. How do we send this event? Clearly we can't
call touchMoved, because (if I understand you correctly) that would synthesize
a button-1 DRAG, and this motion has nothing to do with mouse button 1.
- Touch point 2 ends -- same question, how do we send that event?
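To make the question concrete, here's how I'd expect the producer side to
queue that sequence, using the EventQueue helpers and addTouchPoint() as you
described them. Coordinates and ids are made up, and the last two steps are
exactly where I'm stuck:

// Sketch only -- queueing the scenario above with the EventQueue helpers.
#include <osgGA/EventQueue>

void dispatchScenario(osgGA::EventQueue& queue)
{
    typedef osgGA::GUIEventAdapter GEA;

    // Touch point 1 starts.
    queue.touchBegan(1, GEA::TOUCH_BEGAN, 100.0f, 100.0f);

    // Touch point 2 starts while point 1 moves: one event, two phases.
    osgGA::GUIEventAdapter* event =
        queue.touchMoved(1, GEA::TOUCH_MOVED, 110.0f, 100.0f);
    event->addTouchPoint(2, GEA::TOUCH_BEGAN, 200.0f, 200.0f);

    // Touch point 1 ends while point 2 keeps moving.
    event = queue.touchEnded(1, GEA::TOUCH_ENDED, 120.0f, 100.0f, 1);
    event->addTouchPoint(2, GEA::TOUCH_MOVED, 210.0f, 200.0f);

    // Touch point 2 continues to move. touchMoved() would synthesize a
    // button-1 DRAG, but "button 1" (point 1) was already released:
    // queue.touchMoved(2, GEA::TOUCH_MOVED, 220.0f, 200.0f); // ???

    // Touch point 2 ends -- same question for touchEnded():
    // queue.touchEnded(2, GEA::TOUCH_ENDED, 220.0f, 200.0f, 1); // ???
}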
(The whole implicit support for mouse events seems not just confusing but
unnecessary. Presumably apps that want to emulate the mouse would just add mouse
events at the time they add multi-touch events. Just my opinion.)
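For example, I'd imagine something like this -- purely hypothetical, and
assuming touchBegan() queued a plain multi-touch event with no implicit
mouse emulation:

// Hypothetical sketch of the alternative: the windowing code decides
// whether to emulate the mouse, queueing an explicit mouse event next
// to the touch event. Assumes touchBegan() would no longer synthesize
// a PUSH on its own.
#include <osgGA/EventQueue>

void onNativeTouchDown(osgGA::EventQueue& queue,
                       unsigned int id, float x, float y)
{
    queue.touchBegan(id, osgGA::GUIEventAdapter::TOUCH_BEGAN, x, y);
    if (id == 1)
        queue.mouseButtonPress(x, y, 1); // first finger doubles as button 1
}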
Thanks,
-Paul