Hi Robert,

On 22.11.10 17:11, Robert Osfield wrote:
> What happens with the touchDrag and touchEnded? Do these relate at
> all to the individual phase of each touch? It seems to me that there
> is an overlap of definition between the whole multi-touch event and
> the individual event.

It's the same logic as with touchBegan. With my approach it works as follows: if you touch with two fingers at the same time you'll get one PUSH event with two touch-points. If you drag one or both fingers you'll get one DRAG event. If you touch the surface with one finger and after a short while with a second finger you'll get two PUSH events, the first carrying one touch-point, the second carrying two touch-points (see the sketch below).
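To make that concrete, here is a rough sketch of how an event handler could walk the touch-points attached to such an event. The accessor names (isMultiTouchEvent(), getTouchData(), getNumTouchPoints(), get()) and the TouchPoint fields are assumptions for illustration, not necessarily the exact interface of the submission:

#include <osgGA/GUIEventHandler>
#include <iostream>

// Illustrative only: method and enum names below are assumed, not taken
// verbatim from the submitted code.
class TouchLogger : public osgGA::GUIEventHandler
{
public:
    virtual bool handle(const osgGA::GUIEventAdapter& ea,
                        osgGA::GUIActionAdapter& /*aa*/)
    {
        // Only PUSH/DRAG/RELEASE events are generated; the per-finger
        // detail lives in the attached touch data.
        switch (ea.getEventType())
        {
            case osgGA::GUIEventAdapter::PUSH:
            case osgGA::GUIEventAdapter::DRAG:
            case osgGA::GUIEventAdapter::RELEASE:
                if (ea.isMultiTouchEvent())
                {
                    const osgGA::GUIEventAdapter::TouchData* td = ea.getTouchData();
                    for (unsigned int i = 0; i < td->getNumTouchPoints(); ++i)
                    {
                        osgGA::GUIEventAdapter::TouchData::TouchPoint tp = td->get(i);
                        std::cout << "touch " << tp.id
                                  << " phase " << tp.phase
                                  << " at (" << tp.x << ", " << tp.y << ")"
                                  << std::endl;

                        // The redundancy mentioned below: the whole event is a
                        // PUSH, while a newly placed finger also reports a
                        // began-like phase of its own.
                        if (tp.phase == osgGA::GUIEventAdapter::TOUCH_BEGAN)
                            std::cout << "  finger " << tp.id << " just went down" << std::endl;
                    }
                }
                return false; // don't swallow the event

            default:
                return false;
        }
    }
};

So the event type tells a handler the coarse step (PUSH/DRAG/RELEASE, which keeps the existing mouse path working), while each touch-point carries its own id and phase so individual fingers can still be tracked.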
There's some redundancy (PUSH vs. TOUCH_BEGAN, etc.), but implementing a group of GUIEventAdapters or similar approaches seemed too difficult to manage and to understand, as every GUIEventAdapter has so many more properties totally unrelated to a touch-event. I wanted to keep it clean and simple.

> I do wonder if the multi-touch event list might be part of the global
> event state rather than a specific event. I don't have any answers,
> I'm just trying to get my head around things.

You'd lose mouse compatibility when using multi-touch devices, and you'd have to implement all the manipulators from the ground up, because they are so mouse-centric. With the current approach you at least have a small chance of extending them.

Just my 2 cents. If you all come to the conclusion that my approach is not the best, I'm happy to code a better replacement with some guidance :)

cheers,
Stephan
