On 2/3/2011 12:31 PM, Stephan Huber wrote:
No problem, perhaps we should add it to the wiki :)
Having it in the wiki wouldn't hurt. It's here now, along with your link
pointing to the osg-submissions discussion, so anyone scanning osg-users for
"multitouch" will find it.
You might consider commenting the EventQueue header: add an explanation of
why you would want to use each convenience routine, and explain how multitouch
overrides mouse events. I usually look at the header files for guidance first,
then the osg-users archives.
Question: Suppose we have this situation:
- Touch point 1 starts (we call touchBegan)
- Touch point 2 starts while point 1 moves (do we call touchBegan or
touchMoved, with different phases for each point?)
I'd call touchBegan, as a new touch began. (This is how the iOS
implementation works.)
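The scenario above can be sketched with a small self-contained model (the names
are illustrative, NOT the real osgGA API): each touch point carries its own
phase, and the event-level convenience call is chosen from those phases, with
any newly-began point making the whole event a touchBegan, as in the iOS
implementation described:

```cpp
#include <cassert>
#include <vector>

// Hypothetical model of the discussion above; these are not the real
// osgGA types. Each touch point carries its own phase, and the event
// as a whole is classified from those per-point phases.
enum TouchPhase { TOUCH_BEGAN, TOUCH_MOVED, TOUCH_ENDED };

struct TouchPoint
{
    unsigned int id;
    TouchPhase   phase;
    float        x, y;
};

// Choose which event-level convenience call to make: if any point just
// began, the whole event counts as a touchBegan, even though other
// points may simultaneously be in the MOVED phase.
TouchPhase eventPhase(const std::vector<TouchPoint>& points)
{
    for (const TouchPoint& p : points)
        if (p.phase == TOUCH_BEGAN) return TOUCH_BEGAN;
    for (const TouchPoint& p : points)
        if (p.phase == TOUCH_MOVED) return TOUCH_MOVED;
    return TOUCH_ENDED;
}
```

So for the scenario above, the event carrying {point 1: MOVED, point 2: BEGAN}
is dispatched as touchBegan, while each point still keeps its own phase in the
touch data.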
OK. I guess it's still kind of ambiguous, to me at least, which EventQueue
convenience method I would want to use. But I guess the bottom line is: What
type of mouse event would I also want to associate with this touch event? (Is
that correct?)
I know this all sounds a bit curious. Some background: there are two ways
to handle multiple touch points:
a) a stream of single events, where every event carries the state of one
touch point.
b) one event with all available touch-points encapsulated in a custom
data-structure
I used b), because it makes it easier to work with multi-touch events in
the event handlers.
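A minimal self-contained sketch of option b), one event object carrying all
current touch points in a shared data structure. The names loosely follow the
shape of osgGA::GUIEventAdapter::TouchData, but this is an illustrative model,
not the real API:

```cpp
#include <cassert>
#include <memory>
#include <vector>

// Illustrative stand-in for a touch-data container (not the real
// osgGA::GUIEventAdapter::TouchData).
struct TouchData
{
    struct TouchPoint
    {
        unsigned int id;
        int          phase;
        float        x, y;
        unsigned int tapCount;
    };

    void addTouchPoint(unsigned int id, int phase, float x, float y,
                       unsigned int tapCount)
    {
        _touches.push_back(TouchPoint{id, phase, x, y, tapCount});
    }

    unsigned int getNumTouchPoints() const
    {
        return static_cast<unsigned int>(_touches.size());
    }

    std::vector<TouchPoint> _touches;
};

struct Event
{
    // Lazily create the touch data: an event without it is a plain
    // (single-pointer) event; an event with it is a multi-touch event.
    void addTouchPoint(unsigned int id, int phase, float x, float y,
                       unsigned int tapCount = 0)
    {
        if (!_touchData) _touchData = std::make_shared<TouchData>();
        _touchData->addTouchPoint(id, phase, x, y, tapCount);
    }

    bool isMultiTouchEvent() const { return _touchData != nullptr; }

    std::shared_ptr<TouchData> _touchData;
};
```

An event handler then receives one event and walks all touch points at once,
instead of correlating a stream of per-point events itself.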
I, too, would go with option b. But I would've done it using a USER event and a
derived class, in a separate library, without changing core OSG, to allow
backwards compatibility with older versions of OSG.
And I would not have overloaded the mouse events with the multitouch events. A
mouse is a mouse and a touch is a touch, and the application generating the
events can generate one, or the other, or both, so that would've been a more
flexible way to do it, in my opinion. "Tools, not rules," as they say in the X
Window System world.
(The whole implicit support for mouse events seems not just confusing
but unnecessary. Presumably apps that want to emulate the mouse would
just add mouse events at the time they add multitouch events. Just my
opinion.)
The only "emulation" is that the _button member is set to 1.
...and the event type is set to the same type as you would expect for a mouse
event, thus overloading PUSH, DRAG, and RELEASE.
Perhaps we should make it optional, so developers can disable this
behavior. It's a nice helper for all the examples (you get single-touch
out of the box), and it does not interfere when you do only multi-touch,
since your event handler should check only for isMultiTouchEvent() and
work with the osgGA::GUIEventAdapter::TouchData.
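The dispatch pattern described here might look like the following in a handler.
This is a self-contained model with made-up types, standing in for checking
osgGA::GUIEventAdapter::isMultiTouchEvent() before interpreting
PUSH/DRAG/RELEASE as mouse input:

```cpp
#include <cassert>
#include <string>

// Model of the dispatch described above (made-up types, not the real
// osgGA classes): PUSH/DRAG/RELEASE may be genuine mouse events or
// emulated ones produced from single touches with the button set to 1,
// so a handler checks the multi-touch flag first.
enum EventType { PUSH, DRAG, RELEASE };

struct Event
{
    EventType type;
    bool      multiTouch;  // stands in for isMultiTouchEvent()
    int       button;      // stands in for the _button member
};

std::string dispatch(const Event& ea)
{
    if (ea.multiTouch)
        return "touch";  // work with the touch-point data, ignore button
    if ((ea.type == PUSH || ea.type == DRAG || ea.type == RELEASE) &&
        ea.button == 1)
        return "mouse";  // plain single-pointer path
    return "other";
}
```

With this check in place, the mouse emulation never reaches the multi-touch
path, which matches the claim that it does not interfere with touch-only
handlers.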
I think all of that could have been achieved by having the code generating the
events decide to generate touch, or mouse, or both. (Pen events, for example,
don't overload mouse events.) But you can do this however you wish on trunk, as
my project is on 2.9.10 and so we have what we have. I just wanted to point out
what appears (to me at least) to be a confusing API and a simpler way to obtain
the same results.
But definitely thanks for your explanation, I appreciate it. Your explanations
make it possible for me to understand the API enough to complete my project.
-Paul
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org