Hi Paul,
On 03.02.11 19:00, Paul Martz wrote:
> On 2/3/2011 1:34 AM, Stephan Huber wrote:
>> (sorry for the previous truncated mail, hit the send button by mistake)
>>
>> please have a look at the thread on osg-submissions where I explain the
>> details and concepts of multi-touch to Robert:
>>
>> http://forum.openscenegraph.org/viewtopic.php?t=7137
>
> Sorry I missed that. (I don't generally search the submissions list for
> design/usage discussion, though I know that type of discussion often
> occurs there, and have been guilty of it myself!)
No problem, perhaps we should add it to the wiki :)
> This quote from you was the key piece of information I needed:
>
> In addition of the TouchData-structure the helper-methods in EventQueue
> simulate for the first touch-point a mouse-PUSH/DRAG/RELEASE with
> button
> 1. So, if you are only using one finger, you'll get the same events as
> when using a one-button-mouse (without the MOVE-events and with a
> populated TouchData-object attached to the GUIEventAdapter)
>
> So the apparent confusion over event type and phase is simply to
> facilitate automatic generation of mouse-like events. OK.
There's no specific touch event type in osgGA::EventType; the current
implementation uses PUSH for touchBegan, RELEASE for touchEnded, and DRAG
for touchMoved. The "emulation" only sets the mouse button to the left
button, to satisfy all the mouse-centric event handlers.
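That phase-to-event-type mapping can be sketched like this. The enums below are simplified stand-ins for the real osgGA types, purely as an illustration, not the actual OSG code:

```cpp
// Mock of the touch phases and of the osgGA event types (simplified
// names; the real osgGA enums differ -- illustration only).
enum TouchPhase { TOUCH_BEGAN, TOUCH_MOVED, TOUCH_ENDED };
enum EventType  { PUSH, DRAG, RELEASE };

// Touch phases are mapped onto the existing mouse event types, so
// mouse-centric handlers keep working for single-finger input.
EventType eventTypeForPhase(TouchPhase phase)
{
    switch (phase)
    {
        case TOUCH_BEGAN: return PUSH;
        case TOUCH_MOVED: return DRAG;
        case TOUCH_ENDED: return RELEASE;
    }
    return DRAG; // not reached
}
```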
> Question: Suppose we have this situation:
> - Touch point 1 starts (we call touchBegan)
> - Touch point 2 starts while point 1 moves (we call touchMoved, with
> different phases for each point)
I'd call touchBegan, as a new touch began (this is how the iOS
implementation works).
> - Touch point 1 ends but touch point 2 continues to move (we call
> touchEnded, again with different phases for the two points).
> - Touch point 2 continues to move. How do we send this event? Clearly
> we don't call touchMoved because this has nothing to do with mouse
> button 1 (if I understand you correctly).
As it is a touch-moved event, I'd call touchMoved (this is how the iOS
implementation works); the touchMoved event has only one touch point
attached.
> - Touch point 2 ends -- same question, how do we send that event?
As that particular touch ends, we should call touchEnded with one touch
point.
I know this all sounds a bit curious. Some background: there are two ways
to handle multiple touch points:
a) a stream of single events, where every event carries the state of one
touch point, or
b) one event with all available touch points encapsulated in a custom
data structure.
I used b) because it makes it easier to work with multi-touch events in
the event handlers. With solution a) you'd need special events to signal
the start and the end of a stream of multi-touch events, plus some logic,
storage, and state handling in your event handler: it would have to
capture all touch points for a specific frame, store them, and process
them once all touch events have arrived. I thought that was too much
hassle, so I implemented b).
So there's a high chance that you'll get a stream of events like this:
PUSH (1 tp) - DRAG (1 tp) - PUSH (2 tp) - DRAG (2 tp) - RELEASE (2 tp) -
DRAG (1 tp) - RELEASE (1 tp)
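As a sketch, option b) boils down to something like the following. The struct names here are made up for illustration; the real implementation lives in osgGA::GUIEventAdapter and its TouchData:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-ins for illustration only.
enum EventType { PUSH, DRAG, RELEASE };

struct TouchPoint
{
    unsigned int id;   // stable id of this finger across events
    float x, y;        // current position
};

// Option b): a single event carries all currently active touch points,
// so a handler never has to collect single-point events per frame.
struct TouchEvent
{
    EventType type;
    std::vector<TouchPoint> touches;
};

std::size_t touchCount(const TouchEvent& ev)
{
    return ev.touches.size();
}
```

With this shape, the stream above is just seven events whose touches vector grows from one point to two and back to one.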
> (The whole implicit support for mouse events seems not just confusing
> but unnecessary. Presumably apps that want to emulate the mouse would
> just add mouse events at the time they add multitouch events. Just my
> opinion.)
The only "emulation" is that the _button member is set to 1.
Perhaps we should make it optional, so developers can disable this
behavior. It's a nice helper for all the examples: you get single-touch
handling out of the box, and it doesn't interfere when you only do
multi-touch, as your event handler should just check
isMultiTouchEvent() and work with the osgGA::GUIEventAdapter::TouchData.
So the event-handler for multi-touch looks something like this:
switch(ea.getEventType())
{
    case osgGA::GUIEventAdapter::PUSH:
    case osgGA::GUIEventAdapter::DRAG:
    case osgGA::GUIEventAdapter::RELEASE:
        if (ea.isMultiTouchEvent())
        {
            osgGA::GUIEventAdapter::TouchData* data = ea.getTouchData();
            // check all the touch-points and do something
        }
        break;
    ...
}
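Inside that branch, walking the touch points might look roughly like this. A sketch only: I'm assuming TouchData exposes getNumTouchPoints() and get(i) returning a point with x/y, and the minimal mock below stands in for the real class so the snippet is self-contained:

```cpp
#include <vector>

// Minimal mock of osgGA::GUIEventAdapter::TouchData; the accessor
// names (getNumTouchPoints(), get()) are assumptions for illustration.
struct TouchData
{
    struct TouchPoint { unsigned int id; float x, y; };

    unsigned int getNumTouchPoints() const
        { return (unsigned int)_points.size(); }
    const TouchPoint& get(unsigned int i) const { return _points[i]; }

    std::vector<TouchPoint> _points;
};

// Example processing step: the centroid of all touch points,
// e.g. as the anchor of a two-finger pan gesture.
void centroid(const TouchData& data, float& cx, float& cy)
{
    cx = cy = 0.0f;
    unsigned int n = data.getNumTouchPoints();
    for (unsigned int i = 0; i < n; ++i)
    {
        cx += data.get(i).x;
        cy += data.get(i).y;
    }
    if (n > 0) { cx /= n; cy /= n; }
}
```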
Please note that this is a first version of the multi-touch handling, so
we have something to talk about and can find a good solution that suits
other multi-touch implementations, too. I used the event handling on iOS
devices as a guideline.
cheers,
Stephan
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org