Re: [osg-users] Generating multitouch events

2011-12-29 Thread Stephan Huber
Hi Len,

On 14.12.11 21:39, Len White wrote:
 Just to make sure I understand the structure of the current osgGA multi-touch 
 support...
 If I have a touch system that collects touch events in a callback, in my case 
 TUIO events, I will likely need to use a timer to store up a bunch of events, 
 bundle them into a single GUIEventAdapter, and then send them into the 
 EventQueue, correct?
 In my case I have a home-made touch surface that detects finger presses and 
 creates a listener thread for individual TUIO events.
 So far I've written a conversion class that can handle single touches fine, 
 but won't work for multiple touches.

Yes, you are correct. This is how multitouch works on OS X and iOS, and it
makes things easier for code working with osgGA::GUIEventAdapter, as all
touch events for one frame are stored in a single event.

HTH,
Stephan
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Generating multitouch events

2011-12-29 Thread Stephan Huber
Hi Len,

here's some pseudo-code which demonstrates the population of an
osgGA::GUIEventAdapter:




osg::ref_ptr<osgGA::GUIEventAdapter> osg_event(NULL);

for(unsigned int i = 0; i < num_touch_points; i++)
{
    // to differentiate different touches over time
    unsigned int touch_id = touch_id_for_touch_i;

    // in what phase is this touch-point
    // (began, moved, stationary, ended)
    osgGA::GUIEventAdapter::TouchPhase touch_phase = ...;

    // the origin lies in the lower left corner
    osg::Vec2 pixelPos = touch_pos_of_touch_i;

    if (!osg_event)
    {
        // fire touchBegan/touchMoved/touchEnded
        osg_event = getEventQueue()->touchBegan(
            touch_id,
            touch_phase,
            pixelPos.x(),
            pixelPos.y()
        );
    }
    else
    {
        osg_event->addTouchPoint(
            touch_id,
            touch_phase,
            pixelPos.x(),
            pixelPos.y()
        );
    }
}

So in your TUIO code you'll need to store and update a mapping of all
your touch-points, and create the GUIEventAdapter from that.
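A minimal sketch of that bookkeeping, using only the standard library (all names here are hypothetical, and the enum just mirrors the idea of osgGA::GUIEventAdapter::TouchPhase; the osgGA calls are left out):

```cpp
#include <map>
#include <set>

// Hypothetical phase enum, standing in for osgGA::GUIEventAdapter::TouchPhase.
enum TouchPhase { TOUCH_BEGAN, TOUCH_MOVED, TOUCH_ENDED };

// Given the touch ids seen in the previous update and the ids reported by
// TUIO right now, decide which phase each touch is in for this frame.
std::map<unsigned int, TouchPhase>
derivePhases(const std::set<unsigned int>& previousIds,
             const std::set<unsigned int>& currentIds)
{
    std::map<unsigned int, TouchPhase> phases;

    for (std::set<unsigned int>::const_iterator it = currentIds.begin();
         it != currentIds.end(); ++it)
    {
        // new id -> the touch began; known id -> it moved (or is stationary)
        phases[*it] = previousIds.count(*it) ? TOUCH_MOVED : TOUCH_BEGAN;
    }
    for (std::set<unsigned int>::const_iterator it = previousIds.begin();
         it != previousIds.end(); ++it)
    {
        // id vanished since the last update -> report the touch as ended once
        if (!currentIds.count(*it))
            phases[*it] = TOUCH_ENDED;
    }
    return phases;
}
```

With each incoming TUIO bundle you would run this, then loop over the resulting map exactly as in the pseudo-code above, calling touchBegan for the first point and addTouchPoint for the rest.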

The API is currently still under development, so if you have some ideas
or modifications we should consider, please step forward :)

cheers,

Stephan



On 29.12.11 12:38, Stephan Huber wrote:
 Hi Len,
 
 On 14.12.11 21:39, Len White wrote:
 Just to make sure I understand the structure of the current osgGA 
 multi-touch support...
 If I have a touch system that collects touch events in a callback, in my 
 case TUIO events, I will likely need to use a timer to store up a bunch of 
 events, bundle them into a single GUIEventAdapter, and then send them into 
 the EventQueue, correct?
 In my case I have a home-made touch surface that detects finger presses and 
 creates a listener thread for individual TUIO events.
 So far I've written a conversion class that can handle single touches fine, 
 but won't work for multiple touches.
 
 Yes, you are correct. This is how multitouch works on OS X and iOS, and it
 makes things easier for code working with osgGA::GUIEventAdapter, as all
 touch events for one frame are stored in a single event.
 
 HTH,
 Stephan



Re: [osg-users] Generating multitouch events

2011-12-27 Thread Len White
Hi.
Just to make sure I understand the structure of the current osgGA multi-touch 
support...
If I have a touch system that collects touch events in a callback, in my case 
TUIO events, I will likely need to use a timer to store up a bunch of events, 
bundle them into a single GUIEventAdapter, and then send them into the 
EventQueue, correct?
In my case I have a home-made touch surface that detects finger presses and 
creates a listener thread for individual TUIO events.
So far I've written a conversion class that can handle single touches fine, but 
won't work for multiple touches.

Thanks for the clarification.

-Len

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=44388#44388







Re: [osg-users] Generating multitouch events

2011-02-08 Thread Serge Lages
Hi,

Thanks for your replies, it confirms what I was thinking... The iOS way of
sending events seems to be an exception: all the touch technologies I use
(mostly MPX, Windows 7 and TUIO) send them separately. So I think it would
be useful to add to EventQueue a way to track the events and put them all
together into the GUIEventAdapter. What do you think?

Cheers,

On Mon, Feb 7, 2011 at 8:45 PM, Stephan Huber ratzf...@digitalmind.de wrote:

 Hi Glenn,

 On 07.02.11 12:06, Serge Lages wrote:
  I am currently working on getting multi-touch working on Linux with MPX
  (XInput2), and I would like to use this new system for my events. But
 with
  XInput (or also with Windows 7), I am receiving all the events
 separately,
  so what's the best approach to feed the
  touchesBegan/touchesMoved/touchesEnded methods ? Will I need to store
 each
  input state internally into the GraphicWindow class to set each time all
 the
  touch states ?

 You'll have to store your touch-points and submit them all together as
 one event to the event queue; here's some pseudo-code:

 osg::ref_ptr<osgGA::GUIEventAdapter> osg_event(NULL);
 for(int i = 0; i < numTouches; i++)
 {
     // get touch i and corresponding phase, x and y
     ...
     // feed it to the osg_event
     if (!osg_event) {
         osg_event = _win->getEventQueue()->touchBegan(i, phase, x, y);
     } else {
         osg_event->addTouchPoint(i, phase, x, y);
     }
 }

 As Paul noticed in one of his recent mails, the design of the current
 implementation is not the easiest and cleanest, but I was happy to have
 something working on my end. So if you have any improvements to the
 design/code, please share them with us, so we get a robust and clean
 multi-touch implementation working consistently on different platforms.

 cheers,
 Stephan




-- 
Serge Lages
http://www.tharsis-software.com


Re: [osg-users] Generating multitouch events

2011-02-08 Thread Stephan Maximilian Huber
Hi Serge,

On 08.02.11 09:12, Serge Lages wrote:
 Thanks for your replies, it confirms what I was thinking... The iOS way of
 sending events seems to be an exception: all the touch technologies I use
 (mostly MPX, Windows 7 and TUIO) send them separately. So I think it would
 be useful to add to EventQueue a way to track the events and put them all
 together into the GUIEventAdapter. What do you think?

Sounds reasonable :) Perhaps it's a good idea to enhance/refactor the
TouchData class, so you can use it as storage for all your
touch-points and add a cloned copy via a GUIEventAdapter to the
event queue... just an idea.



cheers,
Stephan



Re: [osg-users] Generating multitouch events

2011-02-07 Thread Serge Lages
Hi all,

I am currently working on getting multi-touch working on Linux with MPX
(XInput2), and I would like to use this new system for my events. But with
XInput (or also with Windows 7), I am receiving all the events separately,
so what's the best approach to feed the
touchesBegan/touchesMoved/touchesEnded methods ? Will I need to store each
input state internally into the GraphicWindow class to set each time all the
touch states ?

Cheers,

-- 
Serge Lages
http://www.tharsis-software.com


Re: [osg-users] Generating multitouch events

2011-02-07 Thread Paul Martz

On 2/7/2011 4:06 AM, Serge Lages wrote:

Hi all,

I am currently working on getting multi-touch working on Linux with MPX
(XInput2), and I would like to use this new system for my events. But with
XInput (or also with Windows 7), I am receiving all the events separately, so
what's the best approach to feed the touchesBegan/touchesMoved/touchesEnded
methods ? Will I need to store each input state internally into the
GraphicWindow class to set each time all the touch states ?


Hi Serge -- I was never able to figure out how to use touchBegan / touchMoved /
touchEnded. From the EventQueue.cpp source, each of these functions creates a
GUIEventAdapter with just a single TouchPoint, so they can't be used if you want
to send an event containing multiple TouchPoints (as far as I can tell).


Instead, I created my own GUIEventAdapter, added each TouchPoint that my code 
detected, and called EventQueue::addEvent().


But, more to your question...

My project uses a Kinect to detect hands (more generally, interactors). My 
code correlates the interactors detected in the current frame with the 
interactors detected in the previous frame (by doing distance computations). So, 
much like you, I needed to store and track the interactors in my own internal 
data structure. Then I created a list of TouchPoints with the appropriate 
phases, and added them to a single GUIEventAdapter as described above, using 
EventQueue::addEvent rather than the multitouch convenience routines.
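The frame-to-frame correlation described above can be sketched without any OSG types at all. The following is a hypothetical greedy nearest-neighbour match (assumed names and threshold, no osgGA calls) that carries each interactor's id from the previous frame to the closest point in the current frame; ids that survive would become TOUCH_MOVED points, unmatched new points TOUCH_BEGAN, and vanished ids TOUCH_ENDED:

```cpp
#include <cmath>
#include <map>
#include <utility>
#include <vector>

typedef std::pair<float, float> Point;                // (x, y) in pixels
typedef std::map<unsigned int, Point> TrackedPoints;  // id -> position

// Match every current-frame point to the nearest still-unclaimed
// previous-frame id (greedy, one id per point, within maxDist pixels);
// unmatched points get fresh ids from nextId.
TrackedPoints correlate(const TrackedPoints& previous,
                        const std::vector<Point>& current,
                        unsigned int& nextId,
                        float maxDist = 50.0f)
{
    TrackedPoints result;
    TrackedPoints pool(previous);  // ids still available for matching

    for (size_t i = 0; i < current.size(); ++i)
    {
        TrackedPoints::iterator best = pool.end();
        float bestDist = maxDist;
        for (TrackedPoints::iterator it = pool.begin(); it != pool.end(); ++it)
        {
            float dx = it->second.first  - current[i].first;
            float dy = it->second.second - current[i].second;
            float d  = std::sqrt(dx * dx + dy * dy);
            if (d < bestDist) { bestDist = d; best = it; }
        }
        if (best != pool.end())
        {
            result[best->first] = current[i];  // old id survives: "moved"
            pool.erase(best);
        }
        else
        {
            result[nextId++] = current[i];     // no match: a new touch "began"
        }
    }
    // anything left unclaimed in `pool` vanished this frame: "ended"
    return result;
}
```

From the result you would build one TouchPoint per id with the appropriate phase and add them all to a single GUIEventAdapter, exactly as Paul describes.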


Hope that helps,
   -Paul


Re: [osg-users] Generating multitouch events

2011-02-07 Thread David Glenn
Greetings Paul!

Well, if you only have to deal with three touch points with Kinect, that should 
be much easier than the Smart Board. Last year at NAB, I talked to the designer, 
and he told me that you have to keep track of eight touch points at any given 
time - at least as far as I can remember!

... 
D Glenn


D Glenn (a.k.a David Glenn) - Moving Heaven and Earth!

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=36437#36437







Re: [osg-users] Generating multitouch events

2011-02-07 Thread Stephan Huber
Hi Glenn,

On 07.02.11 12:06, Serge Lages wrote:
 I am currently working on getting multi-touch working on Linux with MPX
 (XInput2), and I would like to use this new system for my events. But with
 XInput (or also with Windows 7), I am receiving all the events separately,
 so what's the best approach to feed the
 touchesBegan/touchesMoved/touchesEnded methods ? Will I need to store each
 input state internally into the GraphicWindow class to set each time all the
 touch states ?

You'll have to store your touch-points and submit them all together as
one event to the event queue; here's some pseudo-code:

osg::ref_ptr<osgGA::GUIEventAdapter> osg_event(NULL);
for(int i = 0; i < numTouches; i++)
{
    // get touch i and corresponding phase, x and y
    ...
    // feed it to the osg_event
    if (!osg_event) {
        osg_event = _win->getEventQueue()->touchBegan(i, phase, x, y);
    } else {
        osg_event->addTouchPoint(i, phase, x, y);
    }
}

As Paul noticed in one of his recent mails, the design of the current
implementation is not the easiest and cleanest, but I was happy to have
something working on my end. So if you have any improvements to the
design/code, please share them with us, so we get a robust and clean
multi-touch implementation working consistently on different platforms.

cheers,
Stephan


Re: [osg-users] Generating multitouch events

2011-02-03 Thread Stephan Huber
Hi Paul,

please have a look at the thread on osg-submissions where I explain the
details and concepts of multi-touch to Robert:

On 03.02.11 02:56, Paul Martz wrote:
 I'm working on a project that needs to generate multitouch events, and I
 have a question about the current (2.9.10) implementation.
 
 There are three main EventQueue methods for adding three different
 multitouch events:
   touchBegan()
   touchMoved()
   touchEnded()
 And their purpose is quite clear from their names.
 
 However, all three of them take a TouchPhase parameter:
 enum TouchPhase {
 TOUCH_UNKNOWN,
 TOUCH_BEGAN,
 TOUCH_MOVED,
 TOUCH_STATIONERY,
 TOUCH_ENDED
 };
 
 It's not clear to me what it would mean if I were to, for example, call
 touchBegan, and pass in TOUCH_ENDED for the phase. This just doesn't
 make sense to specify a phase that contradicts with the type of event
 I'm adding.
 
 I must be misunderstanding the purpose of this interface. If you could
 provide some usage insight I would appreciate it. Thanks!

Ok, I'll try my best: an osg-event encapsulates all touch-points in one
event, so an osg event handler gets all touch-points via one event. From
the system side (iOS) I get one event with all touch-points (that
might differ for other multi-touch implementations).

Here's a real life example:

one finger touch started -> one osg-event with touchBegan(TOUCH_BEGAN)
one finger is moving -> one osg-event with touchMoved(TOUCH_MOVED)
one finger released -> one osg-event with touchEnded(TOUCH_ENDED)

Now, to make it more complicated and to illustrate the usage better:

one finger touch started -> one osg-event with touchBegan(TOUCH_BEGAN)
one finger is moving -> one osg-event with touchMoved(TOUCH_MOVED)

a second touch is registered, while the other touch-point is still valid:
a new osg-event is created with touchBegan






Re: [osg-users] Generating multitouch events

2011-02-03 Thread Stephan Huber
Hi Paul,

(sorry for the previous truncated mail, hit the send button by mistake)

please have a look at the thread on osg-submissions where I explain the
details and concepts of multi-touch to Robert:

http://forum.openscenegraph.org/viewtopic.php?t=7137

I hope this helps for getting started. If there are any questions left,
I'll try my best to answer them.

cheers,
Stephan

On 03.02.11 02:56, Paul Martz wrote:
 I'm working on a project that needs to generate multitouch events, and I
 have a question about the current (2.9.10) implementation.
 
 There are three main EventQueue methods for adding three different
 multitouch events:
   touchBegan()
   touchMoved()
   touchEnded()
 And their purpose is quite clear from their names.
 
 However, all three of them take a TouchPhase parameter:
 enum TouchPhase {
 TOUCH_UNKNOWN,
 TOUCH_BEGAN,
 TOUCH_MOVED,
 TOUCH_STATIONERY,
 TOUCH_ENDED
 };
 
 It's not clear to me what it would mean if I were to, for example, call
 touchBegan, and pass in TOUCH_ENDED for the phase. This just doesn't
 make sense to specify a phase that contradicts with the type of event
 I'm adding.
 
 I must be misunderstanding the purpose of this interface. If you could
 provide some usage insight I would appreciate it. Thanks!





Re: [osg-users] Generating multitouch events

2011-02-03 Thread Paul Martz

On 2/3/2011 1:34 AM, Stephan Huber wrote:

(sorry for the previous truncated mail, hit the send button by mistake)

please have a look at the thread on osg-submissions where I explain the
details and concepts of multi-touch to Robert:

http://forum.openscenegraph.org/viewtopic.php?t=7137


Sorry I missed that. (I don't generally search the submissions list for 
design/usage discussion, though I know that type of discussion often occurs 
there, and have been guilty of it myself!)


This quote from you was the key piece of information I needed:

In addition to the TouchData structure, the helper methods in EventQueue
simulate for the first touch-point a mouse PUSH/DRAG/RELEASE with button
1. So, if you are only using one finger, you'll get the same events as
when using a one-button mouse (without the MOVE events and with a
populated TouchData object attached to the GUIEventAdapter)

So the apparent confusion over event type and phase is simply to facilitate 
automatic generation of mouse-like events. OK.


Question: Suppose we have this situation:
 - Touch point 1 starts (we call touchBegan)
 - Touch point 2 starts while point 1 moves (we call touchMoved, with different 
phases for each point)
 - Touch point 1 ends but touch point 2 continues to move (we call touchEnded, 
again with different phases for the two points).
 - Touch point 2 continues to move. How do we send this event? Clearly we don't 
call touchMoved because this has nothing to do with mouse button 1 (if I 
understand you correctly).

 - Touch point 2 ends -- same question, how do we send that event?

(The whole implicit support for mouse events seems not just confusing but 
unnecessary. Presumably apps that want to emulate the mouse would just add mouse 
events at the time they add multitouch events. Just my opinion.)


Thanks,
   -Paul


Re: [osg-users] Generating multitouch events

2011-02-03 Thread Stephan Huber
Hi Paul,

On 03.02.11 19:00, Paul Martz wrote:
 On 2/3/2011 1:34 AM, Stephan Huber wrote:
 (sorry for the previous truncated mail, hit the send button by mistake)

 please have a look at the thread on osg-submissions where I explain the
 details and concepts of multi-touch to Robert:

 http://forum.openscenegraph.org/viewtopic.php?t=7137
 
 Sorry I missed that. (I don't generally search the submissions list for
 design/usage discussion, though I know that type of discussion often
 occurs there, and have been guilty of it myself!)

No problem, perhaps we should add it to the wiki :)

 This quote from you was the key piece of information I needed:
 
 In addition of the TouchData-structure the helper-methods in EventQueue
 simulate for the first touch-point a mouse-PUSH/DRAG/RELEASE with
 button
 1. So, if you are only using one finger, you'll get the same events as
 when using a one-button-mouse (without the MOVE-events and with a
 populated TouchData-object attached to the GUIEventAdapter)
 
 So the apparent confusion over event type and phase is simply to
 facilitate automatic generation of mouse-like events. OK.

There's no specific touch event type in osgGA::EventType; the current
implementation uses PUSH for touchBegan, RELEASE for touchEnded and DRAG
for touchMoved. The emulation only sets the mouse button to the left
button to satisfy all the mouse-centric event-handlers.

 Question: Suppose we have this situation:
  - Touch point 1 starts (we call touchBegan)
  - Touch point 2 starts while point 1 moves (we call touchMoved, with
 different phases for each point)

I'd call touchBegan, as a new touch began (and this is how the
iOS implementation works).

  - Touch point 1 ends but touch point 2 continues to move (we call
 touchEnded, again with different phases for the two points).
  - Touch point 2 continues to move. How do we send this event? Clearly
 we don't call touchMoved because this has nothing to do with mouse
 button 1 (if I understand you correctly).

As it is a touch-moved event I'd call touchMoved (this is how the
iOS implementation works) -- the touch-moved event has only one touch-point
attached.

  - Touch point 2 ends -- same question, how do we send that event?

As the particular touch ends, we should call touchEnded with one
touch-point.

I know this all sounds a bit curious. Some background: there are two ways
to handle multiple touch points:
a) a stream of single events, where every event carries the state of one
touch-point.
b) one event with all available touch-points encapsulated in a custom
data-structure.

I used b), because it makes it easier to work with multi-touch events in
the event-handlers. With solution a) you'd need special events to signal
the start and the end of a stream of multi-touch events, and you'd need
some logic, storage and state-handling in your event-handler: it has to
capture all touch-points for a specific frame, store them and process
them once all touch-events have arrived. I thought that this was too much
hassle, and implemented b).

So there's a high chance that you'll get a stream of events like this:

PUSH (1 tp) -> DRAG (1 tp) -> PUSH (2 tp) -> DRAG (2 tp) -> RELEASE (2 tp) ->
DRAG (1 tp) -> RELEASE (1 tp)


 (The whole implicit support for mouse events seems not just confusing
 but unnecessary. Presumably apps that want to emulate the mouse would
 just add mouse events at the time they add multitouch events. Just my
 opinion.)

The only emulation is that the _button member is set to 1.

Perhaps we should make it optional, so developers can disable this
behavior. It's a nice helper for all the examples (you get single touch
out of the box), and it does not interfere when you do only multi-touch,
as your event-handler should check only for isMultiTouchEvent() and work
with the osgGA::GUIEventAdapter::TouchData.

So the event-handler for multi-touch looks something like this:

switch(ea.getEventType())
{
  case osgGA::GUIEventAdapter::PUSH:
  case osgGA::GUIEventAdapter::DRAG:
  case osgGA::GUIEventAdapter::RELEASE:
if (ea.isMultiTouchEvent())
{
  osgGA::GUIEventAdapter::TouchData* data = ea.getTouchData();
  // check all the touch-points and do something
}
break;
  ...
}

Please note that this is a first version of the multi-touch handling, so
we have something we can talk about, to find a good solution which suits
other multi-touch implementations, too. I used the event-handling on
iOS devices as a guideline.

cheers,

Stephan


Re: [osg-users] Generating multitouch events

2011-02-03 Thread Paul Martz

On 2/3/2011 12:31 PM, Stephan Huber wrote:

No problem, perhaps we should add it to the wiki :)


Having it in the wiki wouldn't hurt. It's here now, along with your link 
pointing to the osg-submissions discussion, so anyone scanning osg-users for 
multitouch will find it.


You might consider commenting the EventQueue header and add an explanation of 
why you would want to use which convenience routine, and explain how multitouch 
overrides mouse events. I usually look at the header files for guidance first, 
then osg-users archives.



Question: Suppose we have this situation:
  - Touch point 1 starts (we call touchBegan)
  - Touch point 2 starts while point 1 moves (we call touchMoved, with
different phases for each point)


I'd call touchBegan as a new touch began. (and this is how the
ios-implementation works)


OK. I guess it's still kind of ambiguous, to me at least, which EventQueue 
convenience method I would want to use. But I guess the bottom line is: What 
type of mouse event would I also want to associate with this touch event? (Is 
that correct?)



I know this all sounds a bit curious. Some background: there are two ways
to handle multiple touch points:
a) a stream of single events, where every event carries the state of one
touch-point.
b) one event with all available touch-points encapsulated in a custom
data-structure

I used b), because it makes it easier to work with multi-touch-events in
the event-handlers


I, too, would go with option b. But I would've done it using a USER event and a 
derived class, in a separate library, without changing core OSG, to allow 
backwards compatibility with older versions of OSG.


And I would not have overloaded the mouse events with the multitouch events. A 
mouse is a mouse and a touch is a touch, and the application generating the 
events can generate one, or the other, or both, so that would've been a more 
flexible way to do it, in my opinion. Tools, not rules as they say in X Windows.



(The whole implicit support for mouse events seems not just confusing
but unnecessary. Presumably apps that want to emulate the mouse would
just add mouse events at the time they add multitouch events. Just my
opinion.)


The only emulation is that the _button-member is set to 1.


...and the event type is set to the same type as you would expect for a mouse 
event, thus overloading PUSH, DRAG, and RELEASE.



Perhaps we should make it optional, so developers can disable this
behavior. It's a nice helper for all the examples, and you'll get single
touch out of the box, and it does not interfere when you do only
multi-touch, as your event-handler should check only for
isMultiTouchEvent() and work with the osgGA::GUIEventAdapter::TouchData.


I think all of that could have been achieved by having the code generating the 
events decide to generate touch, or mouse, or both. (Pen events, for example, 
don't overload mouse events.) But you can do this however you wish on trunk, as 
my project is on 2.9.10 and so we have what we have. I just wanted to point out 
what appears (to me at least) to be a confusing API and a simpler way to obtain 
the same results.


But definitely thanks for your explanation, I appreciate it. Your explanations 
make it possible for me to understand the API enough to complete my project.

   -Paul