Hi Michael,

Not yet, as far as I am aware, but the library was written with that use in mind, and some of the classes needed for it are already in place.
User interfaces are attached to a UIDrawingSurface, and the UIDrawingSurface is what actually draws the UserInterface. Currently there are two classes that can draw a UIDrawingSurface within OpenSG:

- UIForeground - an OpenSG Foreground that can be attached to a viewport. This is meant for single-window applications, so it is probably not useful for multi-display or distributed systems.
- UIRectangle - an OpenSG Core that renders the UI to a rectangle in the scene, so a UIRectangle can be used as the core of a node that is part of the scene graph (sketch 1 below gives a rough idea of the setup).
- There are three tutorials that show how to use UIRectangle: 20UIRectangle, 21ExampleInterface, and 49LookAndFeel.

There are limitations. The UIDrawingSurface receives mouse, keyboard, and window events from a WindowEventProducer, and the WindowEventProducers have different implementations depending on the window system: Win32, X11, or Carbon. Currently there is no WindowEventProducer for handling distributed applications.

The current plan for a WindowEventProducer that is useful in a distributed environment is to write a PassiveWindowEventProducer. This WindowEventProducer would not connect to the underlying window system to capture mouse, keyboard, and window events, as that does not make much sense when more than one window may be showing the scene. So to use the PassiveWindowEventProducer, application writers would need to define how they want to simulate mouse, keyboard, and window events in the distributed system (sketch 3 below shows roughly what feeding such events could look like).

Example simulation of input for a distributed system:

- Mouse - If the VE system uses a 6-DOF tracker that users can manipulate in their hands, then the mouse position can be simulated as the intersection of the ray cast from the tracker with the UIRectangle containing the UI (sketch 2 below). If the system has buttons on the tracker, these could be mapped to button1, button2, etc., and something similar could be done for the mouse scroll wheel. In addition, you may want to create a cursor in the scene that follows this intersection.
- Keyboard - Given that users would probably not bring a keyboard device with them into the environment, this may be difficult. In this case you may want to create a text-input UI, similar to cell phones without physical keyboards or to game consoles, where the mouse input is used to press buttons on the UI with keys associated with them.
- Update - The WindowEventProducer would also need to receive update events carrying the elapsed time since the last update, so that time-dependent behavior of the UI works correctly, e.g. double clicks and tooltip popup delays.

I plan on doing this at the lab I work at next month, and will add the PassiveWindowEventProducer to the repository soon. When it is done, I will make the example code available. In the meantime, please let me know how you feel about this approach, and about any features you would like for your own use.
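Sketch 1 - putting the UI into the scene with a UIRectangle. I am writing this from memory in the style of the 20UIRectangle tutorial, so treat the include paths and field names (Point, Width, Height, DrawingSurface) as assumptions and check the tutorial source for the real ones. TutorialDrawingSurface stands in for a UIDrawingSurface that was already set up with an InternalWindow, Graphics, and event producer, as the tutorials do.

// Sketch 1: UIRectangle as a scene-graph core (OpenSG 1.x style).
// Written from memory of the 20UIRectangle tutorial -- the include paths
// and field names below are assumptions, please check the tutorial source.
#include <OpenSG/OSGNode.h>
#include <OpenSG/UserInterface/OSGUIRectangle.h>

// TutorialDrawingSurface: a UIDrawingSurfacePtr set up as in the tutorials.
UIRectanglePtr UIRectCore = UIRectangle::create();
beginEditCP(UIRectCore, UIRectangle::PointFieldMask |
                        UIRectangle::WidthFieldMask |
                        UIRectangle::HeightFieldMask |
                        UIRectangle::DrawingSurfaceFieldMask);
    UIRectCore->setPoint(Pnt3f(-250.0f, -250.0f, 200.0f)); // lower-left corner in world space
    UIRectCore->setWidth(500.0f);
    UIRectCore->setHeight(500.0f);
    UIRectCore->setDrawingSurface(TutorialDrawingSurface);
endEditCP(UIRectCore, UIRectangle::PointFieldMask |
                      UIRectangle::WidthFieldMask |
                      UIRectangle::HeightFieldMask |
                      UIRectangle::DrawingSurfaceFieldMask);

NodePtr UIRectNode = Node::create();
beginEditCP(UIRectNode, Node::CoreFieldMask);
    UIRectNode->setCore(UIRectCore);
endEditCP(UIRectNode, Node::CoreFieldMask);
// UIRectNode can now be added as a child anywhere under the scene root.

The UIForeground route is the same idea, except the drawing surface is handed to a UIForeground that is added to the viewport's foregrounds instead of being used as a node core.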
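Sketch 2 - simulating the mouse position from a 6-DOF tracker by intersecting the tracker ray with the rectangle that carries the UI. This is plain vector math with no OpenSG types; the struct and function names are only illustrative.

// Sketch 2: tracker ray -> simulated mouse position on the UI rectangle.
// Self-contained vector math; all names here are illustrative.
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub  (Vec3 a, Vec3 b) { Vec3 r = {a.x-b.x, a.y-b.y, a.z-b.z}; return r; }
static float dot  (Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  cross(Vec3 a, Vec3 b) { Vec3 r = {a.y*b.z - a.z*b.y,
                                               a.z*b.x - a.x*b.z,
                                               a.x*b.y - a.y*b.x}; return r; }

// The rectangle is described by its lower-left corner, two unit edge
// directions, its world-space size, and the pixel size of the drawing surface.
struct UIRect
{
    Vec3  corner;      // lower-left corner in world space
    Vec3  uAxis;       // unit vector along the width
    Vec3  vAxis;       // unit vector along the height
    float width;       // world-space width
    float height;      // world-space height
    int   pixelWidth;  // drawing surface size in pixels
    int   pixelHeight;
};

// Returns true and fills (pixelX, pixelY) if the tracker ray hits the rectangle.
bool trackerRayToMouse(Vec3 rayOrigin, Vec3 rayDir, const UIRect &r,
                       int &pixelX, int &pixelY)
{
    Vec3  normal = cross(r.uAxis, r.vAxis);
    float denom  = dot(normal, rayDir);
    if (std::fabs(denom) < 1e-6f)
        return false;                          // ray is parallel to the UI plane

    float t = dot(normal, sub(r.corner, rayOrigin)) / denom;
    if (t < 0.0f)
        return false;                          // UI is behind the tracker

    Vec3 hit   = {rayOrigin.x + t*rayDir.x,
                  rayOrigin.y + t*rayDir.y,
                  rayOrigin.z + t*rayDir.z};
    Vec3 local = sub(hit, r.corner);

    float u = dot(local, r.uAxis);             // distance along the width
    float v = dot(local, r.vAxis);             // distance along the height
    if (u < 0.0f || u > r.width || v < 0.0f || v > r.height)
        return false;                          // missed the rectangle

    // UI coordinates usually have their origin at the top-left, so flip v.
    pixelX = static_cast<int>(u / r.width * r.pixelWidth);
    pixelY = static_cast<int>((1.0f - v / r.height) * r.pixelHeight);
    return true;
}

The resulting pixel position, plus tracker buttons mapped to button1, button2, and so on, is what the application would hand to the PassiveWindowEventProducer, as in sketch 3.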
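Sketch 3 - what feeding the simulated events and the per-frame update into a PassiveWindowEventProducer might look like from the application's frame loop. The class does not exist yet, so every type and method name here is hypothetical; the point is only that the application, not the window system, decides when mouse, button, and update events are produced.

// Sketch 3: hypothetical use of a PassiveWindowEventProducer from the
// application's frame loop.  None of these methods exist yet -- this is
// just the shape of the interface I have in mind.
void frameStep(PassiveWindowEventProducerPtr producer,
               const TrackerState &tracker,   // hypothetical tracker wrapper
               const UIRect &uiRect,          // from sketch 2
               float elapsedSeconds)
{
    // 1. Simulated mouse position from the tracker ray (sketch 2).
    int x = 0, y = 0;
    if (trackerRayToMouse(tracker.position(), tracker.direction(), uiRect, x, y))
    {
        producer->produceMouseMoved(x, y);

        // 2. Tracker buttons mapped to mouse buttons.
        if (tracker.buttonPressed(0))
            producer->produceMousePressed(/*button=*/1, x, y);
        if (tracker.buttonReleased(0))
            producer->produceMouseReleased(/*button=*/1, x, y);
    }

    // 3. Update event so time-dependent behavior (double clicks, tooltip
    //    delays, animations) keeps working without a real window system.
    producer->produceUpdate(elapsedSeconds);
}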
Thanks,
David Kabala

On Wed, Apr 28, 2010 at 1:39 AM, Michael Raab <michael-r...@gmx.de> wrote:
> Hi David, hi all,
>
> has someone used the UserInterface library included in OpenSGToolbox in
> some kind of virtual environment? David, does the library support this in
> general or are there some constraints we need to know?
>
> Best regards,
> Michael