> Date: Thu, 23 Sep 2010 03:06:23 +0000
> From: Campbell Barton <[email protected]>
>
> If you need to deal with event system internals, we have a low-level
> library called GHOST; you'd probably need to use this if you wanted to
> add a new input device or new event types. But this is really low level,
> and I don't think it's a good starting point unless your main goal is to
> extend the event system code.
>
> I'd suggest looking into existing modal operators; these are tools
> which take user input and handle their own events.
>
> Examples are the knife tool, viewport manipulation (pan/zoom/orbit),
> fly mode, border select, and the painting tools. These can be written
> in C or Python; the view manipulation operators are in view3d_edit.c.
>
> A Python view-editing template can be accessed from the text editor:
> Text -> Script Templates -> Operator Modal Draw.
> To run it, first run the script to define the operator, then use the
> spacebar operator search menu in the 3D view to execute it.
> See other view properties:
> http://www.blender.org/documentation/blender_python_api_2_54_0/bpy.types.RegionView3D.html
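The modal-operator pattern Campbell points to can be illustrated outside Blender as a small state machine: the operator is invoked once, then receives events one at a time and reports whether it is still running, finished, or cancelled. The sketch below is plain Python, not Blender API code; the event names mirror Blender's but the class itself is purely hypothetical.

```python
# Plain-Python sketch of the modal-operator pattern used by Blender's
# knife tool, fly mode, etc.: invoke() starts the interaction, then
# modal() is fed events until it returns FINISHED or CANCELLED.
# Event names mimic Blender's; the class is illustrative only.

RUNNING_MODAL, FINISHED, CANCELLED = "RUNNING_MODAL", "FINISHED", "CANCELLED"

class ModalMoveOperator:
    """Tracks a horizontal drag until it is confirmed or cancelled."""

    def invoke(self, start_x):
        self.start_x = start_x
        self.offset = 0
        return RUNNING_MODAL

    def modal(self, event_type, mouse_x=0):
        if event_type == "MOUSEMOVE":
            self.offset = mouse_x - self.start_x  # accumulate the drag
            return RUNNING_MODAL
        if event_type == "LEFTMOUSE":             # confirm the edit
            return FINISHED
        if event_type in ("RIGHTMOUSE", "ESC"):   # cancel, discard the drag
            self.offset = 0
            return CANCELLED
        return RUNNING_MODAL                      # ignore other events

op = ModalMoveOperator()
op.invoke(start_x=100)
op.modal("MOUSEMOVE", mouse_x=140)
result = op.modal("LEFTMOUSE")
print(result, op.offset)  # -> FINISHED 40
```

In Blender itself, the same shape appears in the "Operator Modal Draw" template mentioned above: `invoke()` registers a modal handler and `modal()` returns `{'RUNNING_MODAL'}`, `{'FINISHED'}`, or `{'CANCELLED'}`.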
Thanks for your advice, Campbell :) I think at this stage I will try to tackle it from Python scripting first (the templates are really helpful) and get references from the existing source code handling input devices, to find a way to communicate between Blender and the external system I'm trying to build.

> Date: Thu, 23 Sep 2010 14:47:48 +0400
> From: Sergey Kurdakov <[email protected]>
>
> Hi Anthony,
>
> > on creating a new interface that can
> > translate the real-life actions into 3D coordinates.
>
> You could be interested in this project -
> http://forge.lifl.fr/PIRVI/wiki/MTUtils/blenderTUIO - it adds 3D input
> from the screen; the code is on the site, so you might look at what
> they changed.
>
> Regards,
> Sergey

Wow... that's a nice reference. Thanks for sharing the link, Sergey.

Anthony

> On Wed, Sep 22, 2010 at 11:28 PM, Anthony LAU <[email protected]> wrote:
> > Hi!
> >
> > I'm currently doing academic research on creating a new interface
> > that can translate real-life actions into 3D coordinates. To test the
> > efficiency of such a system, it's best to try it with some existing
> > 3D programs.
> >
> > This makes Blender my first choice, as I can access the source code,
> > which makes it easier to customize the interface for the Blender 3D
> > environment.
> >
> > I've already downloaded the SVN and am able to compile it on either
> > 32-bit or 64-bit Windows platforms. I'm also trying to read the online
> > documents to look for clues on how to program such a control interface
> > in Blender.
> >
> > Nonetheless, I'm still a bit lost on where I should start, e.g. which
> > part of Blender deals with cursor control, handling mouse events, etc.
> >
> > Would anyone mind giving me some clues or guidelines on where I should
> > start looking into programming the input interface? Thanks.
> >
> > Anthony

_______________________________________________
Bf-committers mailing list
[email protected]
http://lists.blender.org/mailman/listinfo/bf-committers
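The core translation Anthony describes, turning a tracked 2D position into 3D coordinates, can be sketched with a simple pinhole-camera unprojection: a normalized screen position plus a depth estimate yields a camera-space point. This is a minimal illustration under an assumed symmetric-frustum model; the function name and parameters are hypothetical and not taken from blenderTUIO or Blender's source.

```python
import math

def screen_to_world(nx, ny, depth, fov_deg=60.0, aspect=16 / 9):
    """Map a normalized screen position (nx, ny in [-1, 1]) and a depth
    (distance along the view axis) to camera-space 3D coordinates,
    assuming a symmetric pinhole camera with vertical FOV fov_deg."""
    half_h = math.tan(math.radians(fov_deg) / 2.0)  # half view height at depth 1
    half_w = half_h * aspect                        # half view width at depth 1
    x = nx * half_w * depth
    y = ny * half_h * depth
    z = -depth  # camera looks down -Z, a common graphics convention
    return (x, y, z)

# A point at the screen centre always maps onto the view axis.
print(screen_to_world(0.0, 0.0, 5.0))  # -> (0.0, 0.0, -5.0)
```

Inside Blender, the equivalent step would go the other way too: the `RegionView3D` properties linked above expose the view matrices needed to move such points between camera space and world space.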
