Without many eyeballs on the code yet, it appears newview/llvoiceclient.h/cpp can be more or less shallow-copied to get a head start on this.
* Personally, I would do this copy and replace the Vivox-specific IO with the standard XML interpreter (if that is not already done).
* Even if llgestureclient.h/cpp seems like the logical new name for it, I think I would start off with llgenericxmlclient.h/cpp and derive from there.

Some viewers already have a socket interface to local apps; they may already have done some of the above and could easily adapt it to such an XML scheme.

In my viewer, I have the option of supporting an API (in libsnowglobe.so) that handles the dynamics of UTF-16 vs. UTF-8 across platforms, which would mean duplicate methods for both cases, platform-specific selection messes, and more, just to access the LL API. It would be easier to use the XML client interface and let the XML interpreter detect and translate among the wide variety of UTF formats, without worrying about word size or endianness.

I may have something to submit. =)

Philip Rosedale wrote:
> It would be fabulous if someone here wants to do a patch for
> Snowglobe that does the viewer side of this (see doc), which is to
> listen for local network packets and trigger gestures and set lookAt
> based on the contents of those packets.

_______________________________________________
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/SLDev
Please read the policies before posting to keep unmoderated posting privileges