*wave*

For those not on the IRC channel, my name is Tim Kimball (my SL primary name is Alan Kiesler, but I'm not on very much anymore).  I'm not a coder by trade, but as an RL Solaris admin I've worked a lot with csh shell scripting, and dabbled in some C.

I've helped the InnerLife Biofeedback project in the past, mostly by supplying land for their use while I was renting a quadrant in Tavarua.  In following this project, though, I think InnerLife has a chance to benefit from a portion of what's been learned here so far.

The current method of getting a biofeedback device to interact within SL is to have a master object open an XML-RPC channel, with a custom client app on the PC/Mac (one version was Java-based) sending to the opened channel at a specific rate.  The master object then takes each data packet sent, does transformations if needed, and creates a 'say' on some channel, which is interpreted by the required objects in-world.

This is the way most people did external comms in SL before HTTPRequest, though in InnerLife's case it's a realtime feed over a period of time - I've personally seen it run for up to about 20 minutes.  The big issue with this format is an unknown latency problem: people had different success rates in how fast they could send data without lagging out the external channel (the average was 4 msgs/second, which I believe was eventually adopted as the norm).  There are still issues at times with that data stream depending on the client's geographic region, IIRC, but not as bad.  The real stopper is the complexity of having to use XML-RPC; there were plans to eventually do group projects, but aggregating several streams is rather hard on either the sim server or some external site.
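
For reference, here's a minimal sketch in LSL of what that master object looks like.  The channel number and the pass-through handling of each packet are my own placeholders, not InnerLife's actual script:

// Minimal sketch of the XML-RPC master object.  The in-world channel
// number and the pass-through handling are placeholder assumptions.
integer LOCAL_CHANNEL = -12345;   // channel the in-world objects listen on

default
{
    state_entry()
    {
        // Ask the grid for an XML-RPC channel; its key arrives in the
        // remote_data event below and must be handed to the client app.
        llOpenRemoteDataChannel();
    }

    remote_data(integer type, key channel, key message_id, string sender,
                integer idata, string sdata)
    {
        if (type == REMOTE_DATA_CHANNEL)
        {
            llOwnerSay("XML-RPC channel ready: " + (string)channel);
        }
        else if (type == REMOTE_DATA_REQUEST)
        {
            // One biofeedback sample per request; transform here if
            // needed, then re-broadcast for the listening objects.
            llSay(LOCAL_CHANNEL, sdata);
            // Reply so the client app can send the next packet.
            llRemoteDataReply(channel, message_id, "ok", 0);
        }
    }
}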

Well, one aspect of the libsl project has come to mind that should help this and similar projects greatly.  Quoting from my post on their forum:

[InnerLife] could take advantage of this by having a small app that becomes a direct relay between a BFB application and the SL client.  This app would take the data packet to be sent and turn it into 'say' messages sent directly from the avatar to a specific channel (preferably in a configurable way).

This, in effect, does two things:

1) Gets around the XML-RPC latency/throttle problem.

2) Implements our own method of integrating devices other than keyboard and mouse, without waiting for LL to do it.

No one has yet determined whether there are (or will be) limits on how many 'say' commands can be sent; for the sake of keeping LL off our backs, it may be best to force the app to send no more than 5-6 a minute.
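
The nice part is that the in-world objects shouldn't care where the 'say' comes from.  A receiver along these lines (again a hypothetical sketch, with the same placeholder channel as above) would work identically whether the source is the XML-RPC master or the avatar-side relay:

// Sketch of a receiving object; it can't tell (and doesn't care)
// whether the say came from an XML-RPC master or the avatar relay.
integer LOCAL_CHANNEL = -12345;   // must match whatever the relay says on

default
{
    state_entry()
    {
        // Listen on the agreed channel, from any speaker.
        llListen(LOCAL_CHANNEL, "", NULL_KEY, "");
    }

    listen(integer channel, string name, key id, string message)
    {
        // Interpret the biofeedback sample however this object needs
        // to; as a stand-in, just echo it to the owner.
        llOwnerSay("BFB sample: " + message);
    }
}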

My post there references this list and the IRC channel, and I'll be checking in with the two main devs probably this coming Friday.

InnerLife site:  http://gaeacoop.org/cgi-bin/InnerLife/index.cgi

--TSK

--
Timothy Kimball aka Alan Kiesler
Semi-retired SL Resident