Hi!
So, let me see if I understand this correctly -- you've got objects in-world in SL that are the "recipients" of this BFB data, and you have BFB devices plugged into a home computer that act as input devices. You want to get data from the BFB device on your PC (or whatever) and eventually into an object written in LSL running on a sim.
The current way you are doing this (correct me if I'm wrong) is to use an LSL script that connects over HTTP to a "server" running on the user's PC; that server spits out data from the BFB box, and the script relays it via "say" to other scripts inside SL.
It sounds like this works, but you are having problems with latency (BFB to HTTP server to "master object" to "required object in-world" to some action in-world back to the user's SecondLife client, or something along those lines?) and also with the maximum transmission rate?
Further, it sounds like what you'd really like is to be able to somehow use that BFB box to directly create messages that your SL client would "say" -- kind of to replace the keyboard?
If all of that is correct, there are a few different ways you could go --
1. Write a program to take the output of your BFB, convert it into keystrokes, and send those keystrokes to SecondLife.exe.
2. Write some code to inject packets with "say" content into the output of the SecondLife client -- not ready yet, but Austin Jennings is working on some code to make that easy.
3. Write a new, completely separate client for SecondLife using libSL that communicates directly with your BFB and the sim -- this may or may not be appropriate. Do you need/want the full graphical interface of SecondLife at the same time? If so, you could still do it, but you'd have to create a second "dummy" account and have both logged in at the same time, on the same sim.
Have I correctly understood your goals here?

Ben
On Jul 23, 2006, at 6:30 PM, T. S. Kimball in SL wrote:

*wave*
For those not on the IRC channel, my name is Tim Kimball (SL primary name is Alan Kiesler, but I'm not on very much anymore). I'm not a coder by trade, but as a RL Solaris admin I've worked a lot with csh shell scripting and dabbled in some C.
I've helped the InnerLife Biofeedback project in the past, mostly by supplying land for their use while I was renting a quadrant in Tavarua. And in following this project, I think InnerLife could put a portion of what's been learned here so far to good use.
The current method of getting a biofeedback device to interact within SL is to have a master object open an XML-RPC channel and a client app on the PC/Mac (one version was Java-based) send to the opened channel at a specific rate. The master object then takes each data packet sent, does transformations if needed, and creates a 'say' on some channel, which is interpreted by the required objects in-world.
This is the normal way most people do external comms in SL (before HTTPRequest), though in InnerLife's case it's a realtime feed over a period of time -- I've personally seen it run for about 20 minutes. The big issue with this approach is an unknown latency problem; people had different success rates in regards to how fast you could send data without lagging out the external channel (the average was 4 msgs/second, which I believe was eventually adopted as the norm). There are still issues at times with that data stream depending on the client's geo region IIRC, but not as bad. The real stopper is the complexity of having to use XML-RPC (there were plans to eventually do group projects, but aggregating several streams is kinda hard on either the sim server or some external site).
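For anyone who hasn't seen this pattern before, the master-object side looks roughly like the sketch below. This is just an illustrative sketch, not the actual InnerLife script -- the channel number is a placeholder and the transform step is left out:

// Rough sketch of the XML-RPC "master object" described above.
// The key printed from remote_data has to be handed to the external
// client app; RELAY_CHANNEL is a placeholder chat channel.
integer RELAY_CHANNEL = 73737;

default
{
    state_entry()
    {
        // Ask the sim for an XML-RPC channel; the key arrives via remote_data.
        llOpenRemoteDataChannel();
    }

    remote_data(integer event_type, key channel, key message_id,
                string sender, integer idata, string sdata)
    {
        if (event_type == REMOTE_DATA_CHANNEL)
        {
            // Channel is ready; give this key to the external client app.
            llOwnerSay("XML-RPC channel key: " + (string)channel);
        }
        else if (event_type == REMOTE_DATA_REQUEST)
        {
            // One BFB data packet arrived; transform if needed, then relay.
            llSay(RELAY_CHANNEL, sdata);
            // Reply so the client app knows it can send the next packet.
            llRemoteDataReply(channel, message_id, "ok", 0);
        }
    }
}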
Well, one aspect of the libsl project has come to mind, which should help this and similar projects greatly. Quoting from my post on their forum:
[InnerLife] could take advantage of this by having a small app that acts as a direct relay between a BFB application and the SL client. This app would take the data packet to be sent and turn it into 'say' messages direct from the avatar on a specific channel (preferably in a configurable way).
This, in effect, does two things:
1) Gets around the XML-RPC latency/throttle problem.
2) Implements our own method of integrating devices other than keyboard and mouse, without waiting for LL to do it.
No one as yet has determined if there are (or will be) limits on how many 'say' commands can be sent; for the sake of keeping LL off our backs, it may be best to force the app to send no more than 5-6 a minute.
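On the in-world side, nothing much changes with this approach -- the required objects just listen on whatever channel the relay app is configured to chat on. A minimal receiver sketch (again, the channel number is a placeholder and the owner check is optional):

// Minimal in-world receiver for the relayed 'say' messages.
// RELAY_CHANNEL must match the channel the relay app chats on.
integer RELAY_CHANNEL = 73737;

default
{
    state_entry()
    {
        // Hear anything said on the relay channel, from any speaker.
        llListen(RELAY_CHANNEL, "", NULL_KEY, "");
    }

    listen(integer channel, string name, key id, string message)
    {
        // Optional: only accept chat from this object's owner (the avatar
        // whose client the relay app is driving).
        if (id != llGetOwner()) return;

        // 'message' carries one BFB reading; act on it here.
        llOwnerSay("BFB data: " + message);
    }
}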
My post there references this list and the IRC channel, and I'll be checking in with the two main devs probably this coming Friday.
InnerLife site: http://gaeacoop.org/cgi-bin/InnerLife/index.cgi
--TSK
--
Timothy Kimball aka Alan Kiesler
Semi-retired SL Resident