Interesting...

Indeed, this is something we wanted to do in AGISim, but didn't get to
yet...

Furthermore, this is the sort of thing that is irritatingly difficult to do
in Second Life, because of the way avatar movements work in SL... (through
the external API you don't get information on bone positions, only on which
animations are running at a given point in time).  But you could do it
by scraping the visual info that comes through to the client and doing vision
processing on it ... or else wait for the server to get open-sourced ... or
use an SL clone like HiPiHi, which reportedly will offer a Java API in early
'08 including information on skeleton position...
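Just to make the distinction concrete, here's a toy Java sketch contrasting the two levels of data. All class and method names here are invented for illustration; this does not reflect SL's actual external API or HiPiHi's promised one, just the shape of the difference.

```java
// Hypothetical sketch (invented API): animation-level data vs. joint-level
// data for an avatar. SL's external API gives you only the former; gesture
// learning really wants the latter.
import java.util.List;
import java.util.Map;

public class AvatarPoseSketch {

    // Animation-level view: all SL's external API tells you is which
    // named animations are currently running on the avatar.
    static List<String> runningAnimations() {
        return List.of("walk", "wave");
    }

    // Joint-level view: what a skeleton API might return instead --
    // bone name -> (x, y, z) position, faked here with placeholder values.
    static Map<String, double[]> bonePositions() {
        return Map.of(
            "leftHand",  new double[]{ 0.3, 1.2, 0.1},
            "rightHand", new double[]{-0.3, 1.4, 0.1});
    }

    public static void main(String[] args) {
        // With only coarse animation labels, an imitation learner can't
        // recover the actual trajectory of the limbs...
        System.out.println("animations: " + runningAnimations());
        // ...whereas per-bone coordinates are what you'd feed into
        // nonverbal-gesture learning of the sort in Joel's link.
        System.out.println("bones: " + bonePositions().keySet());
    }
}
```

The point being: with the first kind of data you can only learn *when* canned gestures fire, not the gestures themselves.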

-- Ben



On 10/25/07, Joel Pitt <[EMAIL PROTECTED]> wrote:
>
> Somewhat apropos to what Novababy was planning to do in AGISIM:
>
>
> http://www.pinktentacle.com/2007/10/android-acquires-nonverbal-communication-skills/
>
> -J
>
> _______________________________________________
> Agisim-general mailing list
> [EMAIL PROTECTED]
> https://lists.sourceforge.net/lists/listinfo/agisim-general
>

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
