On Feb 2, 2011, at 6:23 PM, CecilBergwin wrote:
> Currently as you are aware there are a few already used, such as
> Walk/Idle/Fly/Crouch, those are obviously in the *.Skeleton file for
(...)
> find no link from "Key_Press" to "Skeletal animation *.skeleton, a few
> people have said that the UUIDS are embedded for animations which I
> have search the Server Database and not found any relevant UUID's used
> for the given animations in the default avatar.
(...)
> desperately now want to add these animations for our "Users" so we can
> have Gestures/Dances etc.

What you are talking about there covers two separate things in LLUDP: 
animations associated with movement, and gestures.

Triggering gestures over LLUDP from Naali is currently simply not implemented 
at all. This is what the 0.x version number is for :) At least based on the 
Linden message template (in bin/data/message_template.msg in Naali repo), the 
corresponding messages from viewer -> sim to control gestures are 
ActivateGestures and DeactivateGestures. I don't know, and possibly no rex dev 
knows, what the Flags and GestureFlags there mean -- sometimes we read the 
opensim source or get help from opensim devs to figure out what those 
undocumented things are. The message structure is copy-pasted from the template 
to the SL wiki, but there are no usage docs there either: 
http://wiki.secondlife.com/wiki/ActivateGestures

If you figure out what data LL viewers are supposed to send in those packets, 
adding the sending is quite simple in Naali c++. The message id is already 
defined in RexProtocolMsgIDs.h as RexNetMsgActivateGestures. Add a 
SendActivateGestures method in ProtocolUtilities/WorldStream.h and .cpp. Put it 
in the public slots part of the .h so that Python and Javascript can also call 
it; that way you can make the UI and keybindings for triggering gestures using 
either of those if you prefer.
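Once the slot is exposed, the script side is tiny. Here's a hypothetical sketch of what calling it from js could look like -- the mock worldstream object and the slot's parameters are assumptions (the real signature depends on what the LL viewers actually send, which is the open question above; the message does carry AgentData plus per-gesture ItemID/AssetID/GestureFlags blocks):

```javascript
// Hypothetical sketch: a mock standing in for the real WorldStream slot.
// The parameter shape is an assumption, not the actual Naali API.
var sentPackets = [];
var worldstream = {
    SendActivateGestures: function (flags, gestures) {
        // In Naali this would build and send the RexNetMsgActivateGestures
        // packet; here we just record what would go on the wire.
        sentPackets.push({ msg: "ActivateGestures", flags: flags, gestures: gestures });
    }
};

// Script-side usage once the slot exists, e.g. from a keybinding handler:
worldstream.SendActivateGestures(0, [
    { itemId: "inventory-item-uuid", assetId: "gesture-asset-uuid", gestureFlags: 0 }
]);
```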

I guess the matching of gestures to animations is done on the server side. The 
handling of avatar animation playback commands from the server is implemented 
in Naali, in AvatarModule/Avatar/AvatarHandler.cpp HandleOSNE_AvatarAnimation. 
That AvatarModule is actually a part of RexLogicModule, but split into a 
separate submodule 'cause RexLogic grew so big. OSNE means OpenSim network 
event. That AvatarAnimation handler gets the Linden message as data and 
apparently finds the right anims based on UUIDs. The mapping of the Ogre 
animation names to the Linden UUIDs is done in the rex avatar xml: 
https://github.com/realXtend/naali/blob/develop/bin/data/default_avatar.xml in 
the lines like <animation name="Sit" id="1a5fe8ac-a804-8a5d-7cbd-56bd83184568" 
internal_name="SitOnObject" looped="0" fadein="0.4" fadeout="0.5" />
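In script terms that XML row boils down to a lookup from Linden animation UUID to the Ogre animation name plus playback parameters. A minimal sketch -- the Sit entry is the real one from default_avatar.xml, but the lookup helper is just an illustration of what the handler effectively does, not the actual Naali code:

```javascript
// UUID -> Ogre animation mapping, as defined in default_avatar.xml.
// The "Sit" entry below is the real one from that file.
var animations = {
    "1a5fe8ac-a804-8a5d-7cbd-56bd83184568": {
        name: "Sit", internalName: "SitOnObject",
        looped: false, fadein: 0.4, fadeout: 0.5
    }
};

// Roughly what HandleOSNE_AvatarAnimation does per UUID in the message:
function lookupAnim(uuid) {
    return animations[uuid] || null; // unknown UUIDs are simply ignored
}
```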

With Tundra, nothing about avatars or animations is hardcoded in the platform; 
there are no specific messages about them in the base protocol etc. Instead, 
there is a basic Javascript implementation of avatar functionality that you 
can use in any Tundra world. It contains both the client and server side parts, 
and some of the code is executed on both sides.

There the server side function to handle gestures is 
https://github.com/realXtend/naali/blob/tundra/bin/scenes/Avatar/simpleavatar.js#L286
 -- ~10 lines of code :) The client side mapping of a key to a gesture is 1 
line: inputmapper.RegisterMapping("Q", "Gesture(wave)", 1) at 
https://github.com/realXtend/naali/blob/tundra/bin/scenes/Avatar/simpleavatar.js#L476
 (yes, in the same file). That Gesture is an Entity-Action which is 
automatically sent over the net, and you can use any string to define your own.
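The mechanism is easy to mock up: an Entity-Action is just a named call with string parameters that the framework replicates over the net. A self-contained sketch of both ends -- the Entity object here is a stand-in, not the real Tundra API, and the handler body is simplified from what simpleavatar.js actually does:

```javascript
// Stand-in entity that dispatches Entity-Actions locally; in Tundra the same
// call is replicated to the server, where simpleavatar.js handles it.
function Entity() { this.handlers = {}; }
Entity.prototype.connectAction = function (name, fn) { this.handlers[name] = fn; };
Entity.prototype.exec = function (name /*, params... */) {
    var params = Array.prototype.slice.call(arguments, 1);
    if (this.handlers[name]) this.handlers[name].apply(null, params);
};

var played = [];
var avatar = new Entity();
// Server side: the Gesture handler (the real one would play the Ogre anim)
avatar.connectAction("Gesture", function (gestureName) {
    played.push(gestureName);
});

// Client side: what the one-line key mapping ends up triggering
avatar.exec("Gesture", "wave");
```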

So here are your options -- implement sending the commands with LLUDP, or 
evaluate whether Tundra could work for your application. Both ways support 
bringing in the content as Ogre .scenes, and the same ECs that you've tested 
for the radio stream playback etc. are available. Tundra doesn't currently have 
authentication, if you need that, but adding some simple auth there (perhaps 
even in e.g. py or js, if it's just a matter of adding a client connection 
handler) may be easier than doing the LLUDP biz in c++. And I guess someone 
needs to add those basics soon anyway, though many apps just use anonymous 
access (like web sites typically, and I suppose your internet radio too). So I 
figure it depends on what other OpenSim features you might need, like LSL or 
compatibility with Linden based viewers like Imprudence. Testing Tundra is 
easy 'cause it doesn't require databases or anything like that; you just run 
one executable, like normally with Naali.

Actually, it might be possible to use the same scripting level EC style 
animation control with Taiga as well. Entity-Actions are not currently 
implemented there, so the Tundra AV code would not work. But I think this hack 
with EC attribute sync would: add an EC_DynamicComponent called 'gesture', with 
a single string attribute that is the Ogre name of the animation to activate. 
Write a py or js script that listens to attribute changes and triggers the 
playback of the animation. The name of the previously played gesture would 
just sit there after the animation has played, but that's ok. You can see 
bin/pymodules/apitest/animsync.py for an example of how to listen to EC 
attribute changes in your own component and select the Ogre anim to use via 
AnimationController. That code uses 
animationcontroller.SetAnimationTimePosition 'cause it is for controlling the 
animation directly -- for triggering playback of gestures, use PlayAnim as 
documented in 
http://www.realxtend.org/doxygen/class_e_c___animation_controller.html . See 
door.py in the same place for an example of adding GUI buttons that change EC 
attribute values, but first you can test just by typing the gesture names in 
the EC editor.
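The hack above can be sketched end to end. Everything here is a stand-in for the real EC_DynamicComponent and AnimationController APIs, so treat it as a shape of the idea under those assumptions, not working Taiga code:

```javascript
// Sketch of the Taiga-era hack: a 'gesture' dynamic component whose single
// string attribute names the Ogre animation to trigger. All objects below
// are mocks standing in for the real engine APIs.
var playedAnims = [];
var animationController = {
    PlayAnim: function (name, fadein) { playedAnims.push(name); }
};

function GestureComponent() {
    this.attr = "";
    this.listeners = [];
}
GestureComponent.prototype.onAttributeChanged = function (fn) {
    this.listeners.push(fn);
};
GestureComponent.prototype.set = function (value) {
    this.attr = value; // synced attribute; staying stale after playback is ok
    this.listeners.forEach(function (fn) { fn(value); });
};

var gesture = new GestureComponent();
// The script listens for attribute changes and triggers playback:
gesture.onAttributeChanged(function (animName) {
    animationController.PlayAnim(animName, 0.25);
});

gesture.set("Wave"); // e.g. typed into the EC editor for testing
```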

So three options!

In any case some nice UI for triggering and controlling gestures would be cool, 
though I guess the basics: a) hotkeys b) menu c) /wave etc. typed to chat would 
take us a long way.
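Option c) is mostly a few lines of string handling in front of the Gesture action. A hypothetical sketch of the parsing step (the function name and the "any /word is a gesture" rule are my own illustration, not existing code):

```javascript
// Hypothetical: turn a chat line like "/wave" into a gesture name to pass
// to the Gesture Entity-Action, or null for ordinary chat text.
function chatLineToGesture(line) {
    var m = /^\/(\w+)\s*$/.exec(line);
    return m ? m[1] : null;
}
```

A chat handler could then call exec("Gesture", name) whenever this returns non-null and let the line through as normal chat otherwise.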

> Cecil

~Toni

-- 
http://groups.google.com/group/realxtend
http://www.realxtend.org
