Hi guys

I'm busy creating a variometer for FlightGear.

My instrument needs to be able to:
1. Display total energy (using some maths I haven't figured out yet)
2. Play sounds (audio cue)
3. Accept user input to its 2 knobs and 3 toggle switches.

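For what it's worth, the total-energy maths is fairly standard: specific total energy is E = g*h + v^2/2, so a TE vario reads (1/g)*dE/dt = dh/dt + (v/g)*dv/dt, i.e. climb rate plus an airspeed-compensation term. A minimal sketch of the formula (plain Python here just to show the arithmetic; the function and parameter names are mine, not FG properties):

```python
# Sketch of total-energy variometer maths (illustration only, not FG code).
# Specific total energy: E = g*h + v^2/2
# TE-compensated climb rate: dE/dt / g = dh/dt + (v/g) * dv/dt
G = 9.81  # gravitational acceleration, m/s^2

def te_vario(climb_rate_mps, airspeed_mps, accel_mps2):
    """Total-energy-compensated climb rate in m/s.

    climb_rate_mps -- raw vertical speed dh/dt
    airspeed_mps   -- true airspeed v
    accel_mps2     -- airspeed rate of change dv/dt
    """
    return climb_rate_mps + airspeed_mps * accel_mps2 / G
```

In FG I assume the Nasal script would read the vertical-speed and airspeed properties, compute this, and write the result back to a custom property for the needle animation to pick up.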
From what I've seen in FG, I would have to generate the total-energy property 
from a Nasal script, play the sounds from the aircraft XML file, and accept 
the user input through the panel hot-spot (actions) config file.

I have one short word to describe this affair: "Mess!"   :)

Most of the instrument has to be coded into the aircraft config files 
(although none of it is aircraft specific), and for every aircraft that 
I want to install the instrument in, I would have to duplicate the sound, 
hot-spot and Nasal code.

Is there not a better way of doing it?

Also, is there not a way to accept user input on a 3D instrument?
I don't see the logic behind specifying hot spots using 2D panel *pixel* 
locations for a 3D model which is placed using 3D coordinates.
Not to mention that as soon as I move my 3D instrument around in the cockpit, 
the input is "lost". Surely the input handling should be part and parcel of 
the instrument itself.
The way I see it, an instrument should be able to have its own set of 
animation, input and sound config files as well as Nasal scripts.
Then only a single include has to be done in an aircraft config file to load 
the instrument at the right location.
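To make the idea concrete, the kind of include I have in mind might look something like this (purely hypothetical syntax, nothing FG actually supports today as far as I know):

```xml
<!-- Hypothetical: pull in a self-contained instrument package,
     placed at a given position in the cockpit -->
<instrument include="Instruments/my-vario/vario.xml">
  <offsets>
    <x-m>0.25</x-m>
    <y-m>-0.10</y-m>
    <z-m>0.05</z-m>
  </offsets>
</instrument>
```

with the animation, sound, hot-spot and Nasal files all referenced from vario.xml itself, so the aircraft file never has to know about them.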


Flightgear-devel mailing list
