Hiya Folks,

LiveCode first-timer here. I've been following this great tool for a year or 
so, but only got started with it about a month ago.

I'm hoping someone here has experience that can help me with a critical 
feature I need to sort out, so I can move ahead a bit faster than the last 
couple of weeks have allowed.

As a get-started / performance-testing project, I've been attempting a simple 
implementation of the following two features on a simple user interface:

1. 6 x buttons with down states (PNG icons)

2. A mouseDown handler on each button that triggers a SHORT .WAV file with 
little or no latency and allows rapid 'tapping', so that the button can be 
used as a virtual percussion instrument, played along with an independently 
triggered, looping .MOV file.

LiveCode IDE: 4.5.1, Mac OS X 10.6.4
Total audio footprint: 445 KB of WAV files, average file size 70 KB

Sound simple?

Well, in theory it is. The challenge is the 'little or no latency' bit. Here 
are the different configurations I've implemented so far:

1. A buffered, offscreen/invisible player object housing each audio file as a 
.mov, with the 'start' command sent directly to the object from a mouseDown 
handler on the same card (sketched below). Result: slow, not snappy.

2. AudioClips (imported into the stack, not referenced) triggered directly 
from mouseDown handlers (sketched below). Result: less slow, but still not 
snappy.

3. Purchased and implemented the FranklinAudio external (recently released, 
based on OpenAL), with buffers for each sound mapped to a single source for 
playback and the "alSourcePlay" function triggered from the mouseDown handler. 
Result: as per (2) above, for 50 times the amount of code.

4. The FranklinAudio external with buffers for each sound mapped to multiple 
(spawning) sources for playback, with "alSourcePlay" triggered from the 
mouseDown handler; this delivers cascading, layered playback of the audio, 
which is a cool side effect. Result: as per (2) above, for 100 times the 
amount of code. (FranklinAudio actually looks really great in terms of 
extended audio capability outside of this scenario, and I may not have found 
the optimal implementation with this tool.)
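
To make the comparison concrete, here's roughly what the mouseDown side of (1) 
and (2) boils down to. Button, player and clip names are just placeholders, 
and the player in (1) is assumed to already have its filename set, 
alwaysBuffer turned on and visible turned off:

-- configuration 1: "kickPlayer" is a placeholder name for an
-- invisible, buffered player whose filename points at the .mov
on mouseDown
   -- rewind so rapid taps retrigger from the start
   set the currentTime of player "kickPlayer" to 0
   start player "kickPlayer"
end mouseDown

-- configuration 2: "kick.wav" is a placeholder name for an
-- audioClip imported into the stack
on mouseDown
   play audioClip "kick.wav"
end mouseDown

The FranklinAudio setup for (3) and (4) is far too long to paste here; the 
mouseDown side just ends up calling alSourcePlay on a source that was bound to 
the sound's buffer during setup.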

The key problem? All of these implementations deliver consistently 
less-than-desirable responsiveness from the button the user clicks to trigger 
the audio: playback just starts too slowly to cut it as a playable 
'instrument'. I would also be interested in how to improve this latency from a 
general UI perspective anyway (audio click sounds on buttons, etc.).

In contrast, I've been able to achieve much better (acceptable) performance of 
exactly the same basic functionality in:

1. Flash (not desirable given the distribution requirements and other features 
of the project, so no can do... except MAYBE inside revBrowser) :(

2. HTML5 + JS with the new <audio> and <video> tags (very impressive results 
in Safari on OS X and in Firefox, with particularly surprising performance on 
an old WinXP box running FF). Alas, web delivery is a harder option for this 
project.

3. iShell - a blast from the past, but it still performs very well on this 
particular function, on both Mac and Windows.

So I'm wondering whether the issue is in an area that is harder for me to 
test: the internal latency of mouseDown event handling in LiveCode and the 
LiveCode runtime.
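
One thing I can think of trying is some crude timing with the milliseconds, 
along these lines (audioClip variant again, with placeholder names); it only 
shows how long the play call itself takes to return, though, not the dispatch 
time before mouseDown fires or the audio hardware latency after it:

on mouseDown
   local tStart
   put the milliseconds into tStart
   play audioClip "kick.wav"  -- placeholder clip name
   -- append the time the play command took to return to a scratch
   -- field ("latencyLog" is just a placeholder name)
   put (the milliseconds - tStart) && "ms" & return after field "latencyLog"
end mouseDown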

Any thoughts / experience round the table?

Happy to share the prototype stacks etc... if you think relevant.

Warm regards,

Anthony.