Hi all,

I'm just trying to implement the time-lock feature for tracks (in the 
audiostreams branch); I want to have it for both audio and MIDI tracks, 
because they don't differ in that respect.

However, there's a problem with the Pos class:
Both Part and EventBase are derived from Pos. While this is perfectly fine for 
Part, it is a bit of a lie for EventBase: Pos offers tick/frame conversion 
functions (with caching) which rely on the Pos being an *absolute* position, 
but EventBases are not absolute; they are relative to their parent Part!
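To make the "lie" concrete, here is a minimal sketch. These are hypothetical, heavily simplified stand-ins (not the real MusE classes), with a constant toy tempo in place of the real tempomap:

```cpp
// Hypothetical, simplified stand-ins -- not the real MusE classes.
// A constant tempo of 2 frames per tick stands in for the real tempomap.
struct TempoMap {
    int tick2frame(int tick) const { return tick * 2; }
};
static TempoMap tempomap;

struct Pos {
    int _tick;
    explicit Pos(int t) : _tick(t) {}
    int tick() const { return _tick; }
    // Only correct if _tick is an *absolute* position:
    int frame() const { return tempomap.tick2frame(_tick); }
};

struct Part      : Pos { using Pos::Pos; };
struct EventBase : Pos { using Pos::Pos; };  // but its tick is *relative* to its Part!
```

With a Part at tick 1000 and an EventBase at tick 100 within it, ev.frame() answers for tick 100 rather than for the absolute tick 1100; only Pos(part.tick() + ev.tick()).frame() gives the real answer.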

I can see two ways out of this:
Either, on every process() run, we calculate each Event's absolute position on 
the fly (ev.pos()+part.pos()) and use its .tick() or .frame() function, 
depending on what we need (in the long run, I'd like to use FRAMES in the 
audio driver for both audio and MIDI, but that's not really relevant now).
This would, however, make the Pos class's tempomap result-caching mechanism 
obsolete: since we would call frames()/tick() only on temporary Pos objects, 
we would need to call tempomap.tick2frame() quite often, *on every process() 
run!*
pro/contra:
+ stays compatible with the file format
+ hopefully does not involve too many changes
- very frequent tempomap queries (speed problem?)
- possible accidental misuse of EventBase::frame()/tick() (they lie!)
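A rough sketch of what option 1 would look like in the process path. The types and the process() signature here are my own simplified stand-ins, not the real MusE API:

```cpp
#include <vector>

// Hypothetical sketch of option 1 -- simplified stand-in types,
// not the real MusE API. A constant toy tempo replaces the real tempomap.
struct TempoMap {
    int tick2frame(int tick) const { return tick * 2; }
};
static TempoMap tempomap;

struct Event { int relTick; };                 // position relative to the Part
struct Part  { int posTick; std::vector<Event> events; };

// On every process() run, compute each event's absolute frame on the fly.
// Every event costs a fresh tempomap query -- nothing can be cached,
// because the absolute Pos only exists as a temporary.
void process(const Part& part, std::vector<int>& outFrames) {
    outFrames.clear();
    for (const Event& ev : part.events) {
        int absTick = part.posTick + ev.relTick;           // ev.pos()+part.pos()
        outFrames.push_back(tempomap.tick2frame(absTick)); // uncached lookup
    }
}
```

The point of the sketch is the inner loop: one tempomap lookup per event, per process() run, which is exactly the caching loss listed above.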

Or we could store the Events with their absolute positions. (This is possible 
because clone parts no longer share the same event list, but each have their 
own instead.)
This forces us, however, to move *every single Event* in a Part whenever we 
move that Part. Note that AudioStreams, or at least WaveEventBase, must be 
aware of the event's absolute position anyway (for stretching and seeking, 
both must query the tempomap). Currently this is done by keeping a parent-part 
pointer, which is not a problem at all (just a bit unclean).

+ very easy interpretation of an Event's position, no need for a parent part
+ tempomap lookups can still be cached (although the cache should be 
recalculated on changes, and not "when necessary", as the latter violates 
realtime constraints)
- file format and copy/paste issues: need to convert from/to relative times 
here
- possibly huge amounts of changes during part movement (move every event); 
might violate realtime constraints, but only while editing, on purpose
And frankly, moving an AudioStream while playing violates realtime constraints 
anyway, though buffered by the AudioPrefetch; but that causes other issues... 
Just don't do that.
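The cost of a part move under option 2 could look roughly like this (again with hypothetical stand-in types, not the real MusE classes):

```cpp
#include <vector>

// Hypothetical sketch of option 2 (absolute event positions),
// with simplified stand-in types -- not the real MusE classes.
struct Event { int absTick; };                 // stored as an *absolute* position
struct Part  { int posTick; std::vector<Event> events; };

// Moving the part must shift every single event it contains, so a move
// costs O(number of events). That is the price paid for trivially
// interpretable event positions and cacheable tempomap lookups.
void movePart(Part& part, int newPosTick) {
    int delta = newPosTick - part.posTick;
    for (Event& ev : part.events)
        ev.absTick += delta;
    part.posTick = newPosTick;
}
```

Since this only runs on explicit edits (not in the audio process loop), the O(n) cost seems acceptable, as argued above.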

I find the second solution cleaner, though possibly more work-intensive. But I 
hope the cleanness pays off one day...


Please discuss this. Which solution would you prefer? Tim, didn't you already 
think about time locking some time ago?

cheers,
flo