"Tim E. Real" <[email protected]> schrieb:
>> 
>> what about sub-ticks? floating-point ticks?
>
>A tick is the minimum unit. 

currently, but we could change that.
then, for audio parts, begin would be stored as an unsigned tick plus a float 
subtick_offset (which is in [0; 1))

the beginning sample would then be calculated as
 tick2frame(tick) + 1/bpm(tick) * const * subtick_offset

where const is a constant that converts bpm into ticks per second, and 
(milli)seconds into audio samples

optionally, of course.
the user may switch whether he wants everything in samples or everything in 
ticks (plus subticks for audio).

this allows for both: accuracy, and staying in sync when the tempo map 
changes.
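as a minimal sketch of the conversion above (assuming a constant tempo, an arbitrary PPQ of 384 and 44.1 kHz; tick2frame() here is only a stand-in for the real tempo-map lookup, and all constants are made up for illustration):

```cpp
#include <cmath>
#include <cstdint>

// Assumed constants, not MusE's actual values.
const double SAMPLE_RATE = 44100.0;  // audio frames per second
const double PPQ = 384.0;            // ticks per quarter note

// frames per tick at a given tempo:
// ticks per second = bpm * PPQ / 60, so frames/tick = rate * 60 / (bpm * PPQ)
double frames_per_tick(double bpm) {
    return SAMPLE_RATE * 60.0 / (bpm * PPQ);
}

// stand-in for the real tempo-map conversion (constant tempo assumed here)
int64_t tick2frame(uint32_t tick, double bpm) {
    return (int64_t)std::llround(tick * frames_per_tick(bpm));
}

// begin frame of an audio part stored as unsigned tick + float subtick offset
int64_t part_begin_frame(uint32_t tick, float subtick_offset, double bpm) {
    return tick2frame(tick, bpm)
         + (int64_t)std::llround(subtick_offset * frames_per_tick(bpm));
}
```

at 120 bpm with these constants, one beat (384 ticks) lands exactly on frame 22050, and a subtick offset shifts the start by a fraction of one tick's worth of frames.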


>Yes. That would be a goal. Both midi and audio events aligned to
> either ticks or frames.

ack. but please with subticks for audio, so we don't lose accuracy.


>Yes, all these converters take a lot of processor time.
>When I had the SRC code running, just a few tracks were enough
> to slow my machine down.
>Similar to adding OGG or FLAC files to MusE. I mentioned these 
> take up a lot of time too.

okay. so can we agree on "disk space/memory is cheap, cpu is expensive"?
in the long term, i'd like to ask the user how aggressively he wants to cache 
(depending on cpu power, space available and user patience)


>
>> 
>> > When a wave is added or recorded in MusE, the wave event will need
>> >  to store with it a 'snapshot' of the complete tempo map at the
>> >  moment the wave was added.
>> 
>> exactly. Or it can store a *different* tempo map, that is what i meant
>> with my "tapping" suggestion:
>> Imagine you have a wave file which does not fit your tempo. Then just
>> store the *appropriate* tempomap with it and it will be fine.
>
>True. When recording a wave in MusE then the tempo map at that moment 
> is its tempo map. But when importing some wave it might not have 
>consistent tempo, or changing tempo, so I guess you're right, one would
>need to either tap the tempo out or have something automatically try to
> determine its tempo(s).

i have code for auto-determining a constant tempo; but that's not conceptually 
critical and is stuff for the future, agreed?
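a per-event tempo map as discussed could look something like this (purely illustrative names and fields, not MusE's actual classes; the 120 bpm fallback for an empty map is my assumption):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// a single tempo change: from this tick onward, this bpm applies
struct TempoEvent {
    uint32_t tick;
    double bpm;
};

// a wave event carrying its own tempo map: either a snapshot of the song's
// map at record time, or a tapped-in / auto-detected map for an import
struct WaveEvent {
    std::string file;                  // audio file backing this event
    uint32_t begin_tick;               // start position in ticks
    float subtick_offset;              // fractional tick, in [0, 1)
    std::vector<TempoEvent> tempo_map; // sorted by tick
};

// look up the tempo in effect at a given tick (map sorted by tick)
double bpm_at(const std::vector<TempoEvent>& map, uint32_t tick) {
    double bpm = 120.0;  // fallback for an empty map (assumption)
    for (const TempoEvent& e : map) {
        if (e.tick > tick) break;
        bpm = e.bpm;
    }
    return bpm;
}
```

the point being: the map stays a small sorted list of (tick, bpm) pairs, and a lookup walks it to find the last event at or before the queried tick.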


(snip: lots of stuff about file format, separating things and how to store the 
tempomap)
honestly i don't really mind how files are stored. whatever you want to change, 
feel free to make it sane :)
it's all hidden behind a layer of abstraction anyway.

just about recording the tempomap as a wave: please don't store the tempo of my 
midi pieces, with ~30 tempo events over 5 minutes, as a 44 kHz wave file (not 
even as a 2 kHz wave).
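back-of-the-envelope numbers for that point (the per-entry sizes are assumptions: a 32-bit tick plus a 64-bit bpm for the sparse list, 32-bit float samples for the dense "tempo wave"):

```cpp
#include <cstdint>

// sparse tempo list: ~30 events over the whole piece
const int64_t kEvents        = 30;
const int64_t kBytesPerEvent = 4 + 8;                    // tick + bpm
const int64_t kSparseBytes   = kEvents * kBytesPerEvent; // 360 bytes

// dense per-sample tempo "wave" for the same 5-minute piece
const int64_t kSeconds  = 5 * 60;
const int64_t kDense44k = kSeconds * 44100 * 4;  // ~50 MB at 44.1 kHz
const int64_t kDense2k  = kSeconds * 2000 * 4;   // ~2.3 MB even at 2 kHz
```

that's a factor of more than 100,000 between the sparse list and the 44 kHz version, for exactly the same information.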




>> 
>> > Anyway, further influence on the stretching could come from
>> > 
>> >  some audio automation controllers.
>> 
>> how do you imagine that? i don't think we want stretching in
>automation,
>> because it is hard to map.
>
>Er, well, I just meant the user might want to manually have some
>influence on the stretching. ie purposely slow it down or speed it up
>etc.
>

yea, the user shall be able to change the tempo of wave events. but i think this 
should be done via the event's tempomap (as discussed above).
controllers are unsuitable here, imo.
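in that scheme the stretch factor would fall out of the two tempo maps rather than out of a controller; a hypothetical sketch (assuming constant tempi for simplicity):

```cpp
// A wave whose own tempo map says src_bpm, played back where the song's
// map says dst_bpm, must be stretched in time by src_bpm / dst_bpm:
// a faster song means proportionally shorter audio.
double stretch_ratio(double src_bpm, double dst_bpm) {
    return src_bpm / dst_bpm;
}
```

with changing tempi the same ratio would just be evaluated piecewise over the segments where both maps are constant.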


ps: can we manage to have a synchronous chat about these topics, e.g. on IRC?

greetings
flo
-- 
This message was sent from my Android mobile phone with K-9 Mail.

_______________________________________________
Lmuse-developer mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/lmuse-developer
