Robert wrote:
> On a related note, since we already have stuff in trunk that isn't
> released, this iteration should really be as short as possible.
> Aim for 2-3 months perhaps (then it'll be 4).
> October 1st could be our aim, just to pick a date, perhaps even sooner...
>
> The feature (wish) list I've compiled so far goes like this:
> - Already added features, new drum editor? some editing changes?
> - Trim volume (not affected by automation, to calibrate levels)
> - Improve automation editing, multi-point move, copy, paste
> - Batch create dialog for easy setup of tracks/inputs/synths etc
> - LV2
> - Harmonized midi strip
> - Fix known bugs
>
> The list goes on forever so it's more a question of what people are
> interested in working with. I thought I would take a stab at
> improving automation and the volume trim myself.
>
> Regards,
> Robert
On July 22, 2012 4:28:39 PM Florian Jung wrote:
> Hi
>
> Another thought for 2.1:
>
> I'd like to remove all the design flaws from MusE for 2.1, and do all
> the design changes I deem necessary/useful for new features.
>
> New drum tracks: already implemented.
> Remove the song type; a song is never GM or GS or XG. An instrument / a
> port is.
> My "prerecording feature" will need some design change, but I'll do this
> anyway ;)
> Logical->physical device mapping with symbolic names: you don't connect
> to "port 2", but to "the KORG left to me" synth.
> Songs store: track "blah" uses "the KORG left to me".
> The MusE configuration stores: "the KORG left to me" is the device
> AlsaMidi->some_hardware_piece.
>
> This simplifies moving songs between machines; you only need to set up
> the symbolic names on each machine once, and then it just works.
> Currently, you have to change each song.
>
> Can you agree with this?
> Ideas/criticism?
>
> greetings
> flo

These goals are awfully ambitious. I'd say more like 3.0 material. A few months? I doubt it.

I know what we're all doing - racing ahead like cars out of a gate with stuff we've been salivating over. But take it easy. I'd advise all of us to please create branches for this work instead of working directly in trunk. If it's a sort of "all wrapped up, tested, and no harm done" type of fix or feature, OK, but anything else, watch out... Especially when you talk about removing stuff. I think there's some great potential for a jam conflict with all of us going full-ahead.

I've got three branches I'm working on:

1) The configuration fixes. This was kinda going well, but the sort of changes I was faced with when I last left it were radical and I just didn't have time to finish it. I fear I may not get the courage to continue it, much less remember what I was doing. We'll see...

2) The infrastruct_1_poc branch: Good news: I merged trunk into this 9-month-old branch and work is continuing. After merging I should have committed, but I kinda went beyond a bit and can't commit right now. This is the branch which radically alters the way canvas items are stored and drawn, with 'drawing layers', so that parts, events, automation, text overlays etc. are all CItems now, meaning they could be copied/pasted etc. Well, it's definitely not at that point yet. I'm at the point where I can move forward, but there are some puzzling challenges now. I'm working on it.

3) Audio latency: All you audio fans out there (and midi fans - Jack midi input has latency!) will appreciate this rather serious problem in MusE: recorded waves have latency - they are positioned at the wrong frame, which can be thousands of frames off. I reeeeally want to take care of this because I want to be able to record with very large Jack buffers, to allow everything to go at once - all the hungry plugins, tracks, etc. I've spent the last two weeks planning and thinking and, whew, let me tell you, it's extremely challenging.

Basically I would like to make MusE automatically compensate for latency, behind the scenes. This includes Audio Input latency, rack plugin latency, and Audio Output latency including external 'soloing chain' paths. I found that it is going to involve lengthening the data buffers stored and passed among the node copyData and addData and the corresponding track getData. Because, for example, data 'gathered' from an Audio Input actually arrived at the physical terminal one period ago, or even earlier. Same with Jack midi inputs, but that's a slightly different story.
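To make that a little more concrete, here's roughly the kind of thing I have in mind - purely a toy sketch, not actual MusE code, and every name in it (addDataLatencyAware, srcLat, maxLat and so on) is invented:

#include <vector>

// Suppose the transport is at frame F with period size 'nframes', and
// 'maxLat' is the worst-case latency of all the sources feeding this
// node. The lengthened buffer 'dest' then covers absolute frames
// [F - maxLat, F + nframes), i.e. it holds nframes + maxLat samples.
//
// A source with latency 'srcLat' delivered data that really occurred at
// frames [F - srcLat, F - srcLat + nframes), so it gets mixed in
// starting at index (maxLat - srcLat) instead of index 0.
void addDataLatencyAware(std::vector<float>& dest,
                         const float* src, unsigned nframes,
                         unsigned srcLat, unsigned maxLat)
{
  unsigned offset = maxLat - srcLat;   // assumes srcLat <= maxLat
  for (unsigned i = 0; i < nframes; ++i)
    dest[offset + i] += src[i];
}

The whole point is simply that every source's data lands where it really belongs in time, and that only works if the destination buffer spans more than one period.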
All of this means that these lengthened data buffers require processing over /more/ than just one period. It means that audio sent to an Audio Output will necessarily /have/ to be delayed even more while we wait for two or more periods of data to be gathered. In fact it may seem counterintuitive, but in order to correctly align various playback waves, /more/ latency must be added in places.

Here's one of the challenges: Currently, during one process call, MusE simply takes all the gathered data, lumps it into a single one-period-wide buffer and sends it off to an Audio Output. This is good because we want the audio as soon as possible without further delay, but unfortunately it's wrong: some of the gathered data had latency and should be compensated for. It's unfortunate also because this 'lumping' of all the data into a single period is incompatible with these new lengthened buffers - I can't simply use the data from those streams to make up a longer buffer, chiefly because the rack plugins pose a problem: they want a nice continuous stream of data, and we can't start shifting their output data around to compose a new 'latency compensated' buffer.

Anyway, the whole of the above has /one/ sole purpose: so that the magical time machine known as *Wave* tracks can accept these latency-aware lengthened buffers and store them properly, correctly timed, and perfectly aligned, in a wave file! Magical Wave tracks are the ONLY thing that can utilize these buffers - they can go back in time! All other tracks, including Audio Outputs, can't do that.
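That 'going back in time' isn't as exotic as it sounds, by the way - it just means writing the gathered data at an earlier position in the file than the current transport frame. Roughly like this (again, all the names here are invented, none of it is real MusE code):

#include <cstdint>

// One period of data gathered from an Audio Input, along with how many
// frames ago that audio actually hit the physical input.
struct GatheredBuffer {
  const float* data;
  unsigned     nframes;
  unsigned     latency;
};

struct RecordingWaveTrack {
  // 'transportFrame' is where the play head is for this cycle. Because
  // the gathered data is 'latency' frames old, it belongs at
  // transportFrame - latency in the wave file, not at the play head.
  void put(const GatheredBuffer& buf, int64_t transportFrame) {
    int64_t writePos = transportFrame - (int64_t)buf.latency;
    if (writePos < 0)
      return;   // audio from before recording started - nowhere to put it
    writeToFile(writePos, buf.data, buf.nframes);
  }

  // Stand-in for the real seek-and-write into the wave file.
  void writeToFile(int64_t, const float*, unsigned) {}
};

An Audio Output can't pull the same trick because the sound card needs its data right now - which is exactly why, for playback, the compensation has to show up as extra delay elsewhere instead.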
Well, anyway, there are challenges, so I've decided to try another, less desirable but much needed route for now, while I ponder some more, for the interim: embellish the wave editor with some tools like 'Shift Event', draw the wave events highlighted (similar to multiple midi part editing), handle and select (multiple) wave events properly, etc. This is far from perfect, because a wave which was recorded as a mixture of signals coming from sources with varying latency will not have much use for a 'Shift' command, but at least it'll allow basic manual Audio Input latency compensation, which is a good first step - what I'm really after ATM.

----------------

Awright, now onto something else. New drums:

I see three serious deficiencies with the new drum editor:
1) No loadable drum maps.
2) Not multi-channel and multi-port.
3) Can't change the A-note column.

But I see one technical advantage (apart from the 'hiding' features):
1) They work per-track. The new drum tracks are more like other tracks with respect to the channel column in the track list, in that it's simply one channel per track and what you see is always what you get with that column.

So the original drum tracks aren't going away soon. I would like to rename "old style drum tracks" to "multi-channel drum tracks" and "new style drum tracks" to "multi-track drum tracks", or... I dunno, suggestions? After all, new users don't know or care what 'old' and 'new' really mean... And please create a (slightly different) track icon for the new drums.

What would really be better is to merge the best of both drum tracks, because there's the potential for just too much confusion with two types. So where are we going with these new drum tracks? What's the plan?

----------------

LV2: Harmonized midi strip: Amen to that. Challenging for sure in both cases, as we all know.

----------------

Remove the song type: As we discussed before, you should look at muse_evolution to see how it was removed. All the sysexes were moved into the instruments. Our instrument editor's sysex editor needs to be completed first. I may try to do that for you - it was long promised and overdue. But you know, it's been a while since I looked at this, and I recall there were problems and questions even with evolution's approach. I remember questioning "how is this going to work?" I mean, sysexes could also be stored in parts. But I guess it's pretty clear that's the way forward, somehow. ?

Think carefully about this. Many places depend on that song type and it may not be so easy to remove. Be sure to investigate ALL usages. I think I saw some logistical/conceptual problems too. Anyway, I guess it starts with completing our instrument editor sysex editor, just like muse_evolution's editor.

----------------

Prerecording feature: You kept talking about 'external synths' and I warned that audio inputs cannot be sampled while in freewheel mode. I verified this the other day: while in freewheel mode, Jack reports there are *no* connections to an audio input and thus no data to gather, according to MusE and assuming we're not doing something wrong. I checked this right at the place where data is gathered from the Jack audio input port within our Audio Input getData() code.

This is a bit of a shame, and I was planning to ask Jack-devel about it, because in my tests, although I could not get Ardour to go into freewheel mode at the same time as MusE, QTractor did in fact do it. Turning off audio inputs makes perfect sense for physical terminals - as I say, you can't 'sing into a mic' at the freewheel rate - however for other apps like QTractor (and indeed some of your 'external synths') it would be great to be able to sample the audio input during freewheel mode. Maybe there's some technical reason it can't be done - the audio from the other app still wouldn't arrive in time, or something?

----------------

Logical->physical device mapping: Whew, a tall order, I hope you can pull it off.

But with some of these goals, guys, please keep in mind existing song files. We must strive to provide conversion when an old song is opened. I suspect we may not always be able to accomplish that in places - time constraints, programming challenges, the sake of simplicity, and all... And that's why I say this is looking more like 3.0 than 2.1. Very ambitious. October? Hmm...

Cheers.
Tim.
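P.S. Flo, about the symbolic device names: purely as a sketch of how I picture it working (every name below is invented, nothing binding):

#include <map>
#include <string>

// Global, per-machine configuration: symbolic name -> physical device,
// e.g. "the KORG left to me" -> "AlsaMidi:some_hardware_piece".
using DeviceAliases = std::map<std::string, std::string>;

// Per-song: a track only remembers the symbolic name it wants.
struct TrackDeviceRef {
  std::string symbolicName;
};

// On song load, resolve the alias for this machine. If it's unknown
// here, ask the user once and store the answer in the global
// configuration, never in the song itself.
std::string resolveDevice(const DeviceAliases& aliases,
                          const TrackDeviceRef& ref)
{
  auto it = aliases.find(ref.symbolicName);
  if (it != aliases.end())
    return it->second;
  return std::string();   // unresolved on this machine
}

The song never changes when it moves between machines; only the alias table does - which, as you said, is the whole point.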
