> Chris Double wrote:
> > ...
> > Rendering MIDI data into audio (via Quicktime is the only current
> > solution there I suppose?)
>
> Can you say what you mean here? Is there a simple way to call Quicktime
> from JavaScript to, eg, play F sharp in an oboe timbre for 0.8 seconds?
Not sure what Chris meant, but you could generate the contents of a MIDI
file in memory, then feed those bytes (after uuencoding
<http://en.wikipedia.org/wiki/Uuencoding>) to your platform's MIDI player
(with the embed tag). Here's an example:

  http://tinlizzie.org/ometa-js/#Etude

I doubt this approach would be useful for real-time stuff like making
sounds come out of your keyboard morph, but it's definitely worth knowing
about. (A rough, untested sketch of the idea is at the bottom of this
message.)

Cheers,
Alex

> > You can use HTML 5 audio via data URL's to generate sound and play it -
> > although it's a bit of a pain doing it this way.
>
> Do you mean one data URL per note (88 of them), with no control over, eg,
> duration, pitch, etc? Or is there something better?
>
> > There is work being done on working out an HTML audio generation API.
> > See here for some discussion:
> >
> > https://bugzilla.mozilla.org/show_bug.cgi?id=490705
> >
> > If you generate the audio on the server and serve it as a WAV file then
> > you can use HTML 5 audio to play it on recent Chrome, Safari, Opera and
> > Firefox builds (I think all those support WAV).
>
> Can you say a bit more? Can one serve up (or otherwise obtain) one .WAV
> file and then use that as a timbre by ADSR techniques so you can play all
> 88 notes in various durations and volumes from that one file?
>
> Thanks in advance
>
> - Dan
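P.S. Here's the kind of thing I had in mind, just as an untested sketch:
build a one-note Standard MIDI File as an array of bytes, then hand it to
whatever MIDI-capable plugin the browser has (usually QuickTime) through an
embed tag. I said "uuencoding" above, but a data: URL actually wants
base64, so that's what the sketch uses. The note happens to be Dan's
example: F sharp, oboe patch, roughly 0.8 seconds.

  var bytes = [
    // MThd: format 0, one track, 96 ticks per quarter note
    0x4D, 0x54, 0x68, 0x64,  0, 0, 0, 6,  0, 0,  0, 1,  0, 0x60,
    // MTrk chunk containing 16 bytes of events
    0x4D, 0x54, 0x72, 0x6B,  0, 0, 0, 16,
    0x00, 0xC0, 0x44,              // program change: GM patch 69 (oboe)
    0x00, 0x90, 0x42, 0x60,        // note on: F#4 (note 66), velocity 96
    0x81, 0x1A, 0x80, 0x42, 0x00,  // note off 154 ticks (~0.8 s at 120 bpm) later
    0x00, 0xFF, 0x2F, 0x00         // end of track
  ];
  var b64 = btoa(String.fromCharCode.apply(null, bytes));
  var player = document.createElement('embed');
  player.setAttribute('src', 'data:audio/midi;base64,' + b64);
  player.setAttribute('autostart', 'true');   // QuickTime-style plugin parameter
  player.setAttribute('hidden', 'true');
  document.body.appendChild(player);

Whether anything actually comes out of the speakers depends entirely on the
plugin, which is another reason this isn't a real-time solution.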
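P.P.S. And re the data-URL audio discussion quoted above, here's an equally
rough sketch (my own guess at the mechanics, not Chris's code): synthesize a
second of 8-bit mono PCM in JavaScript, wrap it in a WAV header, and play it
with an HTML 5 Audio element via a data: URL.

  function le16(n) { return [n & 0xFF, (n >> 8) & 0xFF]; }
  function le32(n) { return [n & 0xFF, (n >> 8) & 0xFF,
                             (n >> 16) & 0xFF, (n >> 24) & 0xFF]; }
  function ascii(s) { return s.split('').map(function(c) { return c.charCodeAt(0); }); }

  var rate = 8000, freq = 440, samples = [];
  for (var i = 0; i < rate; i++)   // one second of 8-bit unsigned samples
    samples.push(Math.round(128 + 100 * Math.sin(2 * Math.PI * freq * i / rate)));

  var bytes = [].concat(
    ascii('RIFF'), le32(36 + samples.length), ascii('WAVE'),
    ascii('fmt '), le32(16), le16(1), le16(1),   // PCM, mono
    le32(rate), le32(rate), le16(1), le16(8),    // byte rate, block align, 8 bits/sample
    ascii('data'), le32(samples.length), samples);

  new Audio('data:audio/wav;base64,'
            + btoa(String.fromCharCode.apply(null, bytes))).play();

That's fine for a beep, but shaping 88 notes with ADSR envelopes this way
means regenerating and re-encoding the samples for every note, which is
presumably part of why an audio generation API is being worked on.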
_______________________________________________
lively-kernel mailing list
[email protected]
http://lists.hpi.uni-potsdam.de/listinfo/lively-kernel
