On Aug 11, 2009, at 7:07 PM, Tim Newsham wrote:
> i didn't mean translating from one /dev/audio to the next.
> i meant dealing with azalia audio vs. ac97 vs. soundblaster.
> and ogg/vorbis vs. mp3 vs. pcm vs. *law.

I agree here.  I envision a separate codec server that
sits on top of an audio server and encapsulates a bunch
of this stuff.  It would be nice if it were practical
to "cat foo.mp3 >/dev/codec/mp3" or something like that.
I haven't really thought this through much; just daydreaming
about this feature...
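To make the daydream a bit more concrete: one simple transform such a codec server could encapsulate is G.711 mu-law expansion (one of the "*law" codecs). A minimal sketch in Python -- the /dev/codec path is hypothetical, but the expansion formula is the standard G.711 decode:

```python
# A toy "codec server" transform: G.711 mu-law -> 16-bit linear PCM.
# The /dev/codec naming is hypothetical; the math is standard G.711.

def mulaw_decode(u: int) -> int:
    """Expand one 8-bit mu-law byte to a 16-bit linear PCM sample."""
    u = ~u & 0xFF                      # mu-law bytes are stored inverted
    sign = u & 0x80
    exponent = (u >> 4) & 0x07
    mantissa = u & 0x0F
    sample = (((mantissa << 3) + 0x84) << exponent) - 0x84
    return -sample if sign else sample

def decode_stream(data: bytes) -> list[int]:
    """What 'cat foo.ul >/dev/codec/mulaw' might boil down to:
    mu-law bytes in, linear PCM samples out."""
    return [mulaw_decode(b) for b in data]
```

The point being: a transform this small fits naturally behind a file interface; the hard part is everything around it (format discovery, rates, channels), not the decode itself.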

About 3 years ago (at the time when I was really active in the
FFmpeg community) I toyed with the idea of porting ffmpeg to
Plan 9 in the form of a media FS. IOW:
     % ffmpeg matrix.vob
     % page /n/ffmpeg/matrix.vob/video/1/1

My biggest problem (and one that recurred on this list
a month or so ago) was coming up with an equivalent of
HTTP's 'content negotiation' process.
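For those who haven't run into it: HTTP negotiation lets the client rank the formats it can consume and the server pick the best one it can produce. A toy illustration of the computation (9P has no such handshake built in; the types and q-values here are just examples):

```python
# Toy HTTP-style content negotiation: given a client preference list
# with q-values and the set of types a server can produce, pick the
# best match.  Illustrative only.

def negotiate(accept: str, offered: list[str]) -> str:
    prefs = []
    for part in accept.split(","):
        fields = part.strip().split(";")
        q = 1.0
        for f in fields[1:]:
            key, _, val = f.strip().partition("=")
            if key == "q":
                q = float(val)
        prefs.append((fields[0].strip(), q))

    def quality(offer: str) -> float:
        best = 0.0
        major = offer.split("/")[0]
        for mtype, q in prefs:
            if mtype == offer or mtype == "*/*" or mtype == major + "/*":
                best = max(best, q)
        return best

    return max(offered, key=quality)
```

The open question for a media FS is where this dialogue lives: in the path name, in a ctl file, or somewhere else entirely.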

There was also an additional complication with audio codecs --
they don't lend themselves easily to *sane* packetization.
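Some arithmetic on why, assuming typical frame sizes (the Vorbis value is one common long-block size; its blocks actually vary): each codec has its own frame granularity, so "one packet" means a different slice of time depending on format and rate.

```python
# Why codec audio doesn't packetize neatly: each codec has its own
# frame granularity.  Frame sizes below are typical/illustrative.

SAMPLES_PER_FRAME = {
    "mp3": 1152,       # fixed by the MPEG-1 Layer III spec
    "vorbis": 2048,    # a common long-block size; Vorbis blocks vary
    "pcm": 1,          # raw PCM can be cut anywhere
}

def frame_ms(codec: str, rate: int) -> float:
    """Duration of one codec frame in milliseconds."""
    return SAMPLES_PER_FRAME[codec] * 1000.0 / rate
```

So losing a single mp3 frame at 44.1 kHz drops roughly 26 ms of audio -- clearly audible -- while raw PCM can be sliced at any sample boundary.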

But I'm getting ahead of myself: the biggest trouble with
audio (and this is something that Linux folks have been
struggling with a lot, especially in the days of 486s)
is that it places almost real-time requirements on processing.
You can drop frames in video and only those with 20/20 vision
will notice; with audio, you drop a packet and you're
screwed quality-wise.

Maybe it's better to call this latency, since we can all appreciate
some of the shortcomings that 9P has when it comes to it.
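The latency framing can be made concrete with back-of-the-envelope numbers (assuming 16-bit stereo PCM; all figures illustrative): every buffer the device consumes sets a hard deadline for delivering the next one, and a remote round trip eats into that budget.

```python
# Back-of-the-envelope latency budgets, assuming 16-bit stereo PCM.

def buffer_latency_ms(frames: int, rate: int) -> float:
    """How long one buffer of audio lasts at the DAC."""
    return frames * 1000.0 / rate

def deadline_ms(chunk_bytes: int, rate: int, channels: int = 2,
                sample_bytes: int = 2) -> float:
    """Time budget to deliver one chunk before an underrun."""
    frames = chunk_bytes // (channels * sample_bytes)
    return frames * 1000.0 / rate
```

With 4 KB writes at 44.1 kHz stereo, the budget is about 23 ms per chunk -- comfortable on a local machine, but easily blown by a WAN round trip per write.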

To be fair, though, I'm pretty sure you won't have many issues
with raw PCM when the local kernel does the multiplexing,
but when you do it over the wire -- there'll be trouble.

Thanks,
Roman.

P.S. The ffmpeg media FS can be resurrected if there's enough
interest in this approach *and* if we can come up with the
interface before the coding phase.
