On 22 Feb, Allegar Robert wrote:
> I've actually put a lot of thought into this problem. I have 400+ CDs, and
> am tired of using them. I want to encode my entire collection to MP3s. I
> figure I will need about 50 gigs or so, which I'm not too concerned with.
> The problem I've been having is actually writing the front end. The back end
> is simple enough -- you just create a directory structure with several
> top-level mounting points, and come up with a scheme that you stick to. For
> example, top-level -> Band Name -> Album name -> Disc name ->
> songNumber_SongTitle.mp3. This works fine for most bands. Some, like
> Orbital, have multiple albums with the same name, but the exceptions are
> few.
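A sketch of that naming scheme, assuming Python and made-up helper and
band/track names (the separator and zero-padding are my choices, not a
standard):

```python
import re

def track_path(band, album, disc, number, title, ext="mp3"):
    """Build a library path of the form Band/Album/Disc/NN_Title.mp3."""
    def clean(s):
        # Replace characters that are unsafe in file names.
        return re.sub(r'[/\\:*?"<>|]', "_", s).strip()
    filename = "%02d_%s.%s" % (number, clean(title), ext)
    return "/".join([clean(band), clean(album), clean(disc), filename])

print(track_path("Orbital", "Orbital", "Disc 1", 3, "Lush 3-1"))
# -> Orbital/Orbital/Disc 1/03_Lush 3-1.mp3
```

Note the Orbital case above: two albums with the same name map to the
same directory, so the exceptions need manual disambiguation (e.g. a
year suffix on the album directory).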

I've got a project called Obsequieum that is a networked streaming
jukebox. All the files reside on the server and there is a web
interface for choosing what files will get streamed next. The system
has complete scripts for ripping a CD into the system automatically
with virtually no human intervention. If you're interested in checking
it out in detail and seeing a demo user interface, check out:


> The real heart of the matter is pure bandwidth. I've actually tried setting
> up a solution that involved a web download -- which, not surprisingly, was
> already mentioned -- but it still takes on the order of 5-20 seconds to
> "download" the song from the file server to the local server. This is
> unacceptable to me.

Obsequieum uses a play queue to decide what to play next, and a number
(user configurable) of tracks are 'locked' so the system has a chance
to cache them locally before streaming them out. It would be easy to
write a new device handler that downloads files from wherever you'd
like it to. Shouldn't be much work.
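The locking idea can be sketched like this -- a minimal model, not
Obsequieum's actual code; the class and method names are mine:

```python
from collections import deque

class PlayQueue:
    """Play queue whose first `locked` tracks are pinned, giving a
    background job time to cache them locally before streaming."""

    def __init__(self, locked=2):
        self.locked = locked          # user-configurable lock depth
        self.tracks = deque()

    def add(self, track):
        self.tracks.append(track)

    def locked_tracks(self):
        # These play next: cache them now, never let users reorder them.
        return list(self.tracks)[:self.locked]

    def reorder(self, new_tail):
        # Only the unlocked tail of the queue may be rearranged.
        head = list(self.tracks)[:self.locked]
        self.tracks = deque(head + new_tail)

    def pop_next(self):
        return self.tracks.popleft()

q = PlayQueue(locked=2)
for t in ["a.mp3", "b.mp3", "c.mp3", "d.mp3"]:
    q.add(t)
print(q.locked_tracks())       # -> ['a.mp3', 'b.mp3']
q.reorder(["d.mp3", "c.mp3"])  # swap the unlocked tail
print(q.pop_next())            # -> 'a.mp3'
```

A device handler that fetches from a slow source would just need to
keep the locked tracks downloaded; the 5-20 second fetch then overlaps
with playback of the current track instead of delaying it.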

--ruaok         Freezerburn! All else is only icing. -- Soul Coughing

Robert Kaye -- [EMAIL PROTECTED]  http://moon.eorbit.net/~robert
