[EMAIL PROTECTED],
I recently wrote a CGI Perl script for this band's web site; among other
things it tracks the number of downloads an mp3 gets. The only way I
could do this was to print the MP3 from the script. An MP3 is a
compressed audio file, about 3-4 megs for an average song. The problem
is, when I open the mp3 through the script and print it out through the
script, it takes /a LOT/ of memory. Now tack on 70-100 downloads at all
times, and the entire system is bogged down. I have 96 megs of physical
RAM and 700 megs of swap, and it gets tight when a lot of people start
downloading.
The point is, I /really/ need to trim this script down. Basically the
script has to copy the entire mp3 over into swap space, which is slow
and takes a lot of resources away from the system. Here is the (bloated)
part of the script:
#!/usr/bin/perl
# A lot of counting, stats, subroutines here
## - Snip - ##
if ( -e $file )
{ push_stats();  # does some stats, counting, etc.
  open MP3, $file
    or error();  # error is a subroutine
  binmode MP3;   # MP3s are binary data
  print "Content-Type: audio/mpeg\n\n";
  while (<MP3>)
  { print $_; }
}
else
{ error(); }
See, now if the mp3 is 4 megs x 70 downloads every 2 hours, that is
/a LOT/ of memory wasted. I tried `cat`ing the file into a
$memory_location, but that did not work. Is there a way to "open" a
file and print it out without reading the /WHOLE/ thing into memory --
just 10K or so at a time?
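Something like this is what I'm imagining -- a rough sketch, not working
code; the stream_file name and the 16 KB chunk size are just
placeholders I made up:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# stream_file: copy the file at $path to the currently selected output
# filehandle in 16 KB chunks, so only one small buffer is ever held in
# memory no matter how big the file is. Returns total bytes sent.
sub stream_file {
    my ($path) = @_;
    open my $fh, '<', $path or die "can't open $path: $!\n";
    binmode $fh;                           # binary data, no newline mangling
    my ( $buf, $bytes ) = ( '', 0 );
    while ( my $n = read $fh, $buf, 16 * 1024 ) {
        print $buf;                        # send this chunk, then reuse $buf
        $bytes += $n;
    }
    close $fh;
    return $bytes;
}

# In the CGI this would replace the while (<MP3>) loop, e.g.:
#   print "Content-Type: audio/mpeg\n\n";
#   stream_file($file);
```

That way read() fills one fixed-size buffer per pass instead of slurping
the whole mp3, which I think is where all the memory is going.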
Jack