On Friday 12 August 2005 11:47, Shlomi Fish wrote:
> > However, it should be noted that the problem still persists:
> > curl -v -o /dev/null -r 150-200
> > http://www.iglu.org.il/pub/mirrors/fedora/4/x86_64/iso/FC4-x86_64-DVD.iso
> >
> > Anyone brave enough can try upgrading Apache to 2.x or perhaps use a
> > different web server for /pub (and forward it through mod_proxy or
> > something)?
>

After trying out thttpd, boa and perl's HTTP::Daemon, I found a web server 
that handled the big file out of the box: cherokee:

http://www.0x50.org/
http://freshmeat.net/projects/cherokee/

It passed all the tests of fully displaying a big file in the directory 
listing (with proper size), downloading it properly using wget, and fetching 
a part of it using the Range: header. It also supports embedded PHP.
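For the record, the three checks can be scripted. The sketch below is a rough, hypothetical harness (stdlib only; the file name, size and port are made up): it serves a stand-in file with Python's own SimpleHTTPRequestHandler and runs the size, full-download and ranged-fetch checks against it. Note that Python's stdlib handler, like HTTP::Daemon, does not implement Range, so the third check comes back 200 with the whole body instead of 206.

```python
# Hypothetical test harness for the three checks: advertised size,
# full download, and a ranged fetch.  SimpleHTTPRequestHandler merely
# stands in for the server under test.
import os
import tempfile
import threading
import urllib.request
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "big.iso"), "wb") as f:
    f.write(b"x" * 100_000)  # small stand-in for the DVD image

handler = partial(SimpleHTTPRequestHandler, directory=tmpdir)
server = HTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/big.iso"

# 1. advertised size (HEAD request)
head = urllib.request.Request(url, method="HEAD")
with urllib.request.urlopen(head) as r:
    size = int(r.headers["Content-Length"])

# 2. full download
with urllib.request.urlopen(url) as r:
    full = r.read()

# 3. ranged fetch -- a compliant server answers 206 Partial Content
rng = urllib.request.Request(url, headers={"Range": "bytes=150-200"})
with urllib.request.urlopen(rng) as r:
    status, part = r.status, r.read()
server.shutdown()

print(size, len(full), status, len(part))
```

A server that passes all three would report the right Content-Length, return the full body, and answer the third request with status 206 and exactly 51 bytes.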

The downside of it is that its package is not present in Sarge. It is 
available in Debian testing (Etch) and unstable, though.

What do we do now? Should I keep on looking? My log of the search for a 
suitable web server is included below.

Regards,

        Shlomi Fish

<<<<<<<<<<<
thttpd - does not support large files by default. With the 
make CCOPT="-D_..." hack, it eventually consumes a lot of memory.

boa - does not support it, and does not seem to have a configuration option 
to enable it.

Perl's HTTP::Daemon - 
    1. Gets stuck at the end of the downloaded file (if it's big).
    2. Does not support the Range: HTTP header.

cherokee - passes all the tests:
    http://www.0x50.org/
    http://freshmeat.net/projects/cherokee/
>>>>>>>>>>>
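As an aside, the Range support that HTTP::Daemon lacks is not much code. Here is a minimal, hypothetical sketch of a handler that honours a single "Range: bytes=START-END" header (stdlib only; the payload and port are invented for the demo) and a quick in-process check against it:

```python
# Sketch of a handler honouring a single "bytes=START-END" Range header,
# answering 206 Partial Content with a Content-Range header.
import re
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

CONTENT = bytes(range(256)) * 4  # 1024-byte stand-in for the big file

class RangeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        start, end = 0, len(CONTENT) - 1
        m = re.match(r"bytes=(\d+)-(\d+)$", self.headers.get("Range", ""))
        if m:
            start, end = int(m.group(1)), int(m.group(2))
            self.send_response(206)  # Partial Content
            self.send_header("Content-Range",
                             f"bytes {start}-{end}/{len(CONTENT)}")
        else:
            self.send_response(200)
        body = CONTENT[start:end + 1]
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RangeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/",
    headers={"Range": "bytes=150-200"})
with urllib.request.urlopen(req) as resp:
    status, body = resp.status, resp.read()
server.shutdown()
print(status, len(body))  # -> 206 51
```

A production server would also need to validate the range against the file size, handle open-ended and multi-part ranges, and send 416 for unsatisfiable ones; this only shows the happy path.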

---------------------------------------------------------------------
Shlomi Fish      [EMAIL PROTECTED]
Homepage:        http://www.shlomifish.org/

Tcl is LISP on drugs. Using strings instead of S-expressions for closures
is Evil with one of those gigantic E's you can find at the beginning of 
paragraphs.
