I deployed Varnish on a webserver that also runs PHP to serve larger
files of up to about 1 GB. Note that I am using the stock Varnish
shipped with Debian 5, which is Varnish version 1.1.2.

After running this setup for a while, I got some reports of downloads
being repeatedly interrupted. After some poking around, I realised that
downloads were being cut off if they took more than 10 minutes.
Here's the output from a couple of tests with Varnish:

$ time wget -t 1 --header "Host: hostname" http://server:6081/download.php?args=here
...
2009-10-09 12:31:35 (303 KB/s) - Connection closed at byte 186251297. Giving up.

real    10m9.089s
user    0m4.030s
sys     0m11.061s

$ time wget -t 1 --header "Host: hostname" http://server:6081/download.php?args=here
...
2009-10-09 12:11:55 (314 KB/s) - Connection closed at byte 192727409. Giving up.

real    10m8.255s
user    0m3.296s
sys     0m10.937s

When I test the webserver directly, bypassing Varnish, the download completes fine.

Does a time limit like this still exist in newer versions of Varnish?
Is there a workaround I can use? It seems that "pass"ing these requests
doesn't help, but "pipe"ing them does. Is there a better solution than
hardcoding "pipe"s for all requests that could take more than 10
minutes?
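
For reference, what I mean by hardcoding "pipe"s is roughly the
following VCL. This is just a sketch in the old syntax my 1.1.2 install
accepts, and /download.php here stands in for the real script:

    sub vcl_recv {
        # Hand the big downloads straight through to the backend so
        # Varnish never handles the response delivery itself.
        if (req.url ~ "^/download.php") {
            pipe;
        }
    }

(I understand newer versions spell the action as "return (pipe);"
instead, but the idea is the same.)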

By the way, I also noticed that even though Varnish times out and cuts
the connection, varnishncsa logs the full size of the original file,
not the amount of data actually transferred.

Regards, Ketil