Hi,

I work for a company named SmartJog (http://www.smartjog.com). We are currently building a file caching solution for HTTP servers streaming video and audio content. After benchmarking several caching solutions, we found that Varnish was the best performance-wise. Unfortunately, it seems to lack several features we need, and we ran into a few issues.

Our main problem at the moment concerns Varnish running in "streaming mode": when a client connects to the server and asks for a file that is not yet cached, Varnish starts retrieving the file and sending it to the client while filling its cache. But if a second client connects in the meantime and asks for the same file, Varnish waits until the whole file has been retrieved before it starts responding to that second client. Our initial plan is to patch this with the following behavior: if the second client asks for a file whose caching is still in progress, we retrieve the file from the origin server (or from the partially filled cache, if enough of it is available) and send it to the second client, this time marking it as non-cacheable. Would you be interested in this patch? If so, are we at least going about it the right way? A small sketch of the intended logic follows below.
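To make the intent concrete, here is a minimal, self-contained C sketch of the decision logic we have in mind. It deliberately does not touch Varnish's internal APIs: the types and helpers (cached_obj, handle_request, serve_from_cache, fetch_and_serve_uncached, fetch_fill_and_serve) are hypothetical stand-ins used only to illustrate the behavior described above, not a proposed implementation.

/*
 * Illustrative only -- NOT actual Varnish internals.  All names below
 * are hypothetical stand-ins for the real cache/backend code.
 */
#include <stddef.h>
#include <stdio.h>

enum obj_state {
    OBJ_MISSING,        /* not in cache at all          */
    OBJ_BEING_FETCHED,  /* a fetch/fill is in progress  */
    OBJ_COMPLETE        /* fully cached                 */
};

struct cached_obj {
    enum obj_state state;
    size_t bytes_filled;        /* how much of the body is cached so far */
};

static void serve_from_cache(const struct cached_obj *o)
{
    printf("serving %zu cached bytes to the client\n", o->bytes_filled);
}

static void fetch_and_serve_uncached(void)
{
    printf("fetching from origin for this client only, "
           "response marked non-cacheable\n");
}

static void fetch_fill_and_serve(struct cached_obj *o)
{
    o->state = OBJ_BEING_FETCHED;
    printf("fetching from origin, streaming to the client "
           "while filling the cache\n");
}

static void handle_request(struct cached_obj *o, size_t client_offset)
{
    switch (o->state) {
    case OBJ_COMPLETE:
        serve_from_cache(o);
        break;
    case OBJ_MISSING:
        /* First client: start the streaming fetch that fills the cache. */
        fetch_fill_and_serve(o);
        break;
    case OBJ_BEING_FETCHED:
        /*
         * Second client while the fill is still running: do not wait
         * for the whole object.  If enough of the body is already
         * cached, serve from it; otherwise go straight to the origin
         * and mark that response non-cacheable so it does not
         * interfere with the ongoing fill.
         */
        if (o->bytes_filled > client_offset)
            serve_from_cache(o);
        else
            fetch_and_serve_uncached();
        break;
    }
}

int main(void)
{
    struct cached_obj o = { OBJ_MISSING, 0 };

    handle_request(&o, 0);        /* first client triggers fetch + fill      */
    o.bytes_filled = 4096;
    handle_request(&o, 0);        /* second client served from partial cache */
    handle_request(&o, 1 << 20);  /* second client past the fill point       */
    return 0;
}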

Secondly, as we will need to do more development work on the streaming part of Varnish, would you be interested in SmartJog contributing this work back to Varnish?

Thanks in advance for your answers.

Regards,

--
Thomas SOUVIGNET, R&E Engineer

SmartJog SAS - http://www.smartjog.com - A TDF Group Company
Office: 27, blvd Hippolyte Marques 94200 Ivry-sur-Seine - France EU
Phone: +33 (0)1 5868 6207

