Looks like exactly what I was looking for! Thanks a lot, starting my tests. BR, L.
Łukasz Tasz
RTKW

2016-09-01 15:31 GMT+02:00 Maxim Dounin <mdou...@mdounin.ru>:

> Hello!
>
> On Thu, Sep 01, 2016 at 01:34:39PM +0200, Łukasz Tasz wrote:
>
> > Hi all,
> > For some time I have been using nginx as a reverse proxy with caching
> > for serving image files. It works well, since a proxy is configured
> > per location.
> >
> > However, I noticed problematic behaviour: when the cache is empty and
> > a lot of requests pop up at the same time, nginx does not recognise
> > that all the requests are the same. Instead of fetching from the
> > upstream only once and serving that response to the rest, it hands
> > every request over to the upstream.
> > Side effects:
> > - the upstream server rate-limits us, since there are too many
> >   connections from one client;
> > - in some cases there are issues with temp storage: not enough space
> >   to finish all the requests.
> >
> > Any ideas? Is this a known problem?
> >
> > I know the problem can be solved by warming up the caches, but since
> > there are a lot of locations, I would like to keep it transparent.
>
> There is the proxy_cache_lock directive to address such use cases,
> see http://nginx.org/r/proxy_cache_lock.
>
> Additionally, for updating cache items there is
> "proxy_cache_use_stale updating", see
> http://nginx.org/r/proxy_cache_use_stale.
>
> --
> Maxim Dounin
> http://nginx.org/
>
> _______________________________________________
> nginx mailing list
> nginx@nginx.org
> http://mailman.nginx.org/mailman/listinfo/nginx
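For reference, a minimal sketch of how the two directives Maxim mentions might be combined. The cache zone name, paths, and upstream name here are illustrative, not from the thread:

```nginx
# Hypothetical cache zone; tune sizes to your workload.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=img_cache:10m
                 max_size=1g inactive=60m;

server {
    listen 80;

    location /images/ {
        proxy_pass http://upstream_images;   # illustrative upstream
        proxy_cache img_cache;
        proxy_cache_valid 200 10m;

        # Only one request per cache key is passed to the upstream at a
        # time; concurrent requests for the same key wait for the cached
        # response instead of each hitting the upstream.
        proxy_cache_lock on;
        proxy_cache_lock_timeout 10s;

        # While a cached item is being refreshed, serve the stale copy
        # rather than sending more requests upstream.
        proxy_cache_use_stale updating error timeout;
    }
}
```

With `proxy_cache_lock on`, a cold-cache burst for one URL should result in a single upstream fetch rather than one per client, which addresses both the rate-limiting and the temp-space issues described above.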