Thank you for your help.
Everything you said is correct.

I think I wasn't clear. What I want is exactly this:
the two Varnish servers should share a single cache file (or cache memory). For
a given request, both Varnish servers would share one cache object.
Something like this:
varnish1--> cache file <--varnish2

Do you have any suggestions or experience with this?
Anything would be appreciated.

Regards,

Shawn



2013/2/20 Sascha Ottolski <[email protected]>

> Am Dienstag, 19. Februar 2013, 19:44:53 schrieb Xianzhe Wang:
> > I am using nginx as a load balancer, and it connects to 2 Varnish servers.
> > I want to share cache objects between the 2 Varnish servers.
> > When one Varnish goes down, the remaining one should keep working, and
> > the cached objects should still be available.
> > Is there anything I can do to achieve this?
> >
> > I also saw an example like this:
> > https://www.varnish-cache.org/trac/wiki/VCLExampleHashIgnoreBusy
> > But I think it would increase network latency, so I don't want to do it
> > that way.
> >
> > Can someone share their experience? Thanks a lot.
> >
> > Shawn
>
> I would say you already have your solution. If nginx sends the requests
> randomly to either of the two servers, each will obviously fill its own
> cache; so if one goes down, the other is still there. The two caches may
> not be completely identical, depending on the size of your cacheable
> content, but each should be "warm" enough to serve most requests from its
> cache.
>
> And you're not limited to two Varnish servers, of course. The more you
> put into your load-balanced cluster, the lower the impact if one fails.
>
> Cheers
>
> Sascha
>
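
For reference, the setup Sascha describes (nginx distributing requests across two Varnish instances, with failover to the surviving one if a server goes down) could be sketched roughly like this in nginx configuration. The hostnames, ports, and timeouts are assumptions for illustration, not taken from the thread:

```nginx
# Hypothetical nginx upstream with two Varnish backends.
# Hostnames and ports are placeholders.
upstream varnish_cluster {
    # Default round-robin distribution: both Varnish caches stay warm.
    server varnish1.example.com:6081 max_fails=3 fail_timeout=10s;
    server varnish2.example.com:6081 max_fails=3 fail_timeout=10s;
}

server {
    listen 80;
    location / {
        proxy_pass http://varnish_cluster;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # If one Varnish errors out, retry the request on the other.
        proxy_next_upstream error timeout http_502 http_503;
    }
}
```

With this kind of setup each Varnish keeps its own independent cache, which is exactly the trade-off discussed above: the caches are not shared, but each stays warm enough to absorb a failover.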
_______________________________________________
varnish-misc mailing list
[email protected]
https://www.varnish-cache.org/lists/mailman/listinfo/varnish-misc
