Can I ask, why do you need to start with a warm cache right away? Sure, it will
lower the number of requests to the origin, but you could implement a secondary
caching layer if you wanted to (using nginx), so you’d have your primary cache
in, let’s say, 10 locations spread across 3 continents (US,
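For context, the secondary tier described above is usually built by pointing the edge nginx at an intermediate caching nginx instead of at the origin directly. A minimal sketch, with the hostname, paths, cache sizes, and TTLs all placeholders rather than anything from this thread:

```nginx
# Edge tier: cache locally, forward misses to a mid-tier nginx
# (itself caching) rather than to the origin.
proxy_cache_path /var/cache/nginx/edge levels=1:2 keys_zone=edge:10m
                 max_size=30g inactive=7d use_temp_path=off;

server {
    listen 80;

    location / {
        proxy_cache edge;
        proxy_cache_valid 200 301 302 1h;
        # Collapse concurrent misses for the same URL into one
        # upstream request.
        proxy_cache_lock on;
        # Placeholder hostname for the mid-tier cache.
        proxy_pass http://mid-tier.example.com;
    }
}
```

With a setup like this, a cold edge node only sends each miss as far as the mid tier, so the origin sees few extra requests even without pre-warming.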
Hi Maxim,
Thank you for this. Opened my eyes.
Not to sound demanding, but do you have any examples (code) of proxy_store
being used as a CDN? What’s most important to me is the initial cache
warming. I should be able to start a new machine with 30 GB of cache vs. a
cold start.
Thanks
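Not from the thread itself, but the canonical proxy_store pattern (as documented for nginx) looks roughly like the sketch below, with the origin hostname and paths as placeholders. Because proxy_store writes plain files mirroring the request URI tree, pre-seeding a new node is just a matter of copying that directory over before the node goes live:

```nginx
# Serve from the local mirror if the file exists; otherwise fetch
# from the origin and store a copy on disk.
server {
    listen 80;
    root /var/www/mirror;

    location / {
        try_files $uri @fetch;
    }

    location @fetch {
        # Placeholder origin hostname.
        proxy_pass http://origin.example.com;
        proxy_store on;
        proxy_store_access user:rw group:r all:r;
        # Stored files land at root + URI, so /images/a.png is saved
        # as /var/www/mirror/images/a.png.
        root /var/www/mirror;
    }
}
```

Note that proxy_store is a mirror, not a cache: nginx never expires or revalidates the stored files, so cleanup and freshness have to be handled externally (e.g. a cron job pruning old files).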
Hi Lucas,
The cache is pretty big and I want to limit unnecessary requests if I can.
Cloudflare is in front of my machines and I pay for load balancing,
firewall, Argo among others. So there is a cost per request.
Admittedly, I have a not-so-complex cache architecture, i.e. all cache
machines
Quintin,
Are most of your requests for dynamic or static content?
Are the requests clustered such that there are a lot of requests for a few
(between 5 and 200, say) URLs?
If three different people make the same request, do they get personalized or
identical content returned?
How long are the cached
Hello!
On Tue, Sep 11, 2018 at 04:45:42PM -0700, Quintin Par wrote:
> I run a mini CDN for a static site by having Nginx cache machines (in
> different locations) in front of the origin and load balanced by Cloudflare.
>
> Periodically I run rsync pull to update the cache on each of these
>
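A minimal sketch of that kind of periodic pull, assuming the proxy_store layout from earlier in the thread; the hostname and paths are placeholders:

```shell
# Pull the warmed file tree from an existing cache node before adding
# this node to the load-balancer rotation. -a preserves permissions
# and timestamps, -z compresses over the wire, --delete removes files
# the source node has pruned.
rsync -az --delete \
    existing-node.example.com:/var/www/mirror/ \
    /var/www/mirror/
```

This works cleanly with proxy_store because the on-disk files are ordinary static files; syncing a proxy_cache directory is also possible, but only if both nodes share the same cache key configuration, since entries are stored under hashes of the key.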
Hello!
On Tue, Sep 11, 2018 at 12:28:52PM -0700, Dk Jack wrote:
> On your last suggestion...
>
> "In the request body filter you have to either return a fatal error, or
> raise some internal flag for your processing, and then
> act on this flag after the body is fully read."
>
> How can I be
Hello!
On Tue, Sep 11, 2018 at 10:00:05PM +0300, Gena Makhomed wrote:
> On 11.09.2018 18:59, Maxim Dounin wrote:
>
> > The best option would be to make OpenSSL able to verify
> > OCSP responses not with the full certificate chain all the way up to the
> > trusted root, but exactly the way that