It is fairly simple to hack nginx and use Lua to reload the cache on a timer
or via a request.
The code is already there, it's just a matter of calling it again.
Posted at Nginx Forum:
https://forum.nginx.org/read.php?2,281179,281225#msg-281225
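For what it's worth, a refresh-via-request can also be sketched with stock nginx and no Lua at all, using proxy_cache_bypass: a bypassed request skips the cache lookup, but the fresh origin response is still stored and replaces the stale entry. A minimal sketch (the header name, zone name, and upstream are hypothetical):

```nginx
proxy_cache_path /var/cache/nginx keys_zone=my_cache:10m;

server {
    location / {
        proxy_cache my_cache;
        # A request carrying the (hypothetical) X-Refresh header skips
        # the cache lookup; the fresh response from the origin is still
        # written to the cache, replacing the old entry.
        proxy_cache_bypass $http_x_refresh;
        proxy_pass http://origin;
    }
}
```

In practice you would restrict who may send that header, otherwise anyone can force origin fetches.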
One more approach is to not change the contents of resources without also
changing their name. One example would be the cache_key feature in Rails, where
resources have a path based on some ID and their updated_at value. Whenever you
modify a resource it automatically expires.
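The versioned-name idea can be sketched outside Rails too. Below is a minimal Python version (the helper name and path scheme are hypothetical illustrations, not Rails' actual implementation): the URL embeds a digest of the record's updated_at, so any modification yields a new URL and the old cached copy is simply never requested again.

```python
import hashlib

def cache_key(resource_id: int, updated_at: str) -> str:
    """Build a cache-busting path for a resource (hypothetical helper).

    The path changes whenever updated_at changes, so stale cached
    copies on any edge expire by never being requested again.
    """
    raw = f"resources/{resource_id}-{updated_at}"
    digest = hashlib.md5(raw.encode()).hexdigest()[:8]
    return f"/resources/{resource_id}-{digest}"

# Same id + same timestamp -> same URL; touching the record -> new URL.
old = cache_key(42, "2018-09-11T16:45:42Z")
new = cache_key(42, "2018-09-12T12:41:15Z")
```

With this scheme the edges never need purging for correctness; you only purge to reclaim disk.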
> How does one ensure cache consistency on all edges?
I wouldn't - you can never really rely on anything cached being consistent;
there will always be stuff that doesn't follow the standards and can thus
give an inconsistent state for one or more users.
What I'd do would simply be to purge
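A purge endpoint can be sketched like this, assuming the third-party ngx_cache_purge module is compiled in (stock open-source nginx has no purge directive; the zone name is hypothetical, and the purge key must match your proxy_cache_key):

```nginx
# Sketch, assuming the third-party ngx_cache_purge module.
# "my_cache" is a hypothetical proxy_cache_path zone name.
location ~ ^/purge(/.*) {
    allow 127.0.0.1;   # only trusted callers may purge
    deny  all;
    proxy_cache_purge my_cache "$scheme$request_method$host$1";
}
```

To purge all edges you would hit this endpoint once per edge machine after updating the origin.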
Hi Lucas,
Thank you for this. Gems all over. I didn't know curl had --resolve.
This is more of a generic question: How does one ensure cache consistency on
all edges? Do people resort to a combination of expiry + background update
+ stale responding? What if one edge and the origin was updated
Hello!
On Wed, Sep 12, 2018 at 12:41:15PM -0700, Quintin Par wrote:
> Not to sound demanding, but do you have any examples (code) of proxy_store
> being used as a CDN? What's most important to me is the initial cache
> warming. I should be able to start a new machine with 30 GB of cache vs. a
>
> The cache is pretty big and I want to limit unnecessary requests if I can.
30 GB of cache and ~400k hits isn't a lot.
> Cloudflare is in front of my machines and I pay for load balancing, firewall,
> Argo among others. So there is a cost per request.
Doesn’t matter if you pay for load
Hi Peter,
Here are my stats for this week: https://imgur.com/a/JloZ37h . The Bypass
is only because I was experimenting with some cache warmer scripts. This is
primarily a static website.
Here’s my URL hit distribution: https://imgur.com/a/DRJUjPc
If three people are making the same request,
Quintin,
Are most of your requests for dynamic or static content?
Are the requests clustered such that there are a lot of requests for a few
(between 5 and 200, say) URLs?
If three different people make the same request, do they get personalized or
identical content returned?
How long are the cached
Hi Lucas,
The cache is pretty big and I want to limit unnecessary requests if I can.
Cloudflare is in front of my machines and I pay for load balancing,
firewall, Argo among others. So there is a cost per request.
Admittedly I have a not-so-complex cache architecture, i.e. all cache
machines
Can I ask, why do you need to start with a warm cache directly? Sure, it will
lower the requests to the origin, but you could implement a secondary caching
layer if you wanted to (using nginx), so you'd have your primary cache in,
let's say, 10 locations, spread across 3 continents (US,
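Such a secondary layer is just one nginx cache proxying to another. A rough sketch (hostnames, paths, and zone names are hypothetical):

```nginx
# Edge tier: cache misses go to a mid-tier cache, not to the origin.
proxy_cache_path /var/cache/nginx/edge keys_zone=edge:10m;

server {
    listen 80;
    location / {
        proxy_cache edge;
        proxy_pass http://mid-tier.example.com;  # hypothetical mid-tier host
    }
}

# The mid tier runs the same configuration pointing at the real origin,
# so each object is fetched from the origin roughly once per continent
# instead of once per edge.
```

A new edge then warms itself from the nearby mid tier rather than hammering the origin.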
Hi Maxim,
Thank you for this. Opened my eyes.
Not to sound demanding, but do you have any examples (code) of proxy_store
being used as a CDN? What's most important to me is the initial cache
warming. I should be able to start a new machine with 30 GB of cache vs. a
cold start.
Thanks
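The proxy_store pattern from the nginx documentation looks roughly like this (the root path and origin name are hypothetical). Because proxy_store leaves plain files on disk mirroring the URL space, a new machine can be pre-warmed with a straight rsync of the directory, which is exactly the warm-start use case:

```nginx
location / {
    root /data/store;
    # Serve the stored copy if present, otherwise fetch and save it.
    try_files $uri @fetch;
}

location @fetch {
    proxy_pass http://origin;         # hypothetical upstream
    proxy_store /data/store$uri;      # save the response as a plain file
    proxy_store_access user:rw group:rw all:r;
}
```

Note proxy_store is a mirror, not a cache: there is no expiry, so you manage deletion and revalidation yourself.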
Hello!
On Tue, Sep 11, 2018 at 04:45:42PM -0700, Quintin Par wrote:
> I run a mini CDN for a static site by having Nginx cache machines (in
> different locations) in front of the origin and load balanced by Cloudflare.
>
> Periodically I run rsync pull to update the cache on each of these
>
I run a mini CDN for a static site by having Nginx cache machines (in
different locations) in front of the origin and load balanced by Cloudflare.
Periodically I run rsync pull to update the cache on each of these
machines. Works well, except that I realized I need to restart Nginx and
reload