On Wednesday, April 19, 2017 at 9:28:12 AM UTC-3, Niphlod wrote:
>
> It'd be easier to see if it works with a slightly modified directive for 
> uwsgi_cache_valid
>
> uwsgi_cache_valid any      1m
>
>
> Because if you miss the 3 parameter notation, only 200 responses are 
> cached.
>

Yes, I forgot to mention that I had already tried that, without success.
 

> Then you can try returning a proper X-Accel-Expires that should "trump" any 
> other header.
>
> uwsgi_cache is an application cache (read: you could use it instead of 
> redis if your whole app is deployed through that nginx process)
>
> If you're not looking for an IPC cache that is uwsgi_cache (that's 
> basically what cache.ram does for single processes), maybe it'd be better 
> to use the upstream proxy cache via proxy_cache and proxy_cache_path 
> directive.
>
> BTW:  $upstream_cache_status that you use shouldn't be used to see if 
> uwsgi_cache is used at all. It's the "flag" that the upstream cache is 
> working (proxy_cache directive) 
>


Right now I am using Redis to cache HTML responses. Since I'm running several 
uwsgi workers, with Redis the same cache is shared among all the processes 
(correct me if I'm wrong, but I think that's how it works). 
Actually, I remember that I previously tried cache.ram, but it didn't work as 
I expected; or better said, each process had its own cache and they couldn't 
share it, which is why I moved to Redis.
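To illustrate what I mean about per-process caches, here is a toy Python sketch (plain Python, not web2py code; the Manager dict just stands in for Redis):

```python
# Toy illustration of why an in-process cache like cache.ram is not
# shared between uwsgi workers: each worker process has its own memory,
# while an external store (here a multiprocessing.Manager dict, standing
# in for Redis) is shared.
import multiprocessing


def worker(local_cache, shared_cache):
    # This runs in a separate process, like a second uwsgi worker.
    local_cache['page'] = '<html>...</html>'   # stays in the child only
    shared_cache['page'] = '<html>...</html>'  # visible to every process


def demo():
    local_cache = {}                 # like cache.ram: per-process
    manager = multiprocessing.Manager()
    shared_cache = manager.dict()    # like Redis: shared between processes
    p = multiprocessing.Process(target=worker,
                                args=(local_cache, shared_cache))
    p.start()
    p.join()
    # The parent never sees the child's write to the plain dict,
    # but it does see the write to the shared store.
    return 'page' in local_cache, 'page' in shared_cache


if __name__ == '__main__':
    print(demo())  # (False, True)
```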

When you say "... an IPC cache that is uwsgi_cache ...", do you mean that 
uwsgi_cache wouldn't be suitable for caching across processes? I got lost in 
translation there. 
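If I understood the proxy_cache suggestion, it would mean exposing the app to nginx over HTTP instead of the uwsgi socket, with something roughly like this (untested guess; the port is hypothetical):

```nginx
# Hypothetical sketch of the proxy_cache alternative: nginx talks HTTP to
# the app (here assumed on 127.0.0.1:8000) and caches at the proxy layer.
proxy_cache_path /tmp/nginx_cache levels=1:2 keys_zone=mycache:10m 
max_size=10g inactive=10m use_temp_path=off;

server {
    ...

    location / {
        proxy_cache        mycache;
        proxy_cache_valid  any 1m;
        proxy_cache_key    $request_uri;
        add_header         X-Cache $upstream_cache_status;

        proxy_pass         http://127.0.0.1:8000;
    }
}
```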
Anyway, I must admit that I'm way beyond my limits with this. I'm having a 
hard time understanding it all, so I think I'll have to hire someone who 
knows better than me :)
I've been looking for a person to work with; my first goal was to find 
someone here in my city, but it's a small city and there is no one here 
working with these technologies.

I don't want to bother here with things that aren't related to web2py.
Thank you very much for your time!! I'll keep digging.
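Before I give up, I'll try the X-Accel-Expires suggestion. A self-contained sketch of the controller change (the StubResponse class only stands in for web2py's real global `response` so the snippet runs on its own):

```python
from datetime import datetime


class StubResponse:
    """Stand-in for web2py's global `response`, only so this sketch runs
    outside web2py; in a real controller the framework provides it."""
    def __init__(self):
        self.headers = {}


response = StubResponse()


def test():
    # X-Accel-Expires (value in seconds) tells nginx how long its own
    # cache may keep this response; nginx consumes the header instead of
    # forwarding it to clients, so it should override Expires/Cache-Control
    # as far as nginx's uwsgi_cache is concerned.
    response.headers['X-Accel-Expires'] = '900'  # 15 minutes, hypothetical
    return datetime.now().strftime('%H:%M:%S')
```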

Best regards,
Lisandro.

 

> On Monday, April 17, 2017 at 2:15:21 PM UTC+2, Lisandro wrote:
>>
>> I've been dealing with this problem for some time now.
>> Some weeks ago I posted a question on Stack Overflow, but I haven't found 
>> a solution yet:
>>
>> http://stackoverflow.com/questions/43052276/nginx-why-isnt-uwsgi-cache-working-with-this-headers
>>
>> I've also been working with a sysop who knows more than I do, but we 
>> still couldn't solve the problem. 
>>
>> I don't think web2py has anything to do with it, so I'll mark this thread 
>> as "no action required".
>> Still, any comment or suggestion will be appreciated.
>>
>> Best regards,
>> Lisandro
>>
>> On Friday, March 24, 2017 at 10:00:06 AM UTC-3, Lisandro wrote:
>>>
>>> I'm running a web2py website with public articles, and there are 
>>> occasional peaks in traffic. 
>>> I use Nginx as a webserver, and uWsgi to run my web2py application.
>>>
>>> Considering the articles are public (the HTML page of an article is the 
>>> same for every visitor), I'm already doing some caching in order to improve 
>>> performance (I'm using @cache.action decorator with Redis model).
>>> However, and please correct me if I'm wrong, for every request made to 
>>> the URL of an article, the models need to be executed before the cached 
>>> response can be served.
>>> So I thought I could improve performance even more, caching the HTML 
>>> directly from Nginx, that way I would save resources in my server.
>>>
>>> However I'm having a hard time getting it, and I wanted to know if I 
>>> should modify the response.headers.
>>> I've read that they come set by default:
>>> http://web2py.com/books/default/chapter/29/04/the-core#response
>>>
>>> To do some tests, I have this simple web2py function:
>>>
>>> def test():
>>>     from datetime import datetime
>>>
>>>     return datetime.now().strftime('%H:%M:%S')
>>>
>>>
>>>
>>> On the nginx side, the server block configuration is this:
>>>
>>> uwsgi_cache_path /tmp/nginx_cache/ levels=1:2 keys_zone=mycache:10m 
>>> max_size=10g inactive=10m use_temp_path=off;
>>>
>>> server {
>>>     ...
>>>
>>>     location / {
>>>         # response header to check if cache is a HIT or a MISS
>>>         add_header              X-uWSGI-Cache $upstream_cache_status;
>>>
>>>         # server cache
>>>         uwsgi_cache  mycache;
>>>         uwsgi_cache_valid  15m;
>>>         uwsgi_cache_key  $request_uri;
>>>
>>>         # client cache
>>>         expires 3m;
>>>
>>>         uwsgi_pass      unix:///tmp/web2py.socket;
>>>         include         uwsgi_params;
>>>         uwsgi_param     UWSGI_SCHEME $scheme;
>>>     }
>>> }
>>>
>>>
>>> But every time I hit the test page, I check the response headers and I 
>>> always see a MISS.
>>> In other words, nginx still sends the requests to uwsgi, and the page is 
>>> generated in every request.
>>> I've found this forum post where someone says this:
>>>
>>> *"...it looks to me like the issue is that the upstream server is just 
>>> not sending response that contain an expiration date (Expires:) or a cache 
>>> validator (for instance, Last-Modified:). (The cookie expiration time has 
>>> nothing to do with caching.)*
>>> *The HTTP 1.1 spec 
>>> <http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.4> says: 'If 
>>> there is neither a cache validator nor an explicit expiration time 
>>> associated with a response, we do not expect it to be cached, but certain 
>>> caches MAY violate this expectation (for example, when little or no network 
>>> connectivity is available).'"*
>>>
>>>
>>> So I thought I would still need to use the @cache.action decorator 
>>> (with cache_model=None, so that it only sets the response headers that 
>>> allow client caching):
>>>
>>> @cache.action(time_expire=222, cache_model=None, session=False, vars=
>>> False, public=True)
>>> def test():
>>>     from datetime import datetime
>>>
>>>     return datetime.now().strftime('%H:%M:%S')
>>>
>>>
>>> However I still can't get it to work.
>>> I set time_expire=222 to check whether the "expires 3m;" directive in 
>>> nginx's configuration would override it, and it does: the responses have 
>>> Cache-Control: max-age=180 (that is, 3 minutes, not 222 seconds).
>>>
>>> *I don't intend to turn this into a discussion of nginx configuration, 
>>> but I'm tempted to ask: am I missing something on web2py's side?* 
>>> Do I need to modify response.headers in another way to let nginx cache 
>>> the response from uwsgi?
>>>
>>>
>>>

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.
