Hi Willy and William,

I ran some tests with the cache filter.

In http_action_store_cache(), the code indicates that only HTTP/1.1 is cached. This explains why my first tests with apachebench failed (ab sends HTTP/1.0 requests) :) The protocol version is checked on the request side. Can't we rely on the response side instead?

By the way, here are some first results for those who are interested:

* WITHOUT cache

- ab -n100000 -c100 http://localhost/image.gif
Time taken for tests:   59.127 seconds
Requests per second:    1691.27 [#/sec] (mean)
Time per request:       59.127 [ms] (mean)
Time per request:       0.591 [ms] (mean, across all concurrent requests)
Transfer rate:          1344.43 [Kbytes/sec] received

- h2load -n100000 -c100 https://localhost/image.gif
finished in 60.06s, 1664.91 req/s, 1.11MB/s

* WITH cache

- ab -n100000 -c100 http://localhost/image.gif
Same results as before with ab, but once patched to rely on the response, we get much more interesting results:
Time taken for tests:   1.801 seconds
Requests per second:    55539.79 [#/sec] (mean)
Time per request:       1.801 [ms] (mean)
Time per request:       0.018 [ms] (mean, across all concurrent requests)
Transfer rate:          44149.79 [Kbytes/sec] received

- h2load -n100000 -c100 https://localhost/image.gif
finished in 1.49s, 67210.04 req/s, 44.80MB/s

Some details:
- image.gif = 510 bytes
- haproxy runs locally (the backend is in the same network) with this configuration:
    global
        nbthread 4
        tune.ssl.default-dh-param 2048
        log /dev/log local7 info err
        stats socket /tmp/proxy.socket level admin

    defaults
        timeout client 300s
        timeout server 300s
        timeout connect 5s
        timeout http-keep-alive 5s

    listen proxy
        mode http

        bind :80
        bind :443 ssl crt localhost.pem alpn h2,http/1.1

        option httplog

        http-response cache-store proxy
        http-request cache-use proxy

        server real backend:80

    cache proxy
        total-max-size 4


--
Cyril Bonté
