* Daniel Stenberg <[email protected]> [2018-02-15 22:55]:
> On Thu, 15 Feb 2018, Guido Berhoerster wrote:
>
>> That happens with the curl tool, but using the attached test case it'll read
>> headers for up to 191K and pass them to the header function in 16K chunks:
>>
>> $ ./http-server.sh &
>> [1] 14366
>> $ ./header-write-test
>
> Strange. It doesn't for me:
>
> $ sh http-server.sh &
> [1] 19515
> $ ./header-write-test
> GET / HTTP/1.1
> Host: 127.0.0.1:8000
> Accept: */*
>
> received 17 bytes
> debugit: curl_easy_perform: Out of memory
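
For reference, a header-callback test of the kind discussed above looks
roughly like the sketch below. This is not the attached test case; the
URL matches the transcript, while the callback body and the error
reporting are assumptions.

#include <stdio.h>
#include <curl/curl.h>

/* Invoked by libcurl for each piece of header data; report its size. */
static size_t header_cb(char *buffer, size_t size, size_t nitems,
                        void *userdata)
{
    (void)buffer;
    (void)userdata;
    printf("received %zu bytes\n", size * nitems);
    return size * nitems; /* tell libcurl all data was consumed */
}

int main(void)
{
    CURL *curl;
    CURLcode res;

    curl_global_init(CURL_GLOBAL_DEFAULT);
    curl = curl_easy_init();
    if (!curl)
        return 1;

    /* 127.0.0.1:8000 is the test server from the transcript above */
    curl_easy_setopt(curl, CURLOPT_URL, "http://127.0.0.1:8000");
    curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, header_cb);

    res = curl_easy_perform(curl);
    if (res != CURLE_OK)
        fprintf(stderr, "curl_easy_perform: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}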
Strange, indeed. I just downloaded and built 7.58.0 from source and
tried again, and I can still reproduce it; this is on Ubuntu
17.10/amd64.

> BTW, I also created an issue out of the first bug:
> https://github.com/curl/curl/issues/2314 and I submitted a PR for a new test
> that verifies that curl detects "too long" HTTP headers:
> https://github.com/curl/curl/pull/2315

OK, thanks for the quick response!
--
Guido Berhoerster
-------------------------------------------------------------------
Unsubscribe: https://cool.haxx.se/list/listinfo/curl-library
Etiquette:   https://curl.haxx.se/mail/etiquette.html
