Just changed my workspace_client from 4k to 256k. The problem immediately went away.
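(For anyone finding this thread in the archives: the parameter can be changed at runtime through the management CLI, or set persistently when starting varnishd. A quick sketch; 256k is just the value that worked for me, your mileage will vary:)

    # change at runtime via the management CLI, no restart needed
    varnishadm param.set workspace_client 256k

    # or set it persistently on the varnishd command line
    varnishd ... -p workspace_client=256k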
I might have to tune this value to find the smallest possible value I
could use.

Thanks!

On Fri, Oct 16, 2015 at 10:24 AM, Hugues Alary <[email protected]> wrote:

> Hi Geoff,
>
> It was indeed your thread.
>
> I've been tracking so many bugs recently that I forgot that you had
> mentioned the problem was coming from your configuration and not varnish.
>
> I'm glad you saw my email on the list and answered; I'm just going to
> increase my workspace_client.
>
> I'll report back if that solved my issue.
>
> Thanks for the help!
> -Hugues
>
> On Thu, Oct 15, 2015 at 11:26 PM, Geoff Simmons <[email protected]> wrote:
>
>> On 10/14/2015 10:57 PM, Hugues Alary wrote:
>> >
>> > - Error out of workspace
>> > - LostHeader X-Forwarded-For:
>> > - ReqUnset X-Forwarded-For: 80.68.74.90
>> > - ReqUnset X-Forwarded-For: 162.158.92.219
>> > - Error out of workspace
>> >
>> > It seems that varnish is running out of workspace. Some googling
>> > tells me it could be related to ESI includes. It so happens that I
>> > did change something recently with ESI includes, and I happen to
>> > have pages loading something like 100 ESI includes (maybe even
>> > more).
>> >
>> > I read that someone else using 4.0.3 was running out of workspace,
>> > despite setting the value as high as 16MB. Maybe I should tune my
>> > workspace, but the bug report stated that raising the value to 16MB
>> > ended up eating all the memory on the system and thus wasn't
>> > really an option.
>>
>> It sounds like you were looking at my thread from a while back. As it
>> turned out, the problem was due to our own error in VCL, as I'll
>> explain below; unless you're making the same mistake, you're
>> probably not having the same problem.
>>
>> At any rate, a request and all of its ESI subrequests use the same
>> workspace, so if you really have up to 100 ESIs, your workspace might
>> simply not be large enough; you really should try increasing
>> workspace_client.
>>
>> Our mistake was that, under certain error conditions, a request would
>> restart to another URL that shows a custom error message, which also
>> has ESI includes. The ESIs within the restart would encounter the same
>> error, then restart to the same URL, and so on into endless recursion.
>> max_esi_depth didn't stop the madness soon enough, because the ESI
>> tree was expanding in breadth as well as depth.
>>
>> Getting VCL to notice this was happening (by checking req.esi_level)
>> solved the problem, and we could set the workspace sizes back down to
>> their previous values.
>>
>> If you don't have something crazy going on like that, then as I said,
>> you might just have the straightforward problem that your workspaces
>> are too small for the 100 ESIs.
>>
>> HTH,
>> Geoff
>>
>> --
>> UPLEX - Nils Goroll Systemoptimierung
>>
>> Scheffelstraße 32
>> 22301 Hamburg
>>
>> Tel +49 40 2880 5731
>> Mob +49 176 636 90917
>> Fax +49 40 42949753
>>
>> http://uplex.de
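PS, for the archives: here is a minimal VCL 4.0 sketch of the kind of guard Geoff describes above. Only req.esi_level is the point; the "/error" URL and the backend are hypothetical placeholders, not anyone's actual config:

    vcl 4.0;

    # hypothetical backend, only here so the sketch compiles
    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    sub vcl_recv {
        # If a request for the (ESI-built) error page arrives at
        # esi_level > 0, we are inside an ESI subrequest that has
        # restarted into the error page again: cut the recursion
        # short with a plain synthetic response.
        if (req.url == "/error" && req.esi_level > 0) {
            return (synth(503, "ESI error-page recursion stopped"));
        }
    }

The idea is that VCL notices the recursion before max_esi_depth has to, which is what allowed the workspace sizes to go back down to their previous values.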
_______________________________________________
varnish-misc mailing list
[email protected]
https://www.varnish-cache.org/lists/mailman/listinfo/varnish-misc
