Re: squid process dies when it reaches a size of 1GB.

2006-07-19 Thread Janne Johansson
Joe Gibbens wrote:
> Thanks for the reply Janne. So my only way to run a process over 1GB
> in size is a custom kernel? Is there an easier way to run a large
> cache with a process size over 1GB?
Yes, as of now, on i386. You can do other things as well, like bumping cachepct to ~12 with
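For readers hitting the same wall: the "custom kernel" route Janne mentions would mean raising the compile-time data-segment ceiling. A rough sketch of such a kernel config, assuming OpenBSD 3.9/i386 conventions (the option name MAXDSIZ is real, but the exact value syntax and a workable ceiling should be checked against options(4) for your release):

```
# Custom kernel config sketch (assumption: OpenBSD 3.9 / i386).
# Start from GENERIC and raise the per-process data segment ceiling,
# then rebuild and install the kernel.
include "arch/i386/conf/GENERIC"

# 1.5GB instead of the default 1GB; how high you can go on i386 is
# constrained by the userland address-space layout.
option  MAXDSIZ="(1536*1024*1024)"
```

Note that login.conf datasize limits can only lower the effective limit below MAXDSIZ, never raise it above.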

Re: Process dies when it reaches a size of 1GB.

2006-07-18 Thread Janne Johansson
Joe Gibbens wrote:
> I'm running squid-transparent on 3.9, and the process dies every time
> it reaches 1GB.
> FATAL: xcalloc: Unable to allocate 1 blocks of 4108 bytes!
> The system has 2GB ram
> # ulimit -aH
> time(cpu-seconds)    unlimited
> file(blocks)         unlimited
> coredump(blocks)     unlimited

Re: squid process dies when it reaches a size of 1GB.

2006-07-18 Thread Joe Gibbens
Thanks for the reply, Janne. So my only way to run a process over 1GB in size is a custom kernel? Is there an easier way to run a large cache with a process size over 1GB? I can re-configure the memory usage, but it would be nice to be able to utilize more of my physical memory without having to
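The re-configuration Joe alludes to would mean keeping squid's data segment under the 1GB cap rather than raising it. A minimal squid.conf sketch along those lines, assuming squid 2.5-era directives (all directive names are real squid options; the specific values and the cache_dir path are illustrative, not recommendations):

```
# squid.conf sketch: keep the process below the 1GB data limit.
cache_mem 256 MB                  # memory for hot objects; NOT the total footprint
cache_dir ufs /var/squid/cache 8000 16 256   # 8GB disk cache; its index still costs RAM
maximum_object_size_in_memory 64 KB          # keep large objects out of cache_mem
memory_pools off                  # release freed memory back to the system
```

Remember that cache_mem is only one component of the process size; the in-core index for a large cache_dir grows with the number of objects on disk, so shrinking the disk cache also shrinks the process.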

Process dies when it reaches a size of 1GB.

2006-07-17 Thread Joe Gibbens
I'm running squid-transparent on 3.9, and the process dies every time it reaches 1GB.
FATAL: xcalloc: Unable to allocate 1 blocks of 4108 bytes!
The system has 2GB ram
# ulimit -aH
time(cpu-seconds)    unlimited
file(blocks)         unlimited
coredump(blocks)     unlimited
data(kbytes)
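The truncated data(kbytes) line is the relevant one: xcalloc fails when the process hits the data-size resource limit, not when physical RAM runs out. On OpenBSD that limit comes from the login class of the user running squid. A login.conf sketch, assuming the daemon class (the datasize-cur/datasize-max keywords are real login.conf(5) capabilities; the values are illustrative, and the kernel's MAXDSIZ still caps the effective hard limit regardless of what is set here):

```
# /etc/login.conf sketch: data-size limits for the daemon class.
daemon:\
        :datasize-cur=768M:\
        :datasize-max=infinity:\
        :tc=default:
```

After editing, rebuild the capability database if your system uses one (cap_mkdb /etc/login.conf) and restart squid so it picks up the new limits.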