On Tue, Dec 6, 2011 at 9:23 PM, Glenn Fowler <[email protected]> wrote:
>
> thanks for the feedback on the topic of sfgetr() line limits
>
> it was a few years ago when we saw bad behavior on one of our systems
> at this point I don't recall which one it was
>
> unix already has mechanisms to handle resource hogs:
> ulimit(2), setrlimit(2) and the ulimit(1) shell builtin
>
> if we remove the line limit then sfgetr() on a big enough line
> will get an sbrk() or mmap() error when trying to grow the line buffer
> and that error will make its way to a diagnostic at the command level
>
> ulimit -M (unlimited by default) can be used to limit the
> memory allocations in a process, including memory used by sfgetr()
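
For reference, a rough command-level sketch of what the quoted message
describes; the file name and the limit are made up, -M is the ulimit
flag mentioned above (units per ulimit(1) on the system at hand), and
depending on the build the shell may print its own out-of-space
diagnostic instead of the one below:

    (
        # cap the process memory in a subshell so the limit does not
        # leak into the interactive session
        ulimit -M 65536

        # big.dat stands in for data whose first record is huge; with
        # no artificial line limit the read either succeeds or fails
        # once the record buffer can no longer grow
        read -r line < big.dat ||
            print -u2 'read failed: record buffer could not grow'
    )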

Has this limit been removed in the meantime? The days of big data are
here (1PB is now "common") and such limits will cause trouble. Sooner.
Than. You. Think. (I'm now cursed myself with hunting down a problem with
read -C, which started to choke after a dataset grew beyond 64GB. I still
haven't figured out why this happens.)
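
For anyone who hasn't used it, read -C reads its standard input as a
ksh compound variable assignment; the snippet below only shows the
shape of such a call, with made-up names and a tiny value, not the
script that is actually choking:

    # made-up variable and fields; the real data set is vastly larger
    print -r -- '( name=sample count=42 )' | read -C rec &&
        print -r -- "got: ${rec.name} ${rec.count}"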

Irek
