getline() returns an arbitrarily long line. In theory _megabytes_ long, which it mallocs. Could this be a denial of service attack? If you do "grep blah /dev/zero" it produces an endless stream of input that never contains a newline, so getline() just keeps mallocing until it runs out of memory, thrashing as it copies the existing data into a longer and longer buffer and evicting everything else (even if you don't have swap, it'll thrash for a while evicting and then faulting back in executable pages from shared libraries and ELF executables)...
The thing is, I dunno what to _do_ about that. Arbitrary line length limits could screw up real world uses. (I think practically we've got about 2 gigabyte line limits because I tend to use ints to index stuff.) It's possible that various libc implementations will have their own internal limits...

(Requiring input to be mmap()able isn't necessarily an improvement, because then you can't operate on a pipe, and "diff -u <(sort file1) <(sort file2)" is something I do reasonably often, _and_ because on 32 bit systems it limits the size of files it can deal with to the userspace virtual address range, which is maybe 2 gigabytes.)

Yeah, this is why the kernel has an out of memory killer, but if anybody has ideas for _not_ hitting that which don't impact potential legitimate uses, I'm open to suggestions...

Rob

_______________________________________________
Toybox mailing list
[email protected]
http://lists.landley.net/listinfo.cgi/toybox-landley.net
