Configuration Information [Automatically generated, do not change]:
Machine: x86_64
OS: linux-gnu
Compiler: gcc
Compilation CFLAGS: -DPROGRAM='bash' -DCONF_HOSTTYPE='x86_64' -DCONF_OSTYPE='linux-gnu' -DCONF_MACHTYPE='x86_64-redhat-linux-gnu' -DCONF_VENDOR='redhat' -DLOCALEDIR='/usr/share/locale' -DPACKAGE='bash' -DSHELL -DHAVE_CONFIG_H -I. -I. -I./include -I./lib -D_GNU_SOURCE -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic
uname output: Linux pmpc983.npm.ac.uk 2.6.24.4-64.fc8 #1 SMP Sat Mar 29 09:15:49 EDT 2008 x86_64 x86_64 x86_64 GNU/Linux
Machine Type: x86_64-redhat-linux-gnu
Bash Version: 3.2
Patch Level: 33
Release Status: release

Description:
Running echo `cat ...` on a large binary file uses a lot of memory (expected, since the whole file is read into the command substitution), but pressing Ctrl-C while the command is running does not terminate it, and the memory is not released when the command eventually returns. Originally encountered after mistyping a sed command while renaming a group of files; the bug is reproducible with sed as well as cat.

Repeat-By:
Happens every time:
1. Find a large binary data file for the test (mine is ~3.2GB).
2. Run echo `cat filename`.
3. Press Ctrl-C while the command from step 2 is running; it does not terminate.
4. When step 2 eventually returns, the memory is not released.
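The Repeat-By steps above can be sketched as a script. The path /tmp/bigfile and the 8 MB size are assumptions for a quick demo (the report uses a ~3.2GB file, where the memory effect is far more pronounced); the Ctrl-C step is interactive, so it is noted in comments rather than automated, and `set -f` is a safety tweak for random data, not part of the original report.

```shell
# Create a test binary file. 8 MB keeps this demo quick; the reported
# behavior was observed with a ~3.2GB file.
dd if=/dev/urandom of=/tmp/bigfile bs=1M count=8 2>/dev/null

# Safety tweak for this demo only: disable globbing so random bytes in
# the substituted text are not treated as filename patterns.
set -f

# Step 2: bash reads the entire file into memory for the backquote
# command substitution (stripping NUL bytes) before word-splitting it
# into arguments for the echo builtin.
echo `cat /tmp/bigfile` > /dev/null

set +f

# Step 3 (interactive): press Ctrl-C while the command above is running;
# the reported bug is that it does not terminate.
# Step 4: when it does return, inspect the shell's resident set size,
# which the report says stays inflated:
ps -o rss= -p $$
```

With a file of the reported size, step 2 alone can take minutes and several gigabytes of RAM, so the small size here is deliberate.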