Hi,
When I pipe the output of zcat into csplit, it fails with 'csplit:
memory exhausted'. The .gz file is a mysqldump of all databases (one of
which is particularly large):
mysqldump --opt --add-locks --add-drop-table -a --user=foo -p
--all-databases | gzip > ./mysql/backup.gz
When I uncompress the file first and then run csplit on it, it works
perfectly. However, these files get rather big (8 GB+), so I want to
process them through a pipe. When I do, csplit fails:
dav...@xinker:~/scripts/csplittest/20101109/mysql$ zcat backup.gz |
csplit -f db_ -n 4 - /--\ Current\ Database/ {*}
753
74459
75736
306159
121390
8215
10744920
4984
56142
6113058
788008
67549
513413
216776062
67525
csplit: memory exhausted
2081734963
dav...@xinker:~/scripts/csplittest/20101109/mysql$ ls -ltr
total 836324
-rw-r--r-- 1 davidh davidh 855553350 2010-11-09 01:35 backup.gz
dav...@xinker:~/scripts/csplittest/20101109/mysql$ zcat backup.gz >
/dev/null
dav...@xinker:~/scripts/csplittest/20101109/mysql$
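In case it helps to reproduce or work around the problem: the split can
also be done with awk, which streams line by line instead of buffering a
whole section. This is only a sketch; the marker line and the sample
here-doc input are illustrative (in real use the input would come from
"zcat backup.gz |"), and the db_NNNN file names just mimic csplit -n 4.

```shell
# Split a mysqldump-style stream at "-- Current Database" markers,
# writing each line straight to the current output file (no buffering).
awk '
  BEGIN { out = sprintf("db_%04d", n++) }     # preamble before first marker
  /^-- Current Database/ {                    # new section starts here
    close(out)                                # close previous output file
    out = sprintf("db_%04d", n++)             # next output file name
  }
  { print > out }                             # stream line to current file
' <<'EOF'
-- MySQL dump preamble
-- Current Database: `db1`
CREATE TABLE t1 (id INT);
-- Current Database: `db2`
CREATE TABLE t2 (id INT);
EOF
```

Like csplit, this keeps the matching marker line at the start of each new
output file; memory use stays constant regardless of section size.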
Environments:
Ubuntu 8.04; Linux xinker 2.6.27-17-generic #1 SMP Fri Mar 12 03:09:00
UTC 2010 i686 GNU/Linux:
dav...@xinker:~/scripts/csplittest$ aptitude show coreutils
Package: coreutils
Version: 6.10-6ubuntu1
Debian Lenny: Linux test 2.6.26-1-686 #1 SMP Fri Mar 13 18:08:45 UTC
2009 i686 GNU/Linux;
test:~# aptitude show coreutils
Package: coreutils
Version: 6.10-6
To find the problem I wanted to look at a core dump, but I can't seem to
get one, even with ulimit -c set to unlimited!? Am I missing something
simple here, or is this a bug? Is zcat too hungry?
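For completeness, these are the settings I would check when no core file
appears (a sketch; the paths are the standard Linux ones, nothing
specific to this machine):

```shell
# The soft core-file size limit for the current shell; 0 suppresses cores.
ulimit -c
# Raise it in the same shell that will run the pipeline.
ulimit -c unlimited
# The kernel may rename or redirect core files; inspect the pattern.
cat /proc/sys/kernel/core_pattern
# Unless core_pattern says otherwise, the core is written to the working
# directory of the crashing process, which for a pipeline element like
# csplit is the directory the pipeline was started from.
```

Note that 'memory exhausted' is a normal error exit, not a crash, so
csplit would only dump core if killed by a signal.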
I would expect the failing command to be very memory-friendly, so I'm a
bit puzzled. Thanks in advance,
Kind regards,
David Hofstee
Software Engineer Blinker B.V.