Re: Processing a big file using more CPUs

2019-02-12 Thread Shlomi Fish
On Mon, 11 Feb 2019 23:54:43 +0100 Ole Tange wrote:
> On Mon, Feb 4, 2019 at 10:19 PM Nio Wiklund wrote:
>> cat bigfile | parallel --pipe --recend '' -k gzip -9 > bigfile.gz
>> The reason why I want this is that I often create compressed images of
>> the content of a drive, …

Re: Processing a big file using more CPUs

2019-02-11 Thread Nio Wiklund
On 2019-02-11 at 23:54, Ole Tange wrote:
> On Mon, Feb 4, 2019 at 10:19 PM Nio Wiklund wrote:
>> cat bigfile | parallel --pipe --recend '' -k gzip -9 > bigfile.gz
>> The reason why I want this is that I often create compressed images of
>> the content of a drive, /dev/sdx, and I lose …

Re: Processing a big file using more CPUs

2019-02-11 Thread Ole Tange
On Mon, Feb 4, 2019 at 10:19 PM Nio Wiklund wrote:
> cat bigfile | parallel --pipe --recend '' -k gzip -9 > bigfile.gz
> The reason why I want this is that I often create compressed images of
> the content of a drive, /dev/sdx, and I lose approximately half the
> compression improvement …
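The use case quoted here is imaging a whole drive rather than a regular file. A minimal sketch of that variant, assuming the drive appears as /dev/sdx as in the quote, that plain cat run as root is used to read the device, and that the output filename is just an illustrative choice:

  # Read the raw block device and compress it block-wise on all CPUs,
  # keeping the blocks in order (-k) so the image can be restored later.
  sudo cat /dev/sdx | parallel --pipe --recend '' -k gzip -9 > sdx.img.gz

  # Restoring the image later (illustrative, and destructive to the
  # target device, so double-check the device name first):
  zcat sdx.img.gz | sudo dd of=/dev/sdx bs=1M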

Processing a big file using more CPUs

2019-02-04 Thread Nio Wiklund
Hi parallel users,

Background

EXAMPLE: Processing a big file using more CPUs

To process a big file or some output you can use --pipe to split up the data into blocks and pipe the blocks into the processing program. If the program is gzip -9 you can do:

cat bigfile | parallel --pipe --recend '' -k gzip -9 > bigfile.gz
…
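For reference, a minimal sketch of the technique being discussed. The command itself is the one quoted throughout the thread; the verification step and the file names are illustrative additions, not part of the original post:

  # Split stdin into blocks, compress each block with gzip -9 on a
  # separate CPU, and keep the output in input order (-k). The result is
  # a concatenation of gzip members, which is itself a valid gzip file.
  cat bigfile | parallel --pipe --recend '' -k gzip -9 > bigfile.gz

  # Ordinary zcat/gunzip reads multi-member gzip files, so the output can
  # be decompressed and compared against the original:
  zcat bigfile.gz | cmp - bigfile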