--- Paul Eggert <[EMAIL PROTECTED]> wrote:

> Andrew Klaassen <[EMAIL PROTECTED]> writes:
> 
> > I've run into a problem running 'tar cvvWf' if
> > archive members are larger than 2GB.
> 
> Does the problem occur if any members are larger
> than 2GB, or only in some cases?

It seems to happen every time - at least it did until I
tried the test case you sent below.

> I couldn't reproduce the problem with a simple test
> case, on Debian stable with tar 1.16.1 that I
> compiled.  Here's what I did.  What happens when
> you try the same thing with your 'tar'?
> 
> $ echo foo | dd bs=1 seek=2G of=big
> 4+0 records in
> 4+0 records out
> 4 bytes (4 B) copied, 0.00105214 s, 3.8 kB/s
> $ ls -l big
> -rw-r--r-- 1 eggert eggert 2147483652 May  8 09:42 big
> $ tar cvvWf tar big
> -rw-r--r-- eggert/eggert 2147483652 2007-05-08 09:42 big
> Verify -rw-r--r-- eggert/eggert 2147483652 2007-05-08 09:42 big
> $ rm big tar

I get a similarly flawless result with that test.

I've been working all day to find a reproducible way
to trigger the problem without having to send you a
multi-gigabyte file.  I've found a couple of things:

1. The problem seems to be with 4GB+ files, not 2GB+
files as I thought earlier (see the boundary-straddling
sketch after item 2).

2. It's less than optimal, since it means writing out 5GB
of random data, but here's a way to reproduce the problem
consistently (a faster sparse-file variant is sketched
below):

# dd if=/dev/urandom of=random.bigfile bs=1k count=5M
# tar cvvWf test.tar random.bigfile

That should trigger it.
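
If generating 5GB of random data is too slow on your end,
a sparse file just past the boundary might also do the
job - that's just your test from above with the seek
bumped from 2G to 4G. I haven't verified that a sparse
file triggers it, though, since for all I know the bug
depends on the actual on-disk data rather than just the
reported size:

$ echo foo | dd bs=1 seek=4G of=big.sparse
$ tar cvvWf test.tar big.sparse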
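
It might also be worth pinning down whether the threshold
is exactly 4GB - that's 2^32 bytes, which would smell
like a 32-bit size field overflowing somewhere. A pair of
files straddling the boundary should tell us. Again just
a sketch (the file names are made up, and it assumes
sparse files trigger the bug at all):

$ echo foo | dd bs=1 seek=$((4*1024*1024*1024 - 16)) of=just.under
$ echo foo | dd bs=1 seek=4G of=just.over
$ tar cvvWf under.tar just.under
$ tar cvvWf over.tar just.over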

Andrew
