I'm not following you. It sounds like you're saying that you used some
version of tar which was not able to tar/untar a .tgz file? That is not the
case here.

I have tested (Solaris) tar to make sure that the problem isn't with it. To
reiterate: nightly we run Solaris tar on a branch of the filesystem being backed
up, then we use /bin/compress to compress the .tar file.
I have successfully uncompressed and untarred this tarball, to make sure it isn't
getting corrupted.
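For what it's worth, the round-trip check I run looks roughly like this. The paths are made up for illustration, and gzip stands in for /bin/compress so it runs anywhere (compress would produce a .Z file instead of .gz, but the cycle is the same):

```shell
set -e
# Stand-in for the filesystem branch that gets backed up nightly.
mkdir -p /tmp/treecheck/src
echo "sample data" > /tmp/treecheck/src/file.txt

# Nightly step: tar the branch, then compress the archive.
tar -cf /tmp/treecheck/branch.tar -C /tmp/treecheck src
gzip -f /tmp/treecheck/branch.tar        # yields branch.tar.gz

# Verification step: uncompress and untar into a scratch dir, then compare.
mkdir -p /tmp/treecheck/restore
gzip -dc /tmp/treecheck/branch.tar.gz | tar -xf - -C /tmp/treecheck/restore
diff -r /tmp/treecheck/src /tmp/treecheck/restore/src && echo "archive OK"
```

If the diff comes back clean, the tarball itself isn't corrupted before Amanda ever touches it.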

Actually, interestingly enough, when I used gnutar to tar/untar this tarball last
night, it untarred successfully. Perhaps there is a problem only because our
Amanda backup uses particular gnutar options (like incremental?)
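If the incremental option is the suspect, a quick A/B test along these lines might show it. The paths are placeholders, and a small stand-in file replaces the real >2 gig tarball; --listed-incremental is the GNU tar mode Amanda's gnutar dumps use:

```shell
set -e
# Stand-in directory with a dummy file in place of the >2 gig .Z tarball.
mkdir -p /tmp/inccheck/data
head -c 1048576 /dev/zero > /tmp/inccheck/data/big.Z   # 1 MB stand-in

# A: plain archive of the directory.
tar -cf /tmp/inccheck/plain.tar -C /tmp/inccheck data

# B: level-0 incremental archive of the same directory
# (no existing snapshot file, so this is a full dump).
rm -f /tmp/inccheck/snapshot
tar --listed-incremental=/tmp/inccheck/snapshot \
    -cf /tmp/inccheck/incr.tar -C /tmp/inccheck data

# Compare what each archive recorded for the file.
tar -tvf /tmp/inccheck/plain.tar
tar -tvf /tmp/inccheck/incr.tar
```

With the real file, comparing the listed sizes from both archives would show whether only the incremental path truncates it.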

Chris Dahn wrote:

> On Friday 26 October 2001 02:17 am, sleonard wrote:
> > we are having a problem with large (2+ gig) .Z compressed tarballs
> > (made w/ the Solaris compress utility) getting truncated/corrupted when
> > they are backed up w/ amanda (using gnutar).
> >
> > Conveniently found this out when we attempted a restore.
> >
> > What happens is that if one of these large compressed files is
> > encountered by the amanda backup, it silently stops backing up
> > anything on that filesystem at that point, and if the file is
> > restored, it is a truncated/corrupted version of the original file.
> > gnutar has no problem backing up/restoring larger (3.9 gig) tarballs
> > which are not compressed.
> >
> > we will attempt a workaround by using a new version of gzip, but I am
> > curious as to whether this has been seen before, and whether it is a
> > gnutar bug, and if so, where I should report it.  I am currently
> > running a test of gnutar w/o amanda on this file, but if it works
> > properly I am left scratching my head as to whether the problem is with
> > amanda or gnutar.
>
>   On a whim, I checked out a CD that I had made a while back of a user's home
> directory.  Among other things, the directory contained a 98MB gzipped tar
> file.  I looked at the gzipped tar file I made of the directory, and everything
> was fine until it got to this 98MB file.  At that point, tar apparently
> stopped reading from the directory, closed off the stream, and stopped
> writing to the file.  I don't believe it gave an error message, since I
> didn't even know this had happened.  Therefore, it looks like this is a tar
> problem, not an amanda problem.
>
> --
>
> <->Software Engineering Research Group<->
> Feel the SERG!
> http://serg.mcs.drexel.edu/
> CAT 186, The Microwave
> http://pgp.mit.edu:11371/pks/lookup?search=Christopher+Dahn&op=index

--
----
MHO
---
shanna leonard
arizona health sciences library
626-2923
----------------------------------
