Using GNU tar.  This happens both if I pipe the output of 
BackupPC_tarCreate directly to tar and if I untar from the file.  More 
specifically, the tar command I am using is "tar -xf - -C MYDIRECTORY".  
No compression, and the archive stays on the same server.
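For anyone following along, here is a minimal round trip using that exact extraction command (the directories are illustrative, and a plain tar stands in for BackupPC_tarCreate). With an intact archive this restores every file; "tar: Skipping to next header" instead means tar hit bytes it could not parse as a valid header, which points at the archive itself being damaged:

```shell
#!/bin/sh
set -e

# Scratch directories (illustrative paths, created fresh here).
workdir=$(mktemp -d)
mkdir -p "$workdir/src" "$workdir/dest"
echo "hello" > "$workdir/src/file.txt"

# Pack, then feed the archive to tar on stdin with -xf - -C,
# exactly the shape of the failing command.
tar -cf - -C "$workdir/src" . | tar -xf - -C "$workdir/dest"

# With a valid archive the extracted file matches the original.
cmp "$workdir/src/file.txt" "$workdir/dest/file.txt" && echo "round trip ok"

rm -rf "$workdir"
```

If this round trip works on the same machine but the BackupPC_tarCreate archive does not, the corruption is happening at archive creation time rather than during extraction.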

Stephen

On 08/11/2010 12:33 PM, John Rouillard wrote:
> On Wed, Aug 11, 2010 at 10:23:59AM -0500, Stephen Gelman wrote:
>> I am running BackupPC 3.1.0 on Nexenta.  It seems to be working for the
>> most part.  I am having a problem with BackupPC_tarCreate.  I am trying
>> to create a tar of a 30gb backup.  The tar I create ends up being 30gb,
>> but when extracted it only takes up 5gb and is missing a lot of files.
>> I can restore the missing files using the web interface, so I know that
>> they are being backed up and that BackupPC has permission to access
>> them.  Does anyone have any idea what's going on?  The only clue I have
>> is that I repeatedly get "tar: Skipping to next header" when untarring
>> the file.
> Which tar are you using to do the restore: native Solaris /usr/bin/tar
> (or /usr/sbin/static/tar), GNU tar, pax? How are you supplying the 30
> GB file to the restoring tar: stdin, as a file on the command line ...?
>
> Do you have any compression in the picture? Also are you moving
> between architectures or little to big endian machines?
>

------------------------------------------------------------------------------
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/