[BackupPC-users] rsync exclusion list - apply to multiple shares

2016-09-19 Thread cardiganimpatience
Just following up on this because I got a very useful reply from Holger, who explained that a variable can be used to hold a list of excludes, but noted that doing so breaks the ability to use the GUI: if the GUI is used to edit a host's config after manually setting a variable, all changes will be lost.
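
For reference, a minimal sketch of the hand-edit Holger described, as it might look in a host's config.pl (the share names and patterns are hypothetical); the CGI editor rewrites this file, which is why hand-set variables don't survive it:

    # shared exclude list held in an ordinary Perl variable
    my @commonExcludes = ('*.tmp', '/cache');
    $Conf{BackupFilesExclude} = {
        '/home' => [ @commonExcludes ],
        '/var'  => [ @commonExcludes, '/log' ],
    };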

[BackupPC-users] rsync exclusion list - apply to multiple shares

2016-08-19 Thread cardiganimpatience
I was under the incorrect assumption that specifying an asterisk would apply the exclusion list to all ShareNames, until I struggled to make it work and re-read the tutorial [1], which specifies that the asterisk "means it applies to all shares that don't have a specific entry".
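
A sketch of that behaviour (hypothetical shares and patterns): the '*' entry is a fallback rather than a global addition, so a share with its own entry ignores it entirely:

    $Conf{BackupFilesExclude} = {
        '*'     => ['/tmp', '/cache'],   # used only by shares with no entry of their own
        '/home' => ['/*/.cache'],        # '/home' gets just this list, not '*' plus this
    };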

[BackupPC-users] three hundred thousand directories 'created' during backup

2016-08-04 Thread cardiganimpatience
On 04 Aug 2016 14:04, Adam Goryachev wrote: I can't comment on the rest, but directory entries are always created, because BackupPC needs them for the backup structure (there is no disk saving for them, since directories can't be hard linked)... Makes perfect sense! Thanks for the response, Adam.
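
A rough sketch of the structure Adam refers to, as it appears in a BackupPC 3.x pc tree (host and paths hypothetical): every source directory has to be made fresh under each backup number, while file contents are pooled via hardlinks:

    pc/fileserver/123/f%2fhome/                    # each source directory becomes a real directory
    pc/fileserver/123/f%2fhome/attrib              # per-directory metadata file
    pc/fileserver/123/f%2fhome/falice/fnotes.txt   # file: a hardlink into the pool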

[BackupPC-users] three hundred thousand directories 'created' during backup

2016-08-04 Thread cardiganimpatience
Backups are taking about three hours for a particular fileserver, and records indicate that over 300k new directories are being created on every run. I opened the XferLOG in a browser and searched for the phrase "create d", which matches every newly created directory. The count was 348k matches. But
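
For anyone counting the same way from the command line rather than a browser, a minimal Perl sketch (the BackupPC_zcat and log paths are assumptions for a typical install; XferLOG files are compressed, so they need decompressing first):

    # count newly created directories in one transfer log
    open(my $log, '-|', '/usr/share/backuppc/bin/BackupPC_zcat',
         '/var/lib/backuppc/pc/fileserver/XferLOG.123.z') or die "zcat: $!";
    my $created = grep { /^\s*create d/ } <$log>;
    print "$created directories created\n";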

[BackupPC-users] merge incrementals with full

2016-07-15 Thread cardiganimpatience
To soften the impact of backing up hundreds of gigabytes in a short time, I added several larger folders to the exclude list for a share. Then each day I would remove a couple from the exclude list to bring more data into the backup. What's happening is that backup #12 is trying to go back to
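
In config terms the staging described above would look something like this (share and folder names hypothetical), with one entry dropped each day:

    # day 1: hold the large folders back
    $Conf{BackupFilesExclude} = { '/data' => ['/big1', '/big2', '/big3'] };
    # day 2: remove an entry so /big3 starts being backed up
    $Conf{BackupFilesExclude} = { '/data' => ['/big1', '/big2'] };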

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-06-10 Thread cardiganimpatience
On 08 Jun 2016 20:32, Juergen Harms wrote: I have added a corresponding issue at GitHub - I am not sure whether the transfer of access rights to the content at SourceForge has completed successfully. That ensures your remark is not lost.

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-06-08 Thread cardiganimpatience
Giving this another go and crossing my fingers. I'm using tar to seed data from a local file-level backup into the BackupPC pool. I realized that I had removed some, but not all, of the plus signs from the tar command lines. This is mentioned several times in emails (eg:
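
The plus signs in question: in BackupPC command templates a trailing '+' (e.g. $shareName+) substitutes the shell-escaped form of a value, which is right when the command passes through ssh and a shell but wrong for a locally executed command. A sketch of the difference, using the poster's local path layout:

    # remote form (roughly the shipped default): runs through ssh and a
    # shell, so the '+' (shell-escaped) variants are correct
    # $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host'
    #                     . ' env LC_ALL=C $tarPath -c -v -f - -C $shareName+ --totals';
    # local seeding form: no shell involved, so the '+' suffixes must go
    $Conf{TarClientCmd} = '/usr/bin/sudo $tarPath -cvf -'
                        . ' -C /home/backup/$host/cur$shareName . --totals';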

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-06-06 Thread cardiganimpatience
On 06 Jun 2016 21:06, Carl Wilhelm wrote: A negative result is still a result. Thanks for reporting your result. Good point. I'm glad to contribute to the collective knowledge :)

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-06-06 Thread cardiganimpatience
Well that was a huge waste of time. Attempting to import the backups from the local store using this TarClientCmd: /usr/bin/sudo $tarPath -cvf - -C /home/backup/$host/cur$shareName . --totals did appear to import all the files, but when I switched back to rsync and scheduled another backup

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-06-02 Thread cardiganimpatience
Just following up on my efforts to import a host's existing file-level backups, located under /home/backup/hostname/cur/ on the BackupPC server, into the BackupPC pool located in /home/BackupPC/pool. I think I can do what I need without additional scripts or having to tar.gz the current files.

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-05-17 Thread cardiganimpatience
Johan Ehnberg wrote: To do it directly from directories, you could try changing 'zcat' to 'tar --strip-components=X' and pointing it at the directory. Set X to the number of path components that should be left out of the archive so that the structure matches that of the actual host to
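
Concretely, in this seeding scheme TarClientCmd only has to emit a tar stream on stdout, so the two variants look roughly like this (archive and directory paths are hypothetical). One caveat: GNU tar's --strip-components applies when reading an archive, not when creating one, so the directory variant relies on -C to drop the leading path instead:

    # archive variant (as on the blog): stream an existing tarball
    $Conf{TarClientCmd} = 'zcat /home/backup/$host/cur.tar.gz';
    # directory variant: build the stream on the fly; -C makes the entry
    # paths match those of the live host
    $Conf{TarClientCmd} = '$tarPath -c -f - -C /home/backup/$host/cur$shareName .';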

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-05-17 Thread cardiganimpatience
, is to ensure that the paths that you get in BackupPC from the seeding match those that you get when backing up the actual host. I updated the script's documentation with the help of your experiences. Thanks! Best regards, Johan On 2016-05-14 00:03, cardiganimpatience wrote

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-05-13 Thread cardiganimpatience
method here: http://johan.ehnberg.net/backuppc-pre-loading-seeding-and-migrating-moving-script/ You may be able to use tar with --strip-components to work around the extra tar paths on the fly. Good luck! Johan On 2016-05-06 17:19, cardiganimpatience wrote: BackupPC is installed and working

[BackupPC-users] Migrate local data into BackupPC pool for remote client

2016-05-06 Thread cardiganimpatience
BackupPC is installed and working great for new hosts. Is there a way to take the hundreds of GB of old-host data that already exist on the backup server and import them into the BackupPC storage pool? The old backup system uses rsync to dump all files to a local disk on the same server where BackupPC