Re: [BackupPC-users] Exclude by file size

2015-05-11 Thread Holger Parplies
Hi,

Marios Zindilis wrote on 2015-05-11 19:02:36 +0300 [Re: [BackupPC-users] 
Exclude by file size]:
 You can -most probably- do that with the --max-size option of rsync,

I would tend to agree. "Most probably" because the BackupPC side of the
transfer uses the Perl module File::RsyncP rather than native rsync, and this
module doesn't implement all valid rsync options. From what the man page says,
the --max-size option would actually need to be implemented by File::RsyncP
(rather than by the remote native rsync), so you'd need to test whether it is.
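
If it does turn out to work, passing the option would be a matter of adding it
to the rsync argument list in the configuration. An untested sketch, assuming a
per-host pc/<host>.pl that is read after the main config (so the default
arguments are already in place); the 500M limit is only an example:

```perl
# Untested sketch for a per-host config file (pc/<host>.pl).
# --max-size only takes effect if File::RsyncP actually honours it.
push @{$Conf{RsyncArgs}}, '--max-size=500M';  # example limit; adjust to taste
```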

Of course, if that doesn't work, you always have the option of excluding the
files you are having problems with by name (BackupFilesExclude). I realize
that this doesn't automatically adapt to new large files appearing, which will
cause the same problem all over again, but it would be a way to get your
backup running again. The question to think about is: do you really want
backups of these files, or don't you? A size limit is not an answer to this
question; it's a workaround for an unsuitable network link.
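
For illustration, such an exclude might look like this in config.pl (the share
name and paths here are made up; use the actual problem files):

```perl
# Untested sketch: exclude the known large files by name.
# The hash key is the share/mount point; the paths are hypothetical.
$Conf{BackupFilesExclude} = {
    '/' => [
        '/var/lib/mysql/bigdump.sql',
        '/home/*/Videos',
    ],
};
```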

This seems to be another instance of a common problem: how do I handle backup
volume that [initially] won't complete over the limited bandwidth of the
network link? The part you'd need to solve yourself in any case would be to
get copies of the files in question to somewhere on your BackupPC server -
assuming you want them in your backup. Most likely solutions would be
sneakernet or some manual invocations of 'rsync -z ...'.
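
A manual invocation might look roughly like this; the host name and paths are
placeholders, --partial lets an interrupted transfer resume where it left off,
and --bwlimit keeps the link usable for other traffic:

```shell
# Untested sketch: pre-seed the large files onto the BackupPC server.
# 'client' and both paths are placeholders for your own setup.
rsync -az --partial --bwlimit=800 \
    client:/data/bigfiles/ \
    /var/lib/backuppc/seed/bigfiles/
```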
I'll try to find some time to work on integrating those copies into an
existing backup, so the next backup could use them as reference and avoid the
network transfer or limit it to the changes in the files.

Hope that helps.

Regards,
Holger

--
One dashboard for servers and applications across Physical-Virtual-Cloud 
Widest out-of-the-box monitoring support with 50+ applications
Performance metrics, stats and reports that give you Actionable Insights
Deep dive visibility with transaction tracing using APM Insight.
http://ad.doubleclick.net/ddm/clk/290420510;117567292;y
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] BackupPC fails to start in daemon mode on Ubuntu 14.04

2015-05-11 Thread Stoyan Stoyanov
Hello,

After upgrading from Ubuntu 12.04 (backuppc 3.2.1-2ubuntu1.1) to Ubuntu
14.04 (backuppc 3.3.0-1ubuntu1), BackupPC no longer starts in daemon mode.
It runs fine in the foreground, though. I tried a fresh server install and got
the same result. The child dies immediately after being forked, exiting when
the parent process exits. Here's what it looks like if I add 'sleep 60;' right
after the child process is forked and before the parent exits with
'exit if ($pid);':

backuppc 16074 15421  0 15:55 pts/30   00:00:00 /usr/bin/perl -w
/usr/share/backuppc/bin/BackupPC -d
backuppc 16076 16074  0 15:55 pts/30   00:00:00 [BackupPC] <defunct>

The same piece of code works fine when extracted and run by itself. Is anyone
running BackupPC on Ubuntu 14.04 seeing the same problem?
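
The fork-and-exit sequence in question is roughly this (a simplified sketch,
not the exact BackupPC source):

```perl
# Simplified sketch of the daemonizing sequence described above;
# not the exact BackupPC source.
use POSIX qw(setsid);

my $pid = fork();
die "fork failed: $!" if !defined $pid;
exit if $pid;      # parent exits -- reportedly the child dies here as well
setsid();          # child becomes session leader, detaches from the terminal
# ... child carries on as the daemon ...
```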

Thanks,
Stoyan
--


Re: [BackupPC-users] Exclude by file size

2015-05-11 Thread Marios Zindilis
You can -most probably- do that with the --max-size option of rsync, check
this page for hints:
http://serverfault.com/questions/105206/rsync-exclude-files-that-are-over-a-certain-size

You will need to pass that as an additional parameter to the backup command.
If you can't figure out how to do that, please ask again.

On Mon, May 11, 2015 at 5:34 PM, supp...@vip-consult.co.uk wrote:

   Hi all

 I have a problem with a remote machine which has a few large files (2-3 GB)
 over a 10 Mb link. The transfer never completes, and the next day it tries
 again. It hangs on this transfer and never completes the rest of the backup.

 Is there a way to force backing up these large files last, or failing that,
 to set up some global exclude rule not to back up files larger than a
 certain size?

 Kind regards,
 Chris






-- 
Marios Zindilis