Transitioning from BPC 3.x to 4.x there seem to be some syntactic changes
regarding rsync & ssh commands. I was able to follow documentation that I think
lived on SourceForge where it was suggested to configure a non-root account for
ssh'ing and running rsync under sudo. After some
I am new to git-based installations and need some help. I cloned the three
projects and I cd into the backuppc folder to run perl configure.pl and it
replies with:
[Cent-7:root@hostname backuppc]# perl configure.pl
You need to run makeDist first to create a tarball release, and that requires
backuppc-xs to be built and installed first. Also, makeDist needs a
--version argument.
The wiki has an example
script<https://github.com/backuppc/backuppc/wiki/Installing-BackupPC-4-from-git-on-Ubuntu-Xenial-16.04-LTS>
for building from git.
Craig
On Tue, Aug 7, 2018 at 10:26 PM Mike Hughes
Hi BackupPC users,
Just curious if anyone else has made the changes necessary to use a non-root
user account in the 4.X versions and run into any difficulty with incomplete
backups?
Thank you!
From: Mike Hughes
Sent: Friday, August 10, 2018 14:39
To: backuppc-users@lists.sourceforge.net
Mystery solved. The defaults included --one-file-system. Removed that, and all
partitions are now being backed up.
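For anyone else hitting this, a per-host override along these lines drops the flag. This is a sketch: the argument list below is abbreviated and illustrative, not the full v4 default set.

```perl
# Illustrative config.pl override (list abbreviated): the usual rsync
# arguments, minus --one-file-system, so mounts on other partitions are
# crossed and backed up.
$Conf{RsyncArgs} = [
    '--super', '--recursive', '--protect-args', '--numeric-ids',
    '--perms', '--owner', '--group', '-D', '--times', '--links',
    '--hard-links', '--delete', '--partial',
    # '--one-file-system',   # removed
];
```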
From: Mike Hughes
Sent: Monday, August 13, 2018 08:54
To: 'General list for user discussion, questions and support'
Subject: RE: RsyncClientCmd --> RsyncSshArgs
Hi BackupPC us
>> Look in the release page for each project to download
>> the latest tarballs. That will allow you to skip the first few steps.
>> But, yes, you still need a working compiler environment to build rsync-bpc
>> and backuppc-xs.
>>
>> Craig
>>
>> On Wed, Aug
It’s a cloud service so I’m less concerned with the results of the thrashing
From: Greg Harris
Sent: Tuesday, August 14, 2018 08:53
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] Which file system for data pool?
You are probably already aware of
>-----Original Message-----
>From: Johan Ehnberg
>Sent: Tuesday, August 14, 2018 07:29
>To: backuppc-users@lists.sourceforge.net
>Subject: Re: [BackupPC-users] Which file system for data pool?
>
>
>On 08/14/2018 02:44 PM, Tapio Lehtonen wrote:
>> I'm building a BackupPC host, with two SSD disks
get the C headers
etc. Many linux libraries have two packages - the runtime library, and another
(generally with "devel" in the name) for compiling and linking code.
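That runtime/devel split can be checked from the shell; a small sketch (the header names here are illustrative, not rsync-bpc's exact dependency list):

```shell
# Runtime vs -devel split: the runtime package ships the shared library
# (e.g. libz.so.*), while the -devel package ships the C headers
# (e.g. /usr/include/zlib.h) needed to compile and link against it.
for hdr in zlib.h popt.h; do
    if [ -e "/usr/include/$hdr" ]; then
        echo "$hdr: found"
    else
        echo "$hdr: missing - install the matching -devel package"
    fi
done
```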
Craig
On Wed, Aug 8, 2018 at 10:19 AM Mike Hughes
<m...@visionary.com> wrote:
Thanks for the reply.
I think I've discovered a new level of failure. It started off with these
errors when attempting to rsync larger files:
rsync_bpc: failed to open
"/home/localuser/mysql/hostname-srv-sql.our_database.sql", continuing: No space
left on device (28)
rsync_bpc: mkstemp
>-----Original Message-----
>From: Johan Ehnberg
>Sent: Friday, August 24, 2018 09:25
>To: backuppc-users@lists.sourceforge.net
>Subject: Re: [BackupPC-users] ver 4.x split using ssd and hdd storage - size
>requirements?
>
>On 08/24/2018 04:52 PM, Mike Hughes
er is on a large enough partition.
From: Mike Hughes
Sent: Friday, August 24, 2018 09:56
To: backuppc-users@lists.sourceforge.net
Subject: RE: [BackupPC-users] ver 4.x split using ssd and hdd storage - size
requirements?
>-----Original Message-----
>From: Johan Ehnberg mailto:jo...@moln
I'm seeing out-of-storage errors from rsync_bpc when transferring large (4 GB)
files:
rsync_bpc: failed to open
"/home/localuser/mysql/hostname-srv-sql.our_database.sql", continuing: No space
left on device (28)
The volume supporting the 'pc' folder has 6 GB free and lives on an SSD under
Hi BackupPC users,
Similar questions have come up a few times but I have not found anything
relating to running multiple pools. Here's our setup:
- On-prem dev servers backed up locally to BackupPC (4.x)
- Prod servers backed up in the cloud to a separate BackupPC (4.x) instance
I'd like to
a remote copy of the cpool,
pc and conf directories, to a place that BackupPC doesn't back up.
Craig
On Thu, Oct 11, 2018 at 10:22 AM Mike Hughes
<m...@visionary.com> wrote:
Hi BackupPC users,
Similar questions have come up a few times but I have not found anything
relating to r
t 8:52 PM, Mike Hughes wrote:
>
> Another related question: Does it make sense to use rsync's compression when
> transferring cpool? If that data is already compressed, am I gaining much by
> having rsync try to compress it again?
> Thanks!
> From: Mike Hughes
> Sent: Friday,
A host was created to duplicate the cpool from another BackupPC server. It was
set to skip compression and successfully filled the uncompressed pool with
~300GB of data. On the advice of others, I decided to use rsync in a cronjob
instead, so my intent was to delete this host and its data.
Another related question: Does it make sense to use rsync's compression when
transferring cpool? If that data is already compressed, am I gaining much by
having rsync try to compress it again?
Thanks!
From: Mike Hughes
Sent: Friday, October 12, 2018 8:25 AM
On Mon, Oct 22, 2018 at 1:21 PM Mike Hughes
<m...@visionary.com> wrote:
A host was created to duplicate the cpool from another BackupPC server. It was
set to skip compression and successfully filled the uncompressed pool with
~300GB of data. On the advice of others, I decided
>Jamie Burchell wrote on 2018-10-30 09:31:13 - [[BackupPC-users] BackupPC
>administrative attention needed email incorrect?]:
>> [...]
>> Yesterday, I received the following email from the BackupPC process:
>> [...]
>> > Yesterday 156 hosts were skipped because the file system containing
>> >
Hi Daniel,
It sounds like the exclusion rules aren’t working as you expect. If they were I
don’t think you’d see the errors even if the pool files had a problem since
they’d never be compared. I ran into problems when I set up mine too. If I
recall my confusion was around identifying the share
Hi Steve,
It looks like they are stored using reverse deltas. Maybe you’ve already seen
this from the V4.0 documentation:
* Backups are stored as "reverse deltas" - the most recent backup is always
filled and older backups are reconstituted by merging all the deltas starting
with the
Yeah Ed, sorry this isn't clearer on the main page but Craig discourages
building from source. If you're on Cent/RHEL, use this repo instead:
https://copr.fedorainfracloud.org/coprs/hobbes1069/BackupPC/
From: Ed Burgstaler
Sent: Monday, November 26, 2018 12:38
To:
Michael,
Condescending and belittling treatment of others in this list is not the norm.
Your personal attacks are unwarranted and unhelpful.
Since you reached out and asked for help in relating to others I will share
with you this piece from Dr. Phil Agre which helped me immensely when I was
>From: Karol Jędrzejczyk
>I'm using BackupPC to back up a bunch of servers only. In this scenario
>any transfer error is critical.
Hi Karol,
I like to receive reports of any failures so I install BackupPC_report [1] on
the BackupPC server then I explore the logs from any that error out. I call
On Wed, 2019-06-05 at 11:51 -0600, David Wynn via BackupPC-users wrote:
> debug1: Sending command: 192.168.1.6 rsync --server --sender
> -slHogDtprcxe.iLsfxC
> sh: 192.168.1.6: not found
This line tells us that your shell (sh) is trying to run the command:
"192.168.1.6". That looks like an IP
Working to add a few Windows clients to our BackupPC system. I have
passwordless ssh working and when I try kicking off a backup using the GUI it
fails immediately.
Logs show:
2019-08-15 11:09:29 full backup started for directory /cygdrive/c
2019-08-15 11:09:31 Got fatal error during xfer (No
I use a DumpPreUserCmd to kick off a database dump script. This script used to
log its actions within each server's BackupPC log so I could verify which
databases and tables were dumped, which were skipped, etc.
It looks like a change was made which stops these logs from being written to
the
need to
confirm that the backup will hard-fail on an error from my user cmd, which is
required for me to be aware that a database or table dump was not successful.
On Thu, 2019-08-15 at 14:54 -0500, Mike Hughes wrote:
I use a DumpPreUserCmd to kick off a database dump script. This script used
On Wed, 2019-09-04 at 23:00 -0700, Craig Barratt via BackupPC-users wrote:
In addition to the higher log level, it would be helpful to see the rsync
command being run. Is there anything in the XferLOG file?
Craig
On Wed, Sep 4, 2019 at 6:44 PM Michael Huntley
<mich...@huntley.net>
No responses. Too much detail? Let me rephrase it:
Windows rsync backup no worky!
plz halp!!!
:-D
On Thu, 2019-08-15 at 14:44 -0500, Mike Hughes wrote:
Working to add a few Windows clients to our BackupPC system. I have
passwordless ssh working and when I try kicking off a backup using the GUI
This seems like a dumb problem to have but I'm struggling to figure out how to
tell BackupPC to ssh into the cygwin client as a local user.
The backuppc account was created locally on the machine to be backed up, thus
it does not exist in AD. I am able to log in using other AD accounts with no
Hi BackupPC users,
I added a new host and it is encountering thousands of errors and won't
finish a backup. Instead it is dumping errors such as:
file has vanished: "/etc/iscsi/initiatorname.iscsi"
rsync_bpc: fstat ... No such file or directory (2)
rsync_bpc: stat ... No such file or directory
On Thu, 2019-09-19 at 15:20 +0100, Cogumelos Maravilha wrote:
> There's only one problem: the /home folders on some servers aren't
> getting rsynced.
...
> --one-file-system
Have you verified that the systems which are not working as expected
don't have separate partitions for /home?
No, that is the default setting in BPC. So if your /home is on a separate
partition you either need to remove that setting, or add the /home partition as
a backup target in addition to /.
Which option is best is up to you.
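In config terms, that second option is just (share names illustrative):

```perl
# Keep --one-file-system in the rsync arguments, but list /home as its
# own share so the separate partition is still backed up:
$Conf{RsyncShareName} = ['/', '/home'];
```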
On Sep 29, 2019 06:27, Bob Wooden wrote:
Thanks, Michael.
" is on /dev/md2. Both on Linux
> (Ubuntu 18.04LTS) mdadm arrays.
>
> Am I wrong? Doesn't "include" override any "exclude" settings?
>
>
>
> On 9/29/19 9:02 AM, Mike Hughes wrote:
> > No, that is the default setting in BPC. So if your /home i
it helps.
>
> Kind regards,
>
> Jamie
> --
> From: Mike Hughes [mailto:m...@visionary.com]
> Sent: 30 September 2019 11:54
> To: General list for user discussion, questions and support <
> backuppc-users@lists.sourceforge.net>
> Subject: Re: [BackupPC-use
and the output of DumpPreUserCmd is irrecoverable. This makes it
very hard to troubleshoot a problem with no log output.
@CraigBarratt, Please consider reversing/revising this update!
On Wed, 2019-08-28 at 08:33 -0500, Mike Hughes wrote:
> On Thu, 2019-08-15 at 14:54 -0500, Mike Hughes wr
',
'--log-format=log: %o %i %B %8U,%8G %9l %f%L',
'--stats',
'--acls',
'--xattrs'
];
On the client machines in /etc/sudoers:
backuppc ALL=NOPASSWD: /usr/bin/rsync --server *
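For reference, the server-side settings that pair with a sudoers rule like that are roughly as follows. This is a sketch: the account name and rsync path are the ones from this thread, adjust to taste.

```perl
# Connect as the unprivileged backuppc account...
$Conf{RsyncSshArgs}    = ['-e', '$sshPath -l backuppc'];
# ...then invoke the remote rsync via sudo, matching the sudoers entry.
$Conf{RsyncClientPath} = 'sudo /usr/bin/rsync';
```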
Kind regards,
Jamie
--
From: Mike Hughes [mailto:m...@visionary.com]
Se
Hi Gandolf,
This is what I use to clean up disk space:
nohup /usr/share/BackupPC/bin/BackupPC_nightly 0 255 &
If I want to watch it work I'll use this:
tail nohup.out -F
Usually finishes in 5-10 minutes.
--
Mike
On Thu, 2019-12-19 at 16:08 +0100, Gandalf Corvotempesta wrote:
Hi to all.
Hi, we're currently syncing our cpool to an off-site location on a weekly
basis. Would it be feasible to only sync the latest of each backup rather than
the entire pool?
To elaborate, on Saturdays we run an rsync of the entire cpool to another
server to provide disaster recovery options. Is it
Hi Taste (nice domain name!),
Are you including the $sshPath prefix as shown in the example?
$Conf{DumpPreUserCmd} = '$sshPath -q -x -l root $host /usr/bin/dumpMysql';
From: Taste-Of-IT
Sent: Friday, August 21, 2020 5:34 PM
To:
For what it's worth, I was able to resolve this (for now) by updating CPAN
itself. I noticed when running certain commands that it complained that CPAN
was at version 1.x and version 2.28 was available. It suggested running:
install CPAN
reload cpan
But those are clearly not bash
Thanks so much Richard! Will COPR installations auto-update via yum repository
updates or do we need to specifically run a COPR update manually?
From: Richard Shaw
Sent: Monday, June 22, 2020 7:01 PM
To: General list for user discussion, questions and support
-users] [BackupPC-devel] BackupPC 4.4.0 released
On Thu, Jun 25, 2020 at 10:02 AM Mike Hughes
<m...@visionary.com> wrote:
Certainly a mismatch. Here's my output. Hopefully it formats cleanly. How can I
fix this while waiting for the patch to roll out?
Well, I'm not sure how to clean
Daily report:
https://github.com/moisseev/BackupPC_report
From: backu...@kosowsky.org
Sent: Thursday, June 25, 2020 8:42 AM
To: General list for user discussion
Subject: [BackupPC-users] FEATURE REQUEST: More robust error reporting/emailing
It would be great if
Certainly a mismatch. Here's my output. Hopefully it formats cleanly. How can I
fix this while waiting for the patch to roll out?
# head /usr/share/BackupPC/bin/BackupPC -n1
#!/usr/bin/perl
# grep "use\ lib" /usr/share/BackupPC/bin/BackupPC
use lib "/usr/share/BackupPC/lib";
# which cpan
_64
From: Richard Shaw
Sent: Tuesday, June 23, 2020 10:47 AM
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] [BackupPC-devel] BackupPC 4.4.0 released
On Tue, Jun 23, 2020 at 8:24 AM Mike Hughes
mailto:m...@visionary.
Hi BackupPC users,
I'd like to maintain several templates for exclusion lists to be available
across all clients. For example, the LINUX_WWW template might exclude these
directories:
/proc
/sys
/home/this_user
/mnt/this_network_share
/tmp
etc.
And the LINUX_SQL template might exclude:
/proc
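One way to get that kind of reuse is plain Perl, since config.pl is just a Perl file. A sketch (the array names and the SQL paths are made up; note the scoping caveat in the comments):

```perl
# config.pl is plain Perl, so ordinary arrays can serve as templates.
# Names here are invented for illustration. Scoping caveat: lexicals
# declared in config.pl are not visible inside separate per-host
# pc/<host>.pl files, so per-host use needs the overrides to live here
# or in a file that each per-host config require()s.
my @linux_www_excludes = ('/proc', '/sys', '/home/this_user',
                          '/mnt/this_network_share', '/tmp');
my @linux_sql_excludes = ('/proc', '/sys', '/var/lib/mysql/tmp');

# Applying a template to a share:
$Conf{BackupFilesExclude} = { '/' => [ @linux_www_excludes ] };
```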
Hi, Ghislain! That's a lot fancier than I was going for, but I can certainly
use what you've shared to solve my question.
Thank you for responding!
From: Ghislain Adnet
Sent: Thursday, May 27, 2021 7:41 AM
To: Mike Hughes
Cc: General list for user discussion
Hi Dave,
You can always break a backup job into multiple backup 'hosts' by using the
ClientNameAlias setting. I create hosts based on the share or folder for each
job, then use the ClientNameAlias to point them to the same host.
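Sketched out, with host names invented for illustration:

```perl
# pc/web-var.pl -- one BackupPC 'host' per share, both aliased to the
# same real machine, splitting one big job into two:
$Conf{ClientNameAlias} = 'realserver.example.com';
$Conf{RsyncShareName}  = ['/var'];

# pc/web-home.pl:
$Conf{ClientNameAlias} = 'realserver.example.com';
$Conf{RsyncShareName}  = ['/home'];
```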
From: Dave Sherohman
Sent:
I don't know exactly what kind of reporting you're looking for, but I've been
using this for years:
https://github.com/moisseev/BackupPC_report
I have an ansible script that pulls the code from above, appends a .pl
extension to the executable, then puts it into /usr/share/BackupPC/bin/.
Then
You'll need to prepend your command with the following:
$sshPath -q -x -l backuppc $host curl...
or
$sshPath -q -x -l root $host curl...
Click on the hyperlink DumpPreUserCmd for more details and examples, such as:
'$sshPath -q -x -l root $host /usr/bin/dumpMysql';
Hi Chiel,
Since you're targeting /var, the directory to exclude would be /log, not
/var/log. Be sure to add it directly under the BackupFilesExclude config for
the /var share.
From: chiel
Sent: Thursday, February 24, 2022 7:16 AM
To:
Hi Edward,
I'm not sure how much help this list can be as we are just a group of people
who administrate our own installations of BackupPC. It's a very flexible
solution that can handle massive amounts of backup and runs maintenance-free
for years, so it's not a surprise if it got forgotten
This is one I found but have not tried:
https://gist.github.com/phoenix741/99a5076569b01ba5a116cec24a798d5f
It mentions being updated for 4.x in 2017, which is when 4.0 was
released.
From: backu...@kosowsky.org
Sent: Thursday, November 17, 2022 8:44 AM
Hi Tony,
If you're stuck at the ssh part, you won't be successful running a backup as
making the connection is the first step.
What happens when you try to ssh to this client from the backuppc account on
the backup server? I assume you've already tried ssh-copy-id but it's failing.
I'm having
Hi Ian,
I'm not sure why it would suddenly stop working for you as that seems like a
completely legitimate solution.
FWIW, this is the command I use for unpingable clients:
$Conf{PingCmd} = '/bin/echo $host';
Best of luck!
From: Ian via BackupPC-users
Sent: