On 4/5/24 17:40, Les Mikesell wrote:
You can use $Conf{ClientNameAlias} to point multiple hosts to the same
IP (or a resolvable name).
Yes, this works so long as I override the RsyncSshArgs in the host
config to include the correct port.
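As a rough sketch of a per-host override along those lines (the alias backup-gw.example.com and port 2222 are placeholders, not values from this thread):

    # pc/<hostname>.pl -- hypothetical per-host override
    # Point this logical host at the shared, resolvable address...
    $Conf{ClientNameAlias} = 'backup-gw.example.com';
    # ...and add the non-standard SSH port to the rsync transport.
    $Conf{RsyncSshArgs} = ['-e', '$sshPath -p 2222 -l root'];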
Thanks.
[...] I may be way off base and there's a better way of configuring all of this.
Ian
On 4/5/24 16:14, Robert Trevellyan wrote:
Unless I'm missing something, this seems to be important:
2024-04-05 15:28:10 Can't find host myhost.com via NS and netbios
Have you tried [...]
On 4/5/24 15:25, to...@tuxteam.de wrote:
On Fri, Apr 05, 2024 at 02:27:55PM -0400, Ian via BackupPC-users wrote:
Hi,
I've been using BackupPC for many years now, and have used /usr/bin/true as
the ping command [...]
Are you sure it's supposed to be /usr/bin/true and not just [...]
[...] stopped backing up that host.
No output in the text file.
However, using sudo to execute the file as the backuppc user works as expected.
Ian
Hi,
I've been using BackupPC for many years now, and have used /usr/bin/true
as the ping command for hosts that can't respond to ping. However, after
a reboot on January 31st, my server stopped obeying PingCmd and stopped
backing up those hosts due to lack of ping response. I am using
BackupPC [...]
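The setting being described is $Conf{PingCmd}; a minimal per-host sketch of the workaround (the file name is illustrative) is simply:

    # pc/unpingable-host.pl -- host that cannot answer ICMP
    # /usr/bin/true always exits 0, so BackupPC treats the host as alive.
    $Conf{PingCmd} = '/usr/bin/true';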
On 11 October 2012 18:00, Les Mikesell wrote:
> That looked normal to me - are they not being excluded in the backup
> results?
>
During the effort to get this working, I changed a few things before
posting, and it does appear to work. I just assumed it wasn't working after
not seeing it excluded from [...]
On 10 October 2012 23:10, Ian P. Christian wrote:
>
> Running: /usr/bin/ssh -q -x -l root my.hostname.com /usr/bin/rsync
> --server --sender --numeric-ids --perms --owner --group -D --links
> --hard-links --times --block-size=2048 --recursive --checksum-seed=32761
> --ignore-times
[...] browsable if I remove the commented-out line)? I have changed
$Conf{FullPeriod} to -1 for each of the hosts I'm commenting out, so they
won't try to do any more backups.
Thanks
Ian
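For context on the -1 value mentioned above, the per-host override (file name illustrative) is just:

    # pc/retired-host.pl -- keep old backups browsable, stop scheduling new ones
    # Per the BackupPC config docs, -1 skips regular automatic backups while
    # still allowing manually requested ones; -2 disables backups entirely.
    $Conf{FullPeriod} = -1;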
[...] I'm a relative newbie to BackupPC, so please excuse any stupid questions :)
Many thanks
Ian
On Sep 15, 2009, at 7:12 PM, Chris Robertson wrote:
> ...even though they have more than a mile of physical separation. I
> don't currently have good data as to the bandwidth utilization during
> backups (the DRBD config is set to limit it to 10M, which is about
> 110Mbit/sec with TCP overhead),
On May 12, 2009, at 6:14 PM, Ian Levesque wrote:
> I've got a client I'm backing up via rsync (BackupPC v3.1). I recently
> added a very large directory to the filesystem I'm backing up, which
> has many hundreds of thousands of files inside it. I don't need to back up this [...]
So my guess is that
rsync is evaluating all of the directory's contents rather than
excluding it en masse.
When I run a manual rsync command with --exclude=/home/username/directory,
the file list takes a relatively short time. I figure the rsync module
isn't as smart?
Any insight [...]
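Within BackupPC the equivalent exclude is normally set with $Conf{BackupFilesExclude}, whose entries are passed to rsync as --exclude options; a sketch using the anonymized path from the message above:

    # Per-host config: exclude the huge directory from every share ('*').
    # The path is the example placeholder from the thread, not a real one.
    $Conf{BackupFilesExclude} = {
        '*' => ['/home/username/directory'],
    };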
I assume this is a success.
If it is a failure, then how do I troubleshoot this?
If it is a success, then why am I getting the BackupPC error?
Note: I'm using version 2.1.2pl1 from the Ubuntu repos. I would be glad
to update if this would help.
Thanks