Hey Scott,
this may sound weird, but we have pretty much the same setup as you, and
we're finding the cause of our terrible backup speeds to be a problem
with ssh speeds on CentOS.
We don't know why this is happening yet, but we are working on it. Also,
we're on CentOS 4.4.
Anyway, try it yourself and see.
Hello,
I installed BackupPC in a test environment and it works quite well, backing up a
Windows server and a SUSE Linux machine via rsync.
I only have 2 problems :
1. How do I tell BackupPC to start the full backup at a specific time, for
example on Friday night at 22:00 or Saturday morning at 02:00,
Hi everyone.
I've just written a small script to save the data of a BackupPC 3.0 server
to a remote *NIX host (offsite backup; it mainly uses rsync over ssh, but
also needs a small shell on the remote host, at least for the cat
command).
The script is written in Perl and depends on the module
Hi,
megaram networks wrote on 02.05.2007 at 12:30:08 [[BackupPC-users] Howto tell
BackupPC to start Full Backup at a specific weekday]:
I only have 2 problems :
your lines are not wrapped and spaces appear in front of your commas? ;-)
1. How do I tell BackupPC to start the Full backup
Hi,
Jesse Proudman wrote on 01.05.2007 at 19:18:13 [[BackupPC-users] Making Nodes
In A Cluster Backup At Different Times]:
What's the best way to make sure that two specific nodes are not
being backed up at the same time?
set $Conf{MaxBackups} to 1?
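Holger's suggestion above, as a sketch against the main config file (the path here is an assumption; it varies by distribution):

```perl
# /etc/BackupPC/config.pl (location varies by install)
# Allow only one simultaneous backup, so no two hosts are
# ever dumped at the same time:
$Conf{MaxBackups} = 1;

# User-requested backups are counted separately; cap those too
# if strict serialization is required:
$Conf{MaxUserBackups} = 1;
```

The trade-off is exactly the one Jesse raises below: with $Conf{MaxBackups} at 1, the whole network is backed up serially.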
Regards,
Holger
On May 2, 2007, at 5:59 AM, Jamie Lists wrote:
Hey Scott,
this may sound weird, but we have pretty much the same setup as you, and
we're finding the cause of our terrible backup speeds to be a problem
with ssh speeds on CentOS.
We don't know why this is happening yet, but we are working on it. Also
Carl Wilhelm Soderstrom wrote:
When doing a backup, the whole thing seems to just stall for no reason.
While watching top, I noticed that the system is idle and BackupPC_dump
is nowhere on the list of active processes. This will last for several
minutes before it shows up again and gets
On Wed, 25 Apr 2007, Ski Kacoroski wrote:
Most recent stats are 1300 clients with about 4TB data across all
clients, 8 BackupPC servers with 3.4TB of data (love the hardlinks and
compression)
Just curious - is there a nifty way to access all the servers via a single
web interface? :)
Scott wrote:
One goal I have is to not have to modify the OS X clients at all.
Meaning, I should be able to do a fresh 10.4 install, apply the Apple
updates, then enable whatever access is needed before I can start
recovery. I would rather not have to remember to install xtar or what-
Well that'd be the easy way, but if we do that it will take forever
for our network to be backed up... Any other ideas?
--
Jesse Proudman, Blue Box Group, LLC
On May 2, 2007, at 4:34 AM, Holger Parplies wrote:
Hi,
Jesse Proudman wrote on 01.05.2007 at 19:18:13 [[BackupPC-users]
Hi there,
I've just installed BackupPC at my company having happily used it at
home for some time. In both locations it's used in a mixed OS
environment, but at work we've decided to use SMB shares to read from
everyone's PCs for ease-of-use reasons. As such we have a backuppc
domain user
Matt Godbolt wrote:
Hi there,
I've just installed BackupPC at my company having happily used it at
home for some time. In both locations it's used in a mixed OS
environment, but at work we've decided to use SMB shares to read from
everyone's PCs for ease-of-use reasons. As such we have a
Use blackout periods in local configuration files.
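A sketch of this advice as a per-host override (the hostname and config path are assumptions, not from the thread):

```perl
# /etc/BackupPC/pc/somehost.pl -- per-host config override
# Forbid backups between 07:00 and 22:00 on every day of the
# week, so dumps can only start in the 22:00-07:00 window:
$Conf{BlackoutPeriods} = [
    {
        hourBegin =>  7.0,
        hourEnd   => 22.0,
        weekDays  => [0, 1, 2, 3, 4, 5, 6],
    },
];
```

Combined with a $Conf{FullPeriod} slightly under seven days (e.g. 6.97), the weekly full tends to land in the same nightly window each week.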
cheers,
ski
On Wed, 2 May 2007 08:46:55 -0700 Jesse Proudman
[EMAIL PROTECTED] wrote:
Well that'd be the easy way, but if we do that it will take forever
for our network to be backed up... Any other ideas?
--
Jesse Proudman, Blue
Hi,
Jesse Proudman wrote on 02.05.2007 at 08:46:55 [Re: [BackupPC-users] Making
Nodes In A Cluster Backup At Different Times]:
On May 2, 2007, at 4:34 AM, Holger Parplies wrote:
Jesse Proudman wrote:
What's the best way to make sure that two specific nodes are not
being backed up at the
What's the best way to make sure that two specific nodes are not
being backed up at the same time?
set $Conf{MaxBackups} to 1?
Well that'd be the easy way, but if we do that it will take forever
for our network to be backed up... Any other ideas?
If you use
Hi,
Josh Marshall wrote on 03.05.2007 at 08:38:50 [Re: [BackupPC-users] Making
Nodes In A Cluster Backup At Different Times]:
What's the best way to make sure that two specific nodes are not
being backed up at the same time?
set $Conf{MaxBackups} to 1?
Well that'd be
If you use the rsyncd method you can limit the number of simultaneous
connections.
In rsyncd.conf set:
max connections = 1
yes, but that's on the side of the backed-up host, meaning it would only
work to ensure no two hosts pointing to the same machine via
ClientNameAlias were
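For reference, "max connections" only takes effect if rsyncd can write a lock file; a fuller sketch of the client-side rsyncd.conf (module name and paths are illustrative):

```ini
# /etc/rsyncd.conf on the host being backed up
[backup]
    path = /home
    read only = yes
    max connections = 1
    lock file = /var/run/rsyncd.lock
```

And, as noted above, this only serializes connections to that one rsyncd, not across the whole BackupPC server.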
Hi all,
I was just happening to look at the log file for the system and noticed
that it is FULL of lines like these:
2007-05-02 01:00:12 dhcp 192.168.0.50: sh: -c: line 0: syntax error near
unexpected token `0x933d2b8'
2007-05-02 01:00:12 dhcp 192.168.0.50: sh: -c: line 0: `ARRAY(0x933d2b8)'
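Those two log lines look like a Perl array reference being interpolated into a shell command string (e.g. a $Conf{...} command setting that ended up holding a reference instead of a string — a guess, not confirmed in the thread). The failure is easy to reproduce:

```shell
# A stringified Perl array reference looks like "ARRAY(0x933d2b8)";
# handed to sh as a command, the "(...)" trips the shell parser.
# With bash as /bin/sh this prints a syntax error near the
# unexpected token `0x933d2b8', matching the log above:
sh -c 'ARRAY(0x933d2b8)'
```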
Hi,
Scott wrote on 02.05.2007 at 11:08:01 [Re: [BackupPC-users] BackupPC and OS X]:
On May 2, 2007, at 9:02 AM, Holger Parplies wrote:
Due to the implementation of pooling your second full backup may be
much faster than the first: [...]
This sounds like it just skewed the results of my
Hi,
Jason M. Kusar wrote on 02.05.2007 at 21:04:37 [[BackupPC-users] dhcp syntax
error]:
I was just happening to look at the log file for the system and noticed
that it is FULL of lines like these:
2007-05-02 01:00:12 dhcp 192.168.0.50: sh: -c: line 0: syntax error near
unexpected token