Consider looking at SAS drives.
On Thu, Dec 31, 2009 at 7:47 PM, Peter Vratny wrote:
> Gerald Brandt wrote:
> > How many servers and how much data do you backup using BackupPC?
>
> There are 89 hosts that have been backed up, for a total of:
> * 580 full backups of total size 5596.07GB (prior t
You can take yourself off, you goose.
On Wed, Nov 11, 2009 at 10:14 AM, paul wilkinson wrote:
> Cannot be used for XP. Take me off your mailing list please.
>
> paul
>
I would say it's a file permissions error on the server that you're
restoring to.
On Sun, Nov 1, 2009 at 9:06 PM, Micha Silver wrote:
> I am unable to restore to linux servers. Backups are running fine, and I
> rechecked that I can ssh as user backuppc from the backup server (Centos
> 5.4) dire
We are a small hosting company in Australia, and we use several backuppc
servers in our environment. Our main backup server backs up almost 600GB
worth of data over the LAN each night. This is our pool:
- 15 full backups of total size 2090.82GB (prior to pooling and
compression),
- 24 in
Hi,
I've installed the latest build of the beta version and I've noticed some
things have disappeared from the left menu. Edit Config and Log File, and a
bunch of others, aren't available. Is that normal?
--
Best Regards,
Stephen
Sent from Sydney, NSW, Australia
--
We're building a new backuppc server at the moment as well. The box is using
8x 300GB SAS 10k drives in RAID 10; the decision of whether to use RAID 5/6
or RAID 10 is difficult. At the moment RAID 10 with 1.1TB gives us 12-18
months before we reach our capacity, however with RAID 5 we have 2100GB
(rough
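For reference, the capacity tradeoff above works out roughly like this (a quick sketch; the drive count and size are the figures from this thread, and the formulas are the standard RAID usable-capacity rules, ignoring filesystem and metadata overhead):

```shell
# Usable capacity for 8x 300GB drives under different RAID levels.
# Pure arithmetic; real arrays lose a little more to overhead.
DRIVES=8
SIZE_GB=300
RAID10=$(( DRIVES * SIZE_GB / 2 ))    # half the spindles hold mirrors
RAID5=$(( (DRIVES - 1) * SIZE_GB ))   # one drive's worth of parity
RAID6=$(( (DRIVES - 2) * SIZE_GB ))   # two drives' worth of parity
echo "RAID10: ${RAID10}GB  RAID5: ${RAID5}GB  RAID6: ${RAID6}GB"
```

RAID 5/6 nearly doubles the usable space here, but random-write performance, which BackupPC's pool of many small hardlinked files leans on heavily, is generally much better on RAID 10.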
Hi,
Has anyone used or is using backuppc with WD raptors in a raid 10 array?
Just curious to know how performance is, compared with 10k SCSI or SAS
drives.
thanks
--
yep ok I've just done that.
On Tue, Jul 7, 2009 at 9:28 PM, Adam Goryachev <
mailingli...@websitemanagers.com.au> wrote:
> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA1
>
> Stephen Vaughan wrote:
> > I definitely don't have updated installed on the box. I
351   root     15  -5     0    0    0 S  1 0.0 1296:21 kswapd0
12263 backuppc 20   0 11132 8112 2060 D  1 0.2 0:21.20 BackupPC_nightl
12266 backuppc 20   0 11024 8084 2060 D  1 0.2 0:24.10 BackupPC_nightl
On Tue, Jul 7, 2009 at 6:20 PM, Leen Besselink wrote:
> Stephen
iostat is also showing iowait fluctuating between 50% and 75%.
--
Going by the graphs it seems to be an IO issue. Has anyone tried running
RAID 10 with backuppc?
On Tue, Jul 7, 2009 at 1:57 PM, Les Mikesell wrote:
> Stephen Vaughan wrote:
> > I'm only running 2 concurrent backups. The disk is/was mounted with
> > noatime,data=journal.
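Worth noting: data=journal pushes file data through the ext3 journal as well as metadata, roughly doubling write traffic, while the default data=ordered is much lighter. A hypothetical fstab line for a BackupPC pool mount (device and mount point are placeholders, not from this thread):

```
# /etc/fstab sketch - device and mount point are illustrative
/dev/sdb1  /var/lib/backuppc  ext3  noatime,data=ordered  0  2
```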
The box isn't used for anything else. I can keep an eye on it tonight and
see what's causing the load in those first few hours.
On Tue, Jul 7, 2009 at 1:57 PM, Les Mikesell wrote:
> Stephen Vaughan wrote:
> > I'm only running 2 concurrent backups. The disk is/was mounted wi
..
On Mon, Jul 6, 2009 at 4:27 PM, Jeffrey J. Kosowsky
wrote:
> Stephen Vaughan wrote at about 15:11:43 +1000 on Monday, July 6, 2009:
> > Hi all,
> >
> > In everyone's opinion, which resource(s) does backuppc rely on the most?
> > cpu, memory or disk?
&
Hi all,
In everyone's opinion, which resource(s) does backuppc rely on the most?
cpu, memory or disk?
--
Best Regards,
Stephen
--
___
BackupPC-users mailing list
BackupPC-users
Sorry, the blackout period is 6am to 9pm.
On Wed, May 20, 2009 at 4:29 PM, Stephen Vaughan
wrote:
> Is it possible to prevent full backups from starting after a certain time?
> At the moment my wake up schedule is set for 9pm-5am with blackout 9pm-6am,
> one of the servers that we do a fu
Is it possible to prevent full backups from starting after a certain time?
At the moment my wakeup schedule is set for 9pm-5am with blackout 9pm-6am;
one of the servers that we do a full backup on each week is running into the
blackout period. It starts the job at 4am and doesn't finish until the
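Blackout windows only stop new backups from starting; a job that begins before the window will keep running into it, which is what's happening here. A sketch of the relevant config.pl setting, assuming the corrected 6am-9pm blackout mentioned elsewhere in the thread:

```perl
$Conf{BlackoutPeriods} = [
    {
        hourBegin => 6.0,                   # no automatic backups from 06:00...
        hourEnd   => 21.0,                  # ...until 21:00
        weekDays  => [0, 1, 2, 3, 4, 5, 6], # every day of the week
    },
];
```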
Yeah, I can't see how this is a backuppc issue.
On Thu, Mar 5, 2009 at 11:07 AM, Chris Robertson wrote:
> Nate wrote:
> > We seem to be routinely having this issue where the server backuppc
> > is running on throws a kernel panic and thus hard locks the
> > machine. It's completely random, someti
Why use Fedora at all? Debian is your friend.
On Sat, Oct 11, 2008 at 2:22 PM, Michael Mansour <[EMAIL PROTECTED]> wrote:
> Hi,
>
> > Hello all,
> >
> > For a number of years I have run a small home network of windows
> machines.
> > I recently decided to move my computing to Linux. I now have t
Yep you're right, thanks.
On Mon, Oct 13, 2008 at 12:12 PM, Craig Barratt <
[EMAIL PROTECTED]> wrote:
> Stephen writes:
>
> > Still seems to be creating a .raw archive:
>
> Yes. The archive is now compressed, in spite of the extension:
>
>/bin/csh -cf /usr/share/backuppc/bin/BackupPC_tarCrea
Still seems to be creating a .raw archive:
backuppc 25740 0.0 0.5 8724 5628 ? S 11:51 0:00
/usr/bin/perl /usr/share/backuppc/bin/BackupPC_archive offsite-archive
archiveReq.25739.0
backuppc 25851 0.0 0.4 7320 4568 ? S 11:57 0:00
/usr/bin/perl /usr/share/backuppc/
Hmm, okay. That sort of worked; I did the first change, and now I'm getting
an error for gzip:
2008-10-13 11:44:56 Archive failed (Error: gzip is not an executable program)
$Conf{ArchiveComp} = 'gzip';
$Conf{CompressLevel} = '1';
On Mon, Oct 13, 2008 at 11:28 AM, Craig Barratt <
[EMAIL PROTECTED]> wr
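The "gzip is not an executable program" error usually means BackupPC can't find the gzip binary at the configured path. A sketch of the relevant config.pl settings (the /bin/gzip path is an assumption; verify it with `which gzip` on your distro):

```perl
$Conf{ArchiveComp}   = 'gzip';
$Conf{GzipPath}      = '/bin/gzip';   # must be the full path to a real binary
$Conf{CompressLevel} = 1;
```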
I take it nobody knows?
On Tue, Oct 7, 2008 at 3:28 PM, Stephen Vaughan <[EMAIL PROTECTED]>wrote:
> Firstly, what a fantastic piece of software BackupPC is, well done Craig
> and co!
>
> When I run BackupPC_archiveStart from the command line it's not using
> compr
Firstly, what a fantastic piece of software BackupPC is, well done Craig and
co!
When I run BackupPC_archiveStart from the command line it's not using
compression for the backup; the backup file is created using "raw". However,
if I start the archive job via the web interface it uses compression (g
What about with multiple full backups? Say your full backup is 50GB and you
do a full backup once a week; will that mean each full backup uses 50GB of
space? Or does the pool do some other linking between the data contained in
each full backup?
On Thu, Sep 4, 2008 at 5:05 PM, Christian Völker <[
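To the question above: no, identical content is not stored once per full. Every backup's copy of a file is a hardlink into the common pool, so a second full of unchanged data costs directory entries, not another 50GB. A minimal illustration of the mechanism (scratch paths in a temp dir, not BackupPC's actual pool layout):

```shell
# Two "full backups" of the same file are just extra names for one inode.
DEMO=$(mktemp -d)
echo "same contents" > "$DEMO/pool-file"
ln "$DEMO/pool-file" "$DEMO/full-1"    # week 1 full
ln "$DEMO/pool-file" "$DEMO/full-2"    # week 2 full
LINKS=$(stat -c %h "$DEMO/pool-file")  # hardlink count for the inode
echo "names: $LINKS, copies on disk: 1"
rm -rf "$DEMO"
```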
at 10:56 PM, Les Mikesell <[EMAIL PROTECTED]> wrote:
> Tino Schwarze wrote:
> > On Tue, Sep 02, 2008 at 11:34:44AM +1000, Stephen Vaughan wrote:
> >> Hmm yeah I've a bit of a play with the nfs client side, the nfs server
> is
> >> purely web based, so it's
AIL PROTECTED]> wrote:
> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA1
>
> Stephen Vaughan wrote:
> > Hi,
> >
> > We have backuppc running on a server, with an NFS mount to a NAS device.
> > Everything is gigabit, there is only 1 nic in the backuppc server
&
Hi,
We have backuppc running on a server, with an NFS mount to a NAS device.
Everything is gigabit, there is only 1 nic in the backuppc server connected
to the switch, so traffic comes in from the servers we're backing up and
then straight back out the same nic to the NFS mount.
I'm finding signi
I'm performing a backup on a folder which contains about 1,000,000 files
spread over about 10,000 folders. I'm using rsync 2.6.4 on both the
backuppc server and the machine I am trying to back up. I can't work out
why I keep getting this problem... I'm doing an incremental backup and
it gets to a part
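One likely culprit with a tree this size: rsync 2.x builds the complete file list in memory on both ends before transferring anything (incremental recursion only arrived in rsync 3.0). A rough back-of-envelope, assuming on the order of 100 bytes per file-list entry (an approximation, not an rsync constant):

```shell
# Estimate of pre-3.0 rsync file-list memory for a million-file tree.
FILES=1000000
BYTES_PER_ENTRY=100                             # rough per-entry cost, assumed
MB=$(( FILES * BYTES_PER_ENTRY / 1024 / 1024 ))
echo "~${MB}MB just to hold the file list"
```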
I'm getting the same problem now as well... It creates the list of files
to be copied and then just shits itself with the following errors over
and over:
Can't open /var/lib/backuppc/pc/xxx.xxx.net/new//fopt/ for empty output
create
0
/ 0
On 4/4/06, Dan D Niles <[EMAIL PROTE
I keep getting these errors with random files. I am trying to back up a
server that has in excess of 1,000,000 files. The box has 2GB of RAM.
It just seems to crash on random files; it doesn't matter whether they are
.html, .pdf or whatever. I've increased the timeout in the config.pl -
that did noth
I'm still having trouble backing up one of my servers. Below is a log of the error. I was getting a lot of "No such file or directory" errors, mainly because I'd say the files were either deleted or moved since the rsync list was compiled.
Originally it was crashing because there wasn't enou
Didn't fix it... I think mine is a RAM issue; it keeps running out of memory so it just randomly crashes.
On 3/10/06, Marcos Lorenzo de Santiago <[EMAIL PROTECTED]> wrote:
On Friday, 10 March 2006 at 03:58, Stephen Vaughan wrote:
SV > I am having nothing but trouble with doing back
I am having nothing but trouble with doing backups. I have a backup I'm trying to do; it's at least 1,000,000 files. It takes over an hour to build the list, and it seems to start copying files but then just randomly exits, as shown below. And it does it at different stages all the time, so it's no
xited prematurely)
On 3/9/06, Craig Barratt <[EMAIL PROTECTED]> wrote:
Stephen Vaughan writes:
> Is there any way to get backuppc to continue to back up regardless of
> errors? I had it backing up a 2GB db file, and during the transfer the
> file was modified and backuppc recognised this an
Is there any way to get backuppc to continue backing up regardless of errors? I had it backing up a 2GB db file, and during the transfer the file was modified and backuppc recognised this and aborted the backup.
Remote[1]: send_files failed to open misc/backups/netchant-db.blobdump: No such file or d
If I make a full backup say today, and I run 29 days of incrementals
then run a full backup on day 30, will the full backup fill in the
difference from the 29 days of incrementals + the first full backup? Or
will it just download the entire full backup from the machine?
--
Best Regards,
Stephen
I'm having problems backing up one of my servers. The backup goes for
about 1 1/2 hours then it just cuts out and says timeout error. I've
got my timeout in rsyncd.conf set to 800. I've got ignore errors turned
on, I've got ignore nonreadable turned on.. but I just can't seem to
get it to backup. T
nah Centos is yuk, you want Debian or Ubuntu.. yum is about as slow as my grandma
On 2/21/06, Les Mikesell <[EMAIL PROTECTED]> wrote:
> On Mon, 2006-02-20 at 14:36, [EMAIL PROTECTED] wrote:
> > I have tried Fedora core 5 (Core 4 doesn't have the right drivers for
> > my motherboard) and it's only in test
ok cheers guys, will have a look at it on Monday
On 2/18/06, Dan Pritts <[EMAIL PROTECTED]> wrote:
> On Thu, Feb 16, 2006 at 11:27:54PM -0600, Les Mikesell wrote:
> > On Thu, 2006-02-16 at 22:08, Stephen Vaughan wrote:
> > > Does anyone know if it is possible to push a backup from a client
Does anyone know if it is possible to push a backup from a client TO
backuppc? I've got several boxes running rsyncd and backuppc calls them
to send data back and forth. I have a box that is firewalled and I want
to be able to still backup this machine, but in the other direction. So
the client mac