Hi, as I was really busy in the past year I hadn't tried your config until now,
and I wanted to let you know that everything worked flawlessly. Thanks again.
Misu
On 02/21 06:16, Dennis Blewett wrote:
2011-02-21 18:11:04 Can't create a test hardlink between a file in
/var/lib/backuppc/pc and /var/lib/backuppc/cpool. Either these are
different file systems, or this file system doesn't support hardlinks, or
these directories don't exist, or there is a
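A quick way to check the "different file systems" case, assuming the paths from the error message and GNU coreutils: hardlinks only work within a single filesystem, so both directories must report the same device number.

    # Same device number for both paths = same filesystem
    stat -c '%d  %n' /var/lib/backuppc/pc /var/lib/backuppc/cpool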
On 02/21 10:03, Dennis Blewett wrote:
I'm using the web interface with localhost and tar as the Xfer method.
Let's say I have these folders:
/home/workstation/Desktop
/home/workstation/Documents
/office
/research
And I want all of those to be listed in /.
So, when I look at the
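With tar as the Xfer method, each folder you want to appear at the top level is normally listed as its own share in the per-host config file. A minimal sketch, assuming the Debian config location and a host called workstation (both are assumptions):

    # /etc/backuppc/workstation.pl
    $Conf{TarShareName} = [
        '/home/workstation/Desktop',
        '/home/workstation/Documents',
        '/office',
        '/research',
    ];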
On 2/22/2011 9:17 AM, Rob Morin wrote:
So I have this server that started its backup at 10 pm yesterday. The
logs below say it finished at 1:52; however, the status page still
shows that it's running and that it started at 5:50 am. My server has a
high load, with a PID that accompanies the
Hi,
Right from the documentation:
For each complete, good, backup, BackupPC_link is run. To avoid race
conditions as new files are linked into the pool area, only a single
BackupPC_link program runs at a time and the rest are queued.
BackupPC_link reads the NewFileList written by BackupPC_dump
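So a backup can be "finished" in its XferLOG while the host still shows as running: it is queued for, or still executing, its BackupPC_link pass. A quick way to check, assuming the Debian log location:

    # Is a link pass running right now?
    ps aux | grep '[B]ackupPC_link'
    # Has it been logged yet?
    grep BackupPC_link /var/lib/backuppc/log/LOG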
Hi,
Glad to know you got this solved, but did you try making a hardlink on
that filesystem yourself to see whether that was indeed the problem?
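For example, something along these lines, run as the backuppc user (the file names here are just placeholders):

    touch /var/lib/backuppc/cpool/linktest
    ln /var/lib/backuppc/cpool/linktest /var/lib/backuppc/pc/linktest \
        && echo 'hardlinks OK'
    rm -f /var/lib/backuppc/pc/linktest /var/lib/backuppc/cpool/linktest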
On Debian, simply re-installing the backuppc package does not recreate
the folders in /var/lib/backuppc, like pc, cpool, pool, etc.
You'll have to
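Something like the sketch below should recreate the layout, with Debian's backuppc user and typical modes assumed (verify against a working install):

    mkdir -p /var/lib/backuppc/pc /var/lib/backuppc/pool \
             /var/lib/backuppc/cpool /var/lib/backuppc/trash
    chown -R backuppc:backuppc /var/lib/backuppc
    chmod 750 /var/lib/backuppc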
OK, thanks. I had another first full backup run; it took 14 hours to
complete, and it was under 125 GB...
Thanks for the info, Mark Les
Rob Morin
Systems Administrator
Infinity Labs Inc.
(514) 387-0638 Ext: 207
-----Original Message-----
From: Mark Maciolek [mailto:macio...@unh.edu]
Sent:
rsync'ing the BackupPC data pool is generally recommended against. The
number of hardlinks causes explosive growth in rsync's memory
consumption, and while you may be able to get away with it if you have
20 GB of data (depending on how much memory you have), you will likely run out of
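A quick way to gauge the scale of the problem, assuming the Debian pool location, is to count the files rsync -H would have to track as hardlinks:

    # Every file with a link count above 1 costs rsync extra memory
    find /var/lib/backuppc -type f -links +1 | wc -l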
gregwm backuppc-us...@whitleymott.net wrote on 02/22/2011 11:26:51 AM:
This issue sure comes up a lot, and perhaps I should just keep quiet,
since I personally am in no position to do it or even go off looking
for an rsync forum, nor do I have any knowledge of just how convoluted
the rsync
gregwm wrote at about 10:26:51 -0600 on Tuesday, February 22, 2011:
rsync'ing the BackupPC data pool is generally recommended against. The
number of hardlinks causes explosive growth in rsync's memory
consumption, and while you may be able to get away with it if you have 20 GB of
On 2/21/2011 11:03 PM, Dennis Blewett wrote:
I'm using the web interface with localhost and tar as the Xfer method.
Let's say I have these folders:
/home/workstation/Desktop
/home/workstation/Documents
/office
/research
And I want all of those to be listed in /.
So, when I look at the
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure how
many more files I will have by the end of April, though.
I've read that rsync -H would be a practical command to use on the
backuppc folder.
What I'm also curious about is if I should
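At that scale (roughly 14k files and 3.8 GB, per the figures above), rsync -H is usually workable. A sketch only, with the destination path a placeholder:

    # -a preserves permissions and times; -H preserves the pool's hardlinks
    rsync -aH --delete /var/lib/backuppc/ /mnt/poolcopy/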
Dennis Blewett dennis.blew...@gmail.com wrote on 02/22/2011 10:17:29 PM:
13,849 items, totalling 3.8 GB
It would appear that I have a feasible number of files. I'm not sure
how many more files I will have by the end of April, though.
I've read that rsync -H would be a practical