Olivier LAHAYE writes:
Is there a way to use 7zip (/usr/bin/7za) to compress files?
If yes, what variables control that?
The compression performance looks good, but unless there is a
perl module interface for 7zip then there is no chance to
integrate it into BackupPC. Currently BackupPC uses
Kanwar Ranbir Sandhu writes:
Hello fellow BackupPCians,
I noticed that on each host's summary page, the blackout period isn't
being printed. Instead, it stops short of printing the times. For
example:
Because mona has been on the network at least 2 consecutive
times, it
Gerald Richter writes:
I have been using backuppc for over a year now without any problems; recently I
started to get the following error when I do a full backup of a Windows XP
PC:
Call timed out: server did not respond after 2 milliseconds opening
remote file
Scott Gamble writes:
I know this has been touched on repeatedly - and from kinda glossing through
the archives I gather that it's not a new idea at all, but is there any
reason why scheduling of backups couldn't be handled exclusively by cron
instead of the backuppc mechanism? The time
Christophe Faribault writes:
/backups/bin/BackupPC_archiveHost /backups/bin/BackupPC_tarCreate
/usr/bin/split /usr/bin/par2 archive-host -1 /bin/gzip .gz 0 /mnt/exthd 0
*
But I get this when I run it:
Writing tar archive for host archive-archive, backup #-1 to output file
Chris Stone writes:
Well, OK, I'll try to figure out the bigger issue on my own. How
about I just ask this, though...
Sorry, I haven't had time to check into this.
Is there a way to manually change the status of a partial backup to a
valid full backup?
Yes. Just edit the backups file
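As a sketch of what that edit amounts to (the layout here is illustrative: the real per-host `backups` file is tab-delimited with many more fields, but field 2 does hold the backup type; the backup number and paths are made up):

```shell
# Demo copy only: flip backup number 102 from "partial" to "full"
# by rewriting field 2 of the tab-delimited backups file.
backups=/tmp/demo_backups
printf '101\tfull\n102\tpartial\n' > "$backups"
awk -F'\t' 'BEGIN{OFS="\t"} $1=="102" && $2=="partial" {$2="full"} {print}' \
    "$backups" > "$backups.new" && mv "$backups.new" "$backups"
cat "$backups"
```

On a real system you would edit the actual per-PC `backups` file with BackupPC stopped, leaving every other field untouched.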
Chris Stone writes:
I needed to include the if (! -e $archpath) block in the script
since I found that if the backup and archive writing go on for more
than a couple hours, DumpPostUserCmd gets called again for some
reason, and the script would start writing to the same directory and
Carl Wilhelm Soderstrom writes:
On 09/03 06:22 , Hamish Guthrie wrote:
I am sorry to harp on about this, but, it gets back to my analysis of
rsync under backuppc a few months ago - I still think that we need to do
an implementation of File::RsyncP in C as opposed to perl.
Your point
Dan D Niles writes:
I tracked down the bug. When I changed the config file, it caused
a problem where something wasn't defined that should have been.
I uncommented the line
Eeeek! This is a serious bug. Thanks for tracking it down.
The problem is that when it removes multiple fulls in
I recently released BackupPC version 2.1.2 on SourceForge.
This release fixes a number of bugs in 2.1.1. See
http://backuppc.sourceforge.net.
I've attached the change log since 2.1.1.
Enjoy!
Craig
#
# Version 2.1.2, 5 Sep
Dan D Niles writes:
Les Mikesell writes:
I haven't peeked at the code, but I'd guess that it is checking
that the programs exist so it can give a reasonable error message
at startup time instead of waiting until some backup run needs
it (remember that you can override
Travis Fraser writes:
I have looked through the docs and found the command
BackupPC_serverMesg backup all
to queue all hosts for backup, but was wondering how to do this for just
one host. I would like to be able to trigger the backup from the
server's console.
BackupPC_serverMesg
David Relson writes:
Thanks for a great program! As a minor contribution, I've noticed an
html display flaw and am enclosing patches that fix the flaw in the 2
places I've seen it. There might be other places, notably in the other
Lang/*.pm files.
I'd sent these patches about 3 weeks
[EMAIL PROTECTED] writes:
Whenever I try to restore a whole bunch of files it fails with this
error after it restores about 4 files:
I did replace instances of Solaris tar with gtar.
Is the share read-only?
Craig
No, it is not a read-only share... considering it restores the
Roy,
I appreciate the update. Glad you relocated safely and I trust
the same is true for your family, friends and colleagues.
I hope things get back to normal quickly and I look forward to
helping with BackupPCd.
Regards,
Craig
Roy Keene writes:
Craig,
There has been no progress on
hbeaumont hbeaumont writes:
I've got a test server set up to backup two servers. Right now backuppc is
running :
-bash-2.05b# ps axuw | grep dump
backuppc 2084 1.4 5.1 508280 24688 ? S 10:00 1:49 /usr/bin/perl
/usr/local/BackupPC/bin/BackupPC_dump host1
backuppc 2085 0.2 62.7 758832
[EMAIL PROTECTED] writes:
Every time I do a full backup, BackupPC seems to hiccup on the last
file, as such:
File /usr/local/backuppc/data//pc/lap2/XferLOG.0.z
Contents of file /usr/local/backuppc/data//pc/lap2/XferLOG.0.z, modified
2005-09-09 16:31:30
Running:
Julian Robbins writes:
I've been using BackupPC for a couple of years now and am very pleased
with it. Using vers 2.0.2 (yes I know I could do with updating it ;-) ).
But I am confused. When I set my config as below, I still get 7 full
backups, and 18 incrementals. The oldest Full backup
Jean-Christophe Pinoteau writes:
I am a newbie to BackupPC and I am planning to use it to back up users'
data on several hosts.
There is a question I couldn't find an answer to in the manual: how
does BackupPC deal with large files like databases? When BackupPC
creates a new
Carl Wilhelm Soderstrom writes:
I'm running out of disk space on my backup server, and it's run out of space
on a couple of occasions. When it does this, some hosts 'forget' all their old
backups -- those backups no longer appear in the pc/hostname/backups file.
I know the backups.old file
timonin writes:
I have trouble running BackupPC
I've just made an install, edited the config file and tried to run it
for the first time.
In LOG file I get
unix bind() failed: Operation not permitted
I use rsync as transport, my OS is RH8 and data directory of
BackupPC(which
Les Mikesell writes:
On Tue, 2005-09-13 at 10:15, Justin Pessa wrote:
Thanks for your reply Les.
Can you then clarify for me what exactly is the purpose of the Archive
Host feature in BackupPC if the backups are not stored on an Archive
server?
I think it is a little confusing too,
Peter Padberg writes:
At this time BackupPC is making 2 other backups,
but BackupPC does not restore my files and puts the restore into the queue!!
After I stopped first the one and then the other backup, I thought BackupPC
would now start my restore!
But NO!
It starts BackupPC_nightly and I must wait for my
Andre Helwig writes:
How can I back up a Sun with tar?
My client conf for the Sun is:
##sun config file
$Conf{XferMethod} = 'tar';
$Conf{TarShareName} = ['/etc'];
$Conf{TarClientCmd} = '/usr/bin/ssh -q -x -n -l root 10.0.0.3
/usr/sbin/tar -c -v -f - $shareName';
#EOF
it will not
Paul S. Gumerman writes:
This one is easy --- I fixed this on my installation just the other day.
Change this conf file setting:
$Conf{BackupZeroFilesIsFatal} = 1;
to
$Conf{BackupZeroFilesIsFatal} = 0;
That's right. However, the reason that the default is fatal is
that it really
m l writes:
Server: SUSE linux 9.1
running BackupPC 2.1.2
using XferMethod = rsyncd
rsync 2.6.2
client Linux:
RedHat EL 3 WS
rsync 2.6.2
Problem: I am able to back up with no problem.
But when I try to restore on the RedHat workstation,
the log file shows: Starting restore and nothing
Clive Allen writes:
I have created the following Blackout Period which I want to go from
07:00 to 23:58 all days (Or in other words I want the backup to start at
00:00 every day)
What are the values of $Conf{BlackoutBadPingLimit} and
$Conf{BlackoutGoodCnt}? Are these in the per-PC config
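For a 07:00-23:58 blackout every day, a per-PC override along these lines should work (a sketch only: it is written to a demo path here rather than the host's real config.pl; 23.97 hours is roughly 23:58, and BlackoutGoodCnt = 1 assumes a single good ping should enable the blackout):

```shell
# Write a demo per-PC config fragment with an all-day blackout
# from 07:00 to ~23:58 on every weekday (0..6).
cat > /tmp/demo_host_config.pl <<'EOF'
$Conf{BlackoutGoodCnt} = 1;
$Conf{BlackoutPeriods} = [
    { hourBegin => 7.0, hourEnd => 23.97, weekDays => [0, 1, 2, 3, 4, 5, 6] },
];
EOF
cat /tmp/demo_host_config.pl
```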
Alex Schaft writes:
I've just configured a DHCP pool, and I now see jobs being queued like
BackupPC_dump -d 10.1.1.100
Is this a discovery of potential backup clients?
Yes, it scans the entire DHCP address range.
However, this is really an old feature since hosts can usually be
looked up
Andrew Zbikowski writes:
Option 1: symlink /var/lib/backuppc to your backup drive.
Option 2: mount your backup drive to /var/lib/backuppc
The path is set at build/install time; there isn't a way to change it
after installation. Somewhere in the documentation symlinking or
mounting is
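A sketch of option 1 on throwaway paths (on a real system, stop BackupPC first, and keep the whole data tree on one filesystem so the pool hardlinks keep working):

```shell
# Relocate the data directory to a bigger drive and symlink the
# original location to it (paths are illustrative).
mkdir -p /tmp/bigdrive/backuppc
rm -f /tmp/var_lib_backuppc
ln -s /tmp/bigdrive/backuppc /tmp/var_lib_backuppc
readlink /tmp/var_lib_backuppc
```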
Ski Kacoroski writes:
Can I just remove older backups by hand, then, without mucking it up? I
was concerned because it keeps a record of the older backups in the
backups file. I would like to
rm -rf 102 XferLOG.102.z
You also need to remove the corresponding line from the
backups file
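A sketch of that second step on a demo copy (field layout is illustrative; in the real per-host `backups` file, field 1 is the backup number):

```shell
# After rm -rf of the numbered directory and its XferLOG, drop the
# matching line from the per-host backups file as well.
printf '101\tfull\n102\tincr\n' > /tmp/demo_backups_file
awk -F'\t' '$1 != "102"' /tmp/demo_backups_file > /tmp/demo_backups_file.new
mv /tmp/demo_backups_file.new /tmp/demo_backups_file
cat /tmp/demo_backups_file
```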
Dirk Erasmus writes:
How can I manually clear out all previous backups in order to start a fresh
pool?
If you want to start *completely* over, remove:
- everything below TOPDIR/pool and TOPDIR/cpool
- remove all the TOPDIR/pc/*/backups files
- remove all the numbered directories
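The steps above can be sketched as follows, on a throwaway TOPDIR (this DESTROYS all backup data; stop BackupPC before doing it for real, and substitute your actual data directory):

```shell
# Build a fake data tree, then wipe pool, cpool, the per-host backups
# files, and the numbered backup directories.
TOPDIR=/tmp/demo_topdir
rm -rf "$TOPDIR"
mkdir -p "$TOPDIR/pool/a" "$TOPDIR/cpool/b" "$TOPDIR/pc/host1/0"
touch "$TOPDIR/pc/host1/backups"
rm -rf "$TOPDIR"/pool/* "$TOPDIR"/cpool/*   # everything below pool and cpool
rm -f  "$TOPDIR"/pc/*/backups               # all per-host backups files
rm -rf "$TOPDIR"/pc/*/[0-9]*                # all numbered directories
ls -A "$TOPDIR/pool" "$TOPDIR/cpool" "$TOPDIR/pc/host1"
```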
Regis Gras writes:
I asked this question, but without any response, and I always have
the same problem.
With BackupPC, I can restore some files or some directories, but
I can't restore all the files.
To restore all the files, I have to select a few files or directories at
a time, and to
Samuel Bancal writes:
We're backing up nearly 60 workstations ... and the users are really
enjoying it!
... except one today ... We've got a strange problem:
One user had to restore files on a client and noticed that some files
weren't restored.
This client was backed up through ssh with
Damian O'Hara writes:
I was thinking about the same thing. To be able to see graphically
which servers kicked in at what times and whether the backup window
is coping with the queue list would be great. To colour successful
backups green and failures red would also be useful. If done in html,
Rich Duzenbury writes:
I was fooling around a bit and added an RSS feed for my backuppc
installation. It's not very complete yet, but does have the basics, see
below. Created on 2.1.1. Requires XML::RSS, available from CPAN.
Thanks for submitting this. I'm not very familiar with
RSS,
Brendan Simon writes:
Could someone tell me what the following errors mean?
Unexpected call
BackupPC::Xfer::RsyncFileIO->unlink(john/aegis/CN.1.5.1.4.C117/images/CN-image.tar.gz)
[ skipped 21 lines ]
Unexpected call
Peter Padberg writes:
Is it possible that BackupPC runs
BackupPC_link
even when BackupPC_dump runs?
I have 40 servers here for backup.
3-4 servers sometimes need more than 20-30 hours for 1 backup.
During this time BackupPC queues user requests, many BackupPC_link runs and
other stuff.
The
Andrew Grieve writes:
Hey, thanks for the patch, worked great! I hope this gets incorporated
into backuppc, as it's quite an easy fix to solve a very annoying problem :)
I still need to understand what version of tar you are using.
GNU tar shouldn't produce these messages. A different
Mirco Ellis writes:
By the way, if I do an archive using smb to say /tmp, the whole backup
is archived. When I do tar -zvxf
all the data is available and the size (26 GB) corresponds with the actual
size of the data being backed up. I will try and get the info you requested
asap.
about to get on a plane to Europe, so I'll be out
of touch for a while.
Craig
-- Forwarded message --
To: Samuel Bancal [EMAIL PROTECTED]
From: Craig Barratt [EMAIL PROTECTED]
Cc: Jean-Raymond FISCHER [EMAIL PROTECTED]
Date: Wed, 09 Nov 2005 14:39:08 -0800
Subj: Re: Couldnt
Jan Kellermann writes:
I set BackupPC to back up two workstations. That works fine. But now it
says The connection was 7 times good... and the backup will run outside
the working time, but the workstations are turned off at night.
Does BackupPC run the backup when the clients are turned on again
Ski writes:
On the server, try going into the directory of one of the files that
appears as a folder in the gui and moving the attrib file to attrib.OLD.
Then look at the files in the gui again. If they show up as files and
not folders, then the patch should have fixed the problem. If
Sam Przyswa writes:
I can't back up a large /home directory with BackupPC 2.1.1
and XferMethod 'rsyncd' on Debian sarge; in the BackupPC server log I got:
Is the last full backup an (unsuccessful) partial backup?
It could be that there is a problem with backing up starting with
a partial.
David Koski writes:
snip log
Unable to read 621621 bytes from /home/backuppc/pc/mail/new//f%2f/RStmp
got=208181, seekPosn=23909067 (1536,253,10003,24117248,24576026)Unable to
read 621621 bytes from /home/backuppc/pc/mail/new//f%2f/RStmp
snip log
The second line is repeated about 125
Carl Wilhelm Soderstrom writes:
Why doesn't anyone like running rsyncd on a windows box standalone?
It's not encrypted. Neither for the transfer, nor for the authentication.
Don't assume that your local network is safe. :)
2. rsyncd can be set to only allow connections from a single
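A sketch of what that restriction looks like in rsyncd.conf (the module name, paths, and IP are illustrative, and the file is written to /tmp here rather than its real location):

```shell
# rsyncd module locked down to the backup server's address via
# "hosts allow", with password auth and read-only access.
cat > /tmp/demo_rsyncd.conf <<'EOF'
[cDrive]
    path = /cygdrive/c
    read only = true
    auth users = backuppc
    secrets file = /etc/rsyncd.secrets
    hosts allow = 192.168.1.10
EOF
grep 'hosts allow' /tmp/demo_rsyncd.conf
```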
Les Mikesell writes:
On Fri, 2005-12-02 at 10:16, Andy wrote:
I see that the UIDs and GIDs recorded in the XFER log match those from
the directory listing. Great.
I have downloaded and restored the tar archive from the most recent
backup, this time using the -p option to preserve
Paul Fox writes:
i just did a restore of a directory (happily not because of
disaster, but because it was an easy way to get at some files
that live on a machine that's currently offline) and had a big
surprise.
i was accessing an incremental backup tree. since all backups
are filled, i
Craig Barratt writes:
Yes, your explanation is correct. Tar and Smb incrementals are based
only on mtime, so adding/deleting/renaming files, changing other
meta-data, or unpacking an archive with old mtimes won't be
detected.
I didn't mean to include file creation in this list.
Normally
[EMAIL PROTECTED] writes:
Hm. Well I thought I found the problem (I couldn't ssh to [EMAIL PROTECTED]
without getting a password prompt) but after I fixed that the problem
still remained. Ratz!
Here's the restorecmd:
$Conf{TarClientRestoreCmd} = '$sshPath -q -x -l root $host /usr/bin/env
Magnus Larsson writes:
Thanks for pointing me to the typo earlier. Here is another issue that
I don't seem to understand.
My plan is to burn the backups to DVDs. So I set up a host, archivehost, by
creating a file archivehost.pl containing this:
$Conf{XferMethod} = 'archive';
Lachlan Simpson writes:
I am backing up one file server (Windows 2000 Server) via bppc (Debian
stable) using rsyncd on cygwin. It's regular cygwin, not the backuppc
version.
The file server has two partitions - the regular C and another
called F. Their respective RsyncShareNames are
Yann writes:
I am using BackupPC v2.1.2
I have backed up a machine during the night...
When I look at the summary of this machine, I have 13 errors
If I click on errors, I can't see any errors
But if I click on JournalXfer, I can see my 13 errors : NT_STATUS_DENIED,
etc .
Trey Nolen writes:
The files are stored in a compressed (not strictly gzip) format, which is
probably throwing the scanner off.
you could always try storing the files uncompressed, by specifying the
following option:
$Conf{CompressLevel} = 0;
try that on the host in question, and
Avi Norowitz writes:
After making these changes, the following is the output:
[EMAIL PROTECTED] bin]$ /usr/local/backuppc/bin/BackupPC_dump -v -f
migration1b.jvds.com
$VAR1 = {};
Exiting because host migration1b.jvds.com does not exist in the hosts file
[EMAIL PROTECTED] bin]$
After
Paul Fox writes:
hi -- i recently realized that there are some pretty big files on
my system that change frequently, and which don't need to be
backed up -- mail index files, for example.
i'd like to be able to flag any file or directory that i want
backuppc to skip by adding a ._nobackup_
Les Mikesell writes:
On Tue, 2005-12-20 at 16:13, Jon Scottorn wrote:
On Tue, 2005-12-20 at 16:10 -0600, Brown, Wade ASL (GE Healthcare)
wrote:
Also, are you rsync'n/copying individual directories
(/backup/cpool, /backup/hosts, etc.)? Or the entire /backup
directory?
Individual
Jamie Myers writes:
I was hoping someone might be able to help me out on this small problem
I have BackupPC up and running and it has been backing up a host a few times.
It always fails (I suspect that might be a different problem) but when I
look at the backup later it looks like this:
Gareth Jones writes:
I thought that I'd read somewhere in the mailing list/doc that you needed to
use ? for spaces in file names (which I avoid unless forced by Windows!).
Must have got it wrong - as it all appears to work now - though I'm still
getting some permissions problems - but that's
Justin Best writes:
I feel stupid, but I can't figure out where BackupPC gets its host
information for generating links. For my installation, the link BackupPC
sends out in its emails is missing a host name:
--
It is recommended you backup the Outlook files when you are connected
Jamie Myers writes:
I was hoping someone could point me in the right direction to fix this
problem.
I am getting the following error message when I try to hit the cgi app
Can't locate BackupPC/Lib.pm in @INC (@INC contains: /home/backuppc/lib
Look at the first few lines of
Les Mikesell writes:
Do I have to force a full backup if I add a new share to a host?
If you don't, I think the non-rsync methods would only take
files newer than your last full run. Rsync would copy everything
but would repeatedly copy everything until it does a full.
Exactly right.
Justin Best writes:
First, my congratulations to Craig for creating a truly useful and unique
piece of software. I am very impressed.
You sure know how to get your questions answered :).
The problem is that the regexp that matches CgiAdminUsers
isn't robust to special characters in the
Andrew Rice writes:
OK, has anyone else come across this error when trying to run the cgi-bin script
from the webpage?
Software error:
Can't locate BackupPC/Lib.pm in @INC (@INC contains:
/home/backuppc/install/lib
/usr/lib/perl5/5.8.5/i386-linux-thread-multi /usr/lib/perl5/5.8.5
Is
Les Mikesell writes:
On Fri, 2006-01-20 at 16:44, Dan Pritts wrote:
On Fri, Jan 20, 2006 at 03:53:33PM -0600, Les Mikesell wrote:
I'd expect to see quite a lot of temp file activity that would
result in changes to unused space on the live system. It would
Yeah, you're right - i
Marko Tukiainen writes:
I tried sending this message to the list yesterday, but seems that I
wasn't subscribed with this exact address, so sorry if this a double
post. Anyway, I've added some further info to this question.
I'm having some problems backing up large files (8GB). By some
Dan Pritts writes:
rsync has a sparse-file mode, but backuppc doesn't enable it.
Is this an omission or intentional?
Both. The server (BackupPC side) doesn't use native rsync.
The code mimics rsync, and not all the features are implemented.
The notable omissions are hardlinks, compression
Edward Alfert writes:
I'm not sure if the rsync -H (--hard-links) option is supported by BackupPC
2.1.2pl0.
It's not supported in that version.
CVS has the necessary support for hardlinks, but a new version of File::RsyncP
is needed too, which I haven't tested enough to release yet.
Craig
[EMAIL PROTECTED] writes:
One last question - I am only going to set up this for 4 machines.
I would like them to be backed up at night. Is there a way to back
up these 4 at night and the rest during the day?
See the Blackout config settings.
Craig
Brendan Simon writes:
I notice that if I do a full backup the summary page for the host
reports filled=yes, whereas if I do a incremental backup the page
reports filled=no.
Is that correct? The incremental backup succeeds without any errors so
filled should be yes shouldn't it ???
Les'
Justin writes:
Quick question. My status page for the BackupPC server is missing the host
name:
BackupPC Server Status
General Server Information
* The servers PID is 994, on host , version 2.1.1, started at 1/16
15:53.
* This status was generated at 1/26 15:42.
Glenn Tofte writes:
I backed up the share on a Windows 2000 Server via SMB, using BackupPC
(BackupPC-2.1.2) on Fedora Core 4.
I am confused about the Host Backup Summary.
The share that I backed up contains 77,410 files and is 54GB (according
to Windows 2000 server).
On the report
AUF - Jérôme Santini writes:
Craig Barratt writes:
AUF - Jérôme Santini writes:
I just installed BackupPC today. All went fine until I added my NFS
fileserver to the list of hosts to be backed up.
When I try to back up this server, it fails each time on the same user
directory
Raf writes:
Since I'm very interested in trying the new editConfig.pm
module, I've just downloaded BackupPC from CVS.
I'm not a software developer, so this may be
trivial, but I can't run makeDist. I've read the CVS_README
but found no hints to resolve the problem.
I report the complete output
.
Thank you very much for your help,
Tomas
No need for additional tests - I'm confident that this fixed
the problem.
Thanks for the feedback.
Craig
Craig Barratt wrote:
Tomas Florian writes:
Yes I definitely have at least one 9 GB file. Is that the problem?
Yes, most
Christophe Faribault writes:
Our backup server (BackupPC 2.1.0) recently crashed so I setup a new server
with BackupPC 2.1.2. It has a LVM of 400Gb (2x200Gb) where are stored cpool,
pc, and pool. The actual install dir is /home/backuppc. I have symlinks in
/home/backuppc/data/ for cpool, pc
Guillaume Rousse writes:
I experienced an unexplained file wipeout yesterday on my server at around
0:40. When searching my available backups, I had the unhappy surprise of
discovering my backup policy was absolutely not working: I just had a
full backup from 4 months ago, and an incremental
Justin Best writes:
Sorry, I'm a Windows guy by default... how does one apply a patch?
This one is so simple that you could just do it by hand:
- edit the file bin/BackupPC_tarExtract
- look for a line that looks like this:
= 'Z100 A8 A8 A8 A12 A12 A8 A1 Z100 A6 A2 Z32 Z32 A8 A8
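The general patch(1) recipe, for anyone who prefers not to edit by hand, demonstrated here on a scratch file (for the BackupPC diff you would run patch from the install root, with a -p strip level matching the paths inside the diff):

```shell
# Create a scratch file and a unified diff, then apply the diff.
mkdir -p /tmp/patchdemo && cd /tmp/patchdemo
printf 'old line\n' > file.txt
cat > fix.diff <<'EOF'
--- file.txt
+++ file.txt
@@ -1 +1 @@
-old line
+new line
EOF
patch -p0 < fix.diff
cat file.txt
```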
Carl Wilhelm Soderstrom writes:
I'm backing up one (last!) win9x box over SMB, and I can't exclude files.
In the per-host config file, I have the following line:
$Conf{BackupFilesExclude} = {
'c' => ['/RECYCLER/*', '/temp/*', '/WUTemp/*', '/WINDOWS/*',
'*/Temporary?Internet?Files/*' ]
Tony Nelson writes:
I'm running BackupPC-2.1.2 quite successfully on my network. Over the past
couple of days something very odd has happened backing up Lotus Notes mail
files on one particular server. Looking at the transfer log I find:
[snip]
create 644 400/401 163053568
Paul Fox writes:
is this patch, submitted some time ago, considered the correct fix
for the excessive messages from incremental tar runs? i just went
hopefully looking for a tar option to suppress that message, but
didn't find one. :-/
There is a patch on SF.net, BackupPC-2.1.2pl0.diff,
Cyril GUILLERMINET writes:
I've got a problem with backuppc using a NAS appliance to store the pool
of data.
My appliance is a snap500, its root is mounted on my server via NFS and
the backuppc's pool of data is stored on it. When I'm starting
backuppc, this appears in my logs :
[EMAIL PROTECTED] writes:
As you can see, this is going pretty slow on my side, but I'll keep making
progress from time to time.
I tried to do the easiest thing first, i.e., switching to using rsyncd,
instead of rsync, and hardcoding the rsyncd modules in the config file for
a particular
Vijay Avarachen writes:
I seem to have BackupPC working smoothly but I am having difficulty
narrowing down the files that I want to back up. I am using tar to back up my
Linux host and I want to back up only files with specific extensions. I know
how to do this via the command line
Olivier LAHAYE writes:
As I'm really interested in BackupPCd (to back up open files on Windows),
I'd like to test it so I can help with debugging.
BackupPCd is under development and I doubt it is ready for
more general use. Roy could advise.
Also, it does not yet have open file support for
dosseh edj writes:
My BackupPC server worked well. But the other day, I couldn't start the
server due to this message:
$Conf{SendMailPath}='' '/usr/sbin/sendmail'
is not a valid executable program. What could be the problem? Please help.
Your setting of $Conf{SendMailPath} is not
Erik Meitner writes:
Hi. We are running BackupPC V2.1.1 (Debian 2.1.1-2sarge1). The BackupPC
pool is on a 600 GB ext3 partition. For one of our users who has a
fairly deep directory structure we get a lot of unable to link errors
(see end of message). The files in the pool are not at the
ROBERTO MORENO writes:
I have been using BackupPC for a while and everything is great,
but last time I checked for my job, the backup numbers were missing
on the web front end. On the back end everything is still there.
For some reason the old backups start at 11, 12, 13 and so on.
The new
Pierre Hourdebaigt writes:
I am a new backuppc user (5 days ago). I use it to backup XP clients,
Fedora (FC4) and Mandriva (2006) clients. The server is on debian Sarge 3.1.
I can backup and restore without problem.
But I have noticed that the 'BackupFilesExclude' parameter doesn't work for
my
David Brown writes:
I've been using backuppc for several days, and I really like the concept
behind it. The web interface is very helpful. However, I'm having a very
hard time figuring out what to store the backup filesystem on.
I've tried both XFS and ReiserFS, and both have utterly
Erik Meitner writes:
Craig Barratt wrote:
Erik Meitner writes:
Hi. We are running BackupPC V2.1.1 (Debian 2.1.1-2sarge1). The BackupPC
pool is on a 600 GB ext3 partition. For one of our users who has a
fairly deep directory structure we get a lot of unable to link errors
(see
Winston Nolan writes:
I have this error for two of my Linux boxes. What can I do to fix it?
error:
2006-02-21 10:14:36 intranet.lka.co.za: Use of uninitialized value in
chdir at /usr/lib/perl5/5.8.6/File/Find.pm line 741.
2006-02-21 10:14:36 intranet.lka.co.za: Use of chdir('') or
Khaled Hussain writes:
1. What is meant by pool exactly? Is this referring to all previous backups?
Is this referring to files that are common between computers?
A single copy of every file is stored in the pool, whether or
not it appears multiple times among the backups.
2. I have seen on
Justin Best writes:
When Outlook is running, the PST file is locked and the backup fails
I've been going with the assumption that this error was due to the
apostrophe in the PST file name, since I *thought* the error disappeared
when I renamed Madalyn's Personal Folders.pst to Madalyns
Travis Wu writes:
I want to have BackupPC run hourly, so last night I configured
$Conf{WakeupSchedule} = [1..23];
$Conf{IncrPeriod} = 0.04;
since 1/24=0.042. However, the backup summary shows:
4    incr    no    2/20 23:00
5    incr    no    2/21 01:03
6
Travis Wu writes:
Just double checked again.
the setting looks like below:
pc/myhost/config.pl
$Conf{BlackoutGoodCnt} = 0;
conf/config.pl
$Conf{BlackoutPeriods} = [
{
hourBegin => 7.0,
hourEnd => 19.5,
weekDays => [1, 2, 3, 4, 5],
},
];
Jean-Yves F. Barbier writes:
All machines have a Debian Sarge installed.
I found that this machine's groups don't match the server's for
/NFS (but ugidd is installed on the NFS server, which also does
BackupPC)
Strangely, only the mirror, which is on /NFS/1/MIRROR (and belongs
Jean-Yves F. Barbier writes:
Ooops, sorry Craig, I just delete the directories.
I'll launch new backups tomorrow afternoon and keep you aware about
results.
Ok. If you are using an automounter, another theory is that the
exclude actually isn't working, and on most systems the file system
Jean-Yves F. Barbier writes:
All machines automount the /NFS shares @ boot :(
All machines have an existing /NFS mounted @ the backup time,
and only one includes it in its backup.
The only difference between machines that I don't understand is the
impossibility of umounting the /NFS/x on the
Tristan Krakau writes:
due to the rsync-cygwin-ssh problem when backing up Windows clients
using 'rsync' as transfer, I tried to use 'rsyncd' through a ssh tunnel
instead (since the Windows client can only be accessed via ssh).
I found other threads dealing with this topic (e.g.
Rodrigo Real writes:
I am having a problem with a host that must be backed up with
BackupPC. This host runs an ssh server on port 222, so I changed
the variable $Conf{RsyncClientCmd} to meet this requirement, but when
I try to run the backup on this host, I receive the error: Unable to