Alex writes:
The folder shows up in the backup as 0750. The -p is present
Do you mean when you look at the directory permissions below
the PC directory on the backup server; eg, the output from:
ls -ld /TOPDIR/pc/HOST/nnn/fshare/fhome
What permissions are shown when you browse to that
Alex writes:
[r...@qsbackup f%2f]# pwd
/opt/backuppc/files/pc/mail/184/f%2f
[r...@qsbackup f%2f]# ll
total 16
-rw-r----- 3 backuppc backuppc 26 Apr 17 05:04 attrib
drwxr-x--- 5 backuppc backuppc 4096 Apr 17 06:00 fetc
drwxr-x--- 3 backuppc backuppc 4096 Apr 17 06:03 fhome
drwxr-x---
John writes:
$Conf{SmbShareName} = [
'C$'
];
#FILES TO BACKUP
#-
$Conf{BackupFilesOnly} = {
'c' => ['/MS_OUTLOOK/*'],
};
First, the 'c' should be 'C$' - it should match the share name.
Also, you can't use wildcards in
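Something along these lines should work instead (a sketch; the key
matches the share name and the wildcard is dropped):
$Conf{BackupFilesOnly} = {
    'C$' => [ '/MS_OUTLOOK' ],
};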
Tim writes:
This is a new install so I thought I would try the beta version
do you recommend I go back to the stable version?
If you are willing to test the beta version some more that
would be great. You've already found one bug :).
Holger told you where to get the File::Listing module.
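If it isn't packaged for your distro, CPAN can install it directly:
perl -MCPAN -e 'install File::Listing'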
Holger writes:
two things are really confusing me:
1.) The title claims that it is supposed to be an *rsync* xfer, the error
message clearly indicates that *ftp* is attempted (and fails). Tim, could
you please clarify which transfer method you are trying to use?
The code loads all
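For reference, the transfer method is selected with $Conf{XferMethod},
e.g. (one of 'smb', 'rsync', 'rsyncd', 'tar', or, new in 3.2.0, 'ftp'):
$Conf{XferMethod} = 'rsync';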
Obj writes:
I am running version 3.2.0. can someone tell me why
$Conf{BackupFilesExclude} is not working. It still backs up all Temp
folders, and .mp3 files, etc. The backup method is SMB.
You sent me offlist your config file and XferLOG file. Thanks.
The problem is that if you use
Fatih writes:
I want to translate BackupPc's CGI and the installation part from
English to Turkish.
How can I do this? Is there anyone who is responsible for this kind of
work? Who can give me some advice on where I should begin?
You should look in lib/BackupPC/Lang. Each language has its
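The files there are plain Perl; a new tr.pm would hold entries like the
following (hypothetical key and translation):
$Lang{Home} = "Ana Sayfa";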
Tim writes:
Hi I just installed the latest backuppc
version 3.2.0beta0. When I try to do
a full backup of a test host I'm seeing
this error in the log
2009-04-12 11:07:14 User backuppc requested backup of scvffs09 (scvffs09)
Can't locate File/Listing.pm in @INC (@INC contains:
Boniforti writes:
carola/Desktop/CAROLA/Mise à jour des prix 2009-04-01.xls: size doesn't
match (14702080 vs 0)
Can you tell me why it reports things like "size doesn't match"?
Could you please explain what's going on?
You have a high log level enabled ($Conf{XferLogLevel}), so you
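Dropping it back to the default quiets the per-file messages:
$Conf{XferLogLevel} = 1;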
Chris writes:
I didn't see any mention of lib/BackupPC/Lib.pm being updated for the
case when XFS is used as the pool file system and IO::Dirent is
installed (as per
http://www.mail-archive.com/backuppc-de...@lists.sourceforge.net/msg00195.html).
Looking at the source of the Beta, this
Bharat writes:
On the host page - I selected SMB with sharename set to D$
I've setup my username and password (administrator)
I've unticked BackupFilesExclude and Ticked BackupFilesExclude and tried
various formats
(including */Temp/* as found in EXCLUDE!!! - yes there is a Temp folder on
John,
It's still on my todo list - I didn't get around to it for 3.2.0beta0.
I'll see if I can get it in before the final 3.2.0 release.
Craig
BackupPC 3.2.0beta0 has been released on SF.net.
3.2.0beta0 is the first beta release of 3.2.0.
3.2.0beta0 has several new features and quite a few bug fixes
since 3.1.0. New features include:
* Added FTP xfer method, implemented by Paul Mantz.
* Added more options to server backup command:
Madcha writes:
For the past few days, trashClean has been starting but won't clean old
backups, and I don't understand why.
There is nothing in the trash folder.
That means BackupPC_trashClean is working: its job is to remove
everything that appears in $TopDir/trash.
It is perhaps for that reason that it does
Mirco,
Here is the error log, mailed from BackupPC:
The following hosts had an error that is probably caused by a
misconfiguration. Please fix these hosts:
- elpra01lc (Call timed out: server did not respond after 2
milliseconds opening remote file \ELPRA01WS\ELPRA06SV\E
Obj writes:
I am running version 3.2.0.
Actually you are running CVS.
can someone tell me why
$Conf{BackupFilesExclude} is not working. It still backs up all Temp
folders, and .mp3 files, etc. The backup method is SMB.
Can you send the first few lines for the XferLOG file?
Craig
Pedro writes:
That's good news. Where can we see what new stuff is in this upgrade?
Here is the current ChangeLog. This should be pretty much what is
in 3.2.0beta0.
Craig
* Added BackupPC::Xfer::Protocol as a common class for each Xfer
method. This simplifies some of the xfer specific
Tomasz writes:
Are there any plans to update File-RsyncP to make it compatible with
newer rsync protocol versions?
I'm experimenting with FUSE to see if native rsync3 + FUSE will
be the best path. Otherwise, yes, I will update File-RsyncP.
Craig
David writes:
The short story is that you need to configure BackupPC to wake up only
once per day, in order for wakeonlan to work in a reasonable manner.
That should be fixed in 3.2.0beta0:
* Moved call to NmbLookupFindHostCmd in BackupPC_dump to after the
check of whether a backup
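Until then, David's workaround is a single daily wakeup, e.g.:
$Conf{WakeupSchedule} = [ 22 ];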
Jeff writes:
Sounds cool... I imagine this is in line with the thread we had a few
months ago.
Yes, that's right. I want to be sure the performance and reliability
are high enough before making the decision.
Craig
John,
I am seeing corrupted directory listings using BackupPC_tarCreate. One
of the reported filenames has a bunch of nulls in the middle of it
using BackupPC-3.1.0.
I'd like to get to the bottom of this. Let's take this off list.
It would be great if you could get this to happen on as
J:
Is it possible to get a CVS copy?
I tried: cvs -z3
-d:pserver:anonym...@backuppc.cvs.sourceforge.net:/cvsroot/backuppc co
BackupPC
...but received the dreaded __CONFIGURE_BIN_LIST__ error when I ran
the ./configure.pl
You need to read CVS_README (actually I need to update this
Les writes:
I would absolutely love it if the top level directories were still
created by backuppc first before doing the hardlink test. If those
directories are created because they don't exist and the hardlink
test fails then just remove the directories. Or leave them there,
all you've done
Paul writes:
Here's what I'm getting:
full backup started for directory /etc (baseline backup #50)
Running: /usr/bin/rsync --server --sender --numeric-ids --perms --owner
--group -D --links --hard-links --times --block-size=2048 --recursive
--filter dir-merge\\\ /.backuppc-filter
Xavier writes:
*) BackupPC didn't work correctly on one host
#86 was supposed to be a full backup, but when browsing I found out that it's
missing a lot of directories (/bin, /home, ...)
size of backup# on disk
2,8G 86
9,3G 87
9,4G 88
Moreover, when trying to read the logfile, I found
Paul writes:
I tried just changing 'RsyncClientCmd' to $rsyncPath $argList+ but it
seems BackupPC is expecting the SSH and is now improperly escaping
'RsyncArgs'. The hitch is with a space in one.
$Conf{RsyncClientCmd} = '$rsyncPath $argList+';
$Conf{RsyncArgs} = [
'--numeric-ids',
Pramathesh writes:
The BackupPC documentation mentions that old unmangled file
names are still supported by the CGI interface. However, I have not been
able to figure out how and where this option can be set.
What that means is backups taken with very old versions of BackupPC
(when file
John writes:
Can anybody confirm that xferlogs are not being written if
DumpPreUserCmd exits non-zero with $Conf{UserCmdCheckStatus} = 1? Also
does anybody know if it is fixed in a subsequent release?
Yes, this looks like a bug. An error will be written to the per-client
LOG file. But in
Ski writes:
I have a windows client that has been working fine for over a year and
now there are three files in the 6 - 7GB range that it just ignores. I
am using cygwin-rsyncd 2.6.6 and backuppc 2.1.2. I was able to force a
backup of one large file by excluding all other directories except
sabujp writes:
In the last command that runs BackupPC_tarPCCopy, does this perl command look
at any of the configuration files on the local host or does it just get what
it needs to re-generate the hard links straight from the old pc directory?
I looked through the code and don't see that
Matthias writes:
I back up a Windows client with rsyncd over ssh. I am pretty sure the ssh
connection was interrupted at 23:27.
In the /var/lib/backuppc/pc/st-ms-wv/XferLOG.0.z I found the error message:
create 770 4294967295/4294967295 240986
Help/Windows/de-DE/artcone.h1s
Read
Paul writes:
In case this is of use to others, I tweaked the BackupPC_archiveStart
script to properly (IMHO) deal with the ArchiveComp setting. While my
coding style may be icky to some, I think my removal of the .raw file
extension for uncompressed archive files may be an issue for others.
David writes:
I took a closer look at the perl code and I see the cause of the problem.
Please note I have no DNS. My PCs use DHCP, but are configured in BackupPC
with the host table's DHCP flag set to zero.
Here is what I think is happening:
1. BackupPC_dump is called periodically at
James writes:
I wanted to be able to notify a group of people when any backup starts
or ends, so I did some googling and found an archived email on this
list about how I might do it.
I made a script with the following contents as a test and named
it startbkpemail.sh (just used the example
BackupCentral.com has generously offered to contribute some free
banner ads for BackupPC on their site.
To take advantage of this offer I need someone with some graphic
skills to generate a couple of banner images with particular
geometries. If you are willing to contribute some time please
Tony writes:
I missed the original post, but I run rsync with the --whole-file
option, and I still get RStmp files; is that not supposed to happen?
RStmp is a temporary file used to store the uncompressed pool file,
which is needed for the rsync algorithm. It's only used for larger
files -
Cody writes:
I'd be willing to do a lot of the cleaning myself, though I don't want
to step on anyone's toes without talking with you first. Also, my
knowledge of BackupPC is fairly limited to my setup (XP/Vista clients
Ubuntu server).
I agree it isn't very well organized. I don't think
Nick writes:
I've tried putting ';' in between the two commands, and I've tried '&&'
as well; with that in there, it seems to run both commands, but the
variables aren't being pulled from backuppc, so the email doesn't work
correctly. And also the script that runs my vshadow commands doesn't
seem to be
Brian writes:
* 0 pending backup requests from last scheduled wakeup,
* 0 pending user backup requests,
* 0 pending command requests,
* Pool is 0.00GB comprising 0 files and 0 directories (as of 1/29 01:00),
* Pool hashing gives 0 repeated files with longest chain 0,
*
Jean-Michel writes:
$Conf{BackupFilesExclude} = [
'/Users/garant/Library/Preferences/ByHost/*00224126372e.plist' ];
notice the wildcard '*' in the file list...
but it seems that BackupPC_dump stats the file BEFORE excluding it
from the backup, because there is a failed to open
Christian writes:
I'm having some issues with excluding directories.
If have the following settings in the host.pl:
=snip===
$Conf{RsyncShareName} = [
'/',
'/srv'
];
$Conf{BackupFilesExclude} = {
'srv' => [
'file1',
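Note that the hash keys must match the share names exactly, so a likely
fix (a sketch) is:
$Conf{BackupFilesExclude} = {
    '/srv' => [ 'file1' ],
};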
Jeff writes:
Are you sure that you can't get rsync to calculate the checksums (both
block and full-file) before file transfer begins -- I don't know I'm
just asking..
I believe rsync's --checksum option precomputes and sends the whole
file checksum (which, as has been noted, is different to
Sil writes:
$Conf{ArchiveClientCmd} = '$Installdir/bin/BackupPC_archiveHost'   <= add -b 10 here?
    . ' $tarCreatePath $splitpath $parpath $host $backupnumber'
    . ' $compression $compext $splitsize $archiveloc $parfile *';
I don't know how to write this, or where to place it?
Did
Matthias writes:
If a user requests a restore I want to restore one extra file and handle it
by the RestorePostUserCmd.
Is it possible to request this additional restore with BackupPC_restore
during the RestorePreUserCmd or RestorePostUserCmd ?
Yes, you could do it by emulating what the CGI
Simone writes:
I got a strange problem doing incrementals with tar over ssh using
--newer=$incrDate+. It seems to be an escaping problem in part of the time
reference for the incremental.
Yes, the escaping isn't happening. The $incrDate+ form means
to escape the value, so that is what you should use
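For reference, the 3.x default already uses the escaped form:
$Conf{TarIncrArgs} = '--newer=$incrDate+ $fileList+';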
Omar writes:
$Conf{TarClientCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -c -v -f -
-C $shareName+'
. ' --totals';
$Conf{TarClientRestoreCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -x -p
--numeric-owner --same-owner'
. ' -v -f - -C $shareName+';
Gilles writes:
tree connect failed: NT_STATUS_ACCESS_DENIED
David Kahn reports that this happens with recent versions
of smbclient. Removing the -N option fixes it.
Can you confirm this fix works for you?
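From memory, the -N lives in the smbclient command strings, so the change
is roughly this (the stock command minus -N; check your config.pl for the
exact default):
$Conf{SmbClientFullCmd} = '$smbClientPath \\\\$host\\$shareName'
            . ' $I_option -U $userName -E -d 1'
            . ' -c tarmode\\ full -Tc$X_option - $fileList';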
Craig
Sean writes:
I have tried to do a full backup of a Windows XP PC. The backup is
successful, although I get the error "No files dumped for share". What
is wrong?
The backup isn't successful (since no files were dumped for one (or more)
shares).
Please look at the XferLOG.bad file (which should
Kiran writes:
I am trying to install BackupPC on Ubuntu Server Edition. I am running the
configure command as
sudo perl configure.pl
it fails with the error message
Making init.d scripts
can't chown 1000, 1000 init.d/gentoo-backuppc.conf at configure.pl line 1011.
Not sure where the
Christian writes:
2009-01-02 19:56:54 User admin requested backup of ip (ip)
2009-01-02 19:56:55 Started full backup on ip (pid=26716, share=/)
2009-01-02 19:56:56 Backup failed on ip (fileListReceive failed)
The most common cause is extraneous output from the client-side ssh
or shell before
Jeff Kosowsky writes:
I add an (empty) file named '-i' in several of my key Linux
directories to prevent inadvertent rm * calamities.
However, BackupPC doesn't seem to like this, giving me error messages
of the form:
Can't open /var/lib/BackupPC//pc/mycomputer/new/f%2fetc/f-i for empty
Jeff Kosowsky writes:
I had been thinking of writing code to implement a robust fuse
filesystem for BackupPC backups but then I saw that John Craig (and
perhaps others) had started to write code.
While the code still seems to be at the proof-of-concept stage, I think the idea
is very powerful and
Pedro writes:
After searching for a while and doing some digging I found that I had
files that would cause ssh to exit. Usually you can exit ssh with ~.,
and in fact I had files with that name and content.
What I did on the backuppc config page was:
on the main configuration editor, Xfer:
Jeff Kosowsky writes:
Actually, after doing some subsequent incrementals, I believe it is a
'bug' and not a feature.
Yes, definitely a bug. Here's a patch, which will shortly
be in CVS.
Craig
--- bin/BackupPC_dump.orig 2008-12-29 02:09:04.643105800 -0800
+++ bin/BackupPC_dump 2008-12-29
Chris writes:
Don't do a direct restore. Download a Zip or Tar archive.
Or cancel the incremental backup.
Craig
James writes:
I have the following config line:
$Conf{BackupFilesExclude} = ['/proc', '/mnt', '/sys', '/home/users',
... , '+ /vz/dump', '/vz/*'];
But there is no /vz/dump in the backups. What am I doing wrong?
You can't use the rsync syntax (+/-) in $Conf{BackupFilesExclude}.
You can
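One workaround (a sketch, not necessarily the rest of this reply) is to
back up the exception as its own share:
$Conf{RsyncShareName}     = [ '/', '/vz/dump' ];
$Conf{BackupFilesExclude} = {
    '/'        => [ '/proc', '/mnt', '/sys', '/home/users', '/vz' ],
    '/vz/dump' => [],
};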
Mark writes:
Just noticed the /var/log/backuppc/LOG file for BackupPC is pumping
these out mercilessly:
2008-12-10 06:49:02 BackupPC_link got error -4 when calling
MakeFileLink(/mnt/backup/pc/shuttle
Cesar,
Are you sure this is 3.0 and not 3.1?
In 3.1 an optimization was added to use IO::Dirent for reading
the inodes in a directory, which on certain filesystems doesn't
work correctly. If you are running 3.1.0 I would recommend
trying to disable IO::Dirent by changing this line:
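From memory of the 3.1.0 source, the spot in lib/BackupPC/Lib.pm looks
roughly like this; forcing the flag to 0 disables IO::Dirent:
BEGIN {
    eval "use IO::Dirent qw( readdirent STAT_TYPE_LNK )";
    $IODirentOk = 1 if ( !$@ );    # change to: $IODirentOk = 0;
};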
Jeffrey writes:
Just as an FYI, it is possible to use perl code within config files so
that you can use a single config file yet still customize
configurations by pc (or groups of pc's) without having to duplicate
changes across multiple relatively similar config files each time you
change a
Jeffrey writes:
$Conf{WakeupSchedule} = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13,
14, 15, 16, 17, 18, 19, 20, 21, 22, 23];
Is there any reason midnight is left off?
Mostly laziness: I never got around to completely testing the 0 case.
It probably does work, but there are various cases that
Jeffrey,
Sorry about the delay in replying. I've been really busy lately.
1. If a directory is *empty*, is there any reason for it to have an
attrib file?
Because in playing around with creating and deleting directory
contents, I
found that sometimes even after
Jeffrey writes:
For the part of my routines BackupPC_fixLinks and BackupPC_deleteFile
that actually try to make new links (or delete old ones), I would like
to make sure that nothing else is creating or deleting links to the
pool to avoid collisions.
Specifically, I would like to be able
Tino writes:
The excludes are specified per share. So it should read:
$Conf{BackupFilesExclude} = {
'/' => [ '/sys/', '/vz/root/', ... ]
};
('/' being your share name here if you use rsync via ssh.)
Yes, you're right.
Also, James, in 3.x the excludes are not passed in the
command-line
Sam writes:
A copy of BackupPC 3.1 has been obtained from the main Ubuntu
repository and successfully installed on v 8.04-Server. This has
now provided access to the file BackupPC_archiveStart which I
intend to use via cron as outlined in the BackupPC documentation.
When executed from the
James writes:
The problem we are seeing is that backups are randomly failing.
The log file on BackupPC shows something like this:
This is most likely a TCP timeout or other network problem.
Rsync added a TCP keep-alive option in protocol version 29
(if I recall correctly) and is not
SamK writes:
I am struggling to create an archive from the command line. The desired
outcome is to emulate the clicking of the Start Archive button in the web
page as this method is working perfectly. The overall objective is to create
the archive as a cron job.
From
Samk writes:
I was hoping to find a solution which prevents the user from starting
a backup but allows for a restore whenever required.
There isn't a configuration option that allows this.
I'd recommend just editing lib/BackupPC/Lang/en.pm (assuming you
are using English), and removing these
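From memory, the entries in question look like this (exact key names may
differ by version):
$Lang{Start_Incr_Backup} = "Start Incr Backup";
$Lang{Start_Full_Backup} = "Start Full Backup";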
Jeffrey writes:
Looking at the code and the structure of the storage and attrib files,
it doesn't seem like there is any way for BackupPC to record and
restore hard links.
Not true. Hardlinks are stored without using hardlinks.
Hardlinks are stored just like symlinks. The attribute type is
Rob writes:
I'm having a problem trying to restore files to my clients from my
backuppc server. I am using rsync over ssh. The backups work great, but
when I try to restore a file (I've just been trying to restore /etc/hosts
to test the setup), it just hangs. I flushed my iptables, so it's
Jeffrey writes:
I have been considering the following:
- Uncompressing the full file to determine its length..
But this is very computationally inefficient for large files...
Right.
- Unpacking attrib file but this seems
This seems best, but I'm not sure what are the
Mark writes:
Using rsync between two linux servers, the full took 2.5 hours, the
incremental backups are taking longer each day.
2008-10-21 23:00:01 full backup started for directory /
2008-10-22 02:32:55 full backup 0 complete, 294903 files, 203401100538
bytes, 28 xferErrs (0 bad files, 0
Jeff writes:
Is there a (reasonably easy) way of identifying which ones have the
rsync checksum seed and which ones don't???
I'm reluctant to even say, because you are heading in an unproductive
direction. But here goes: a compressed file without checksums starts
with 0x78 and a compressed
Jeffrey writes:
So what does -4 mean and what can cause it?
It fails to make a hardlink. There are several possible reasons: you are out
of inodes, your cpool and pc directories are on different file
systems, your BackupPC file system doesn't support hardlinks, or
you have a permissions problem of some
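Two quick checks for the first two causes (paths assumed):
df -i /var/lib/backuppc                           # any inodes left?
df /var/lib/backuppc/cpool /var/lib/backuppc/pc   # same filesystem?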
Jeffrey writes:
Except that in my case some of the duplicated checksums truly are the
same file (probably due to the link issue I am having)...
Yes. Just as Holger mentions, if the hardlink attempt fails,
a new file is created in the pool. You appear to have some
unreliability in your NFS or
Jeffrey writes:
Types of Duplicate checksums:
1. Same checksum but contents differ -- INTENTIONAL - nothing to fix
Right.
2. Same checksum and compressed content
I have found many of these but contrary to my earlier postings
the ones that I examined were not
Jeffrey writes:
I have:
$Conf{BackupFilesExclude} = [
"'/Documents and Settings/*/LocalSettings/Temp/*'",
[snip]
You have two sets of quotes here, so this is excluding this path:
'/Documents and Settings/*/LocalSettings/Temp/*'
instead of:
/Documents and Settings/*/LocalSettings/Temp/*
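The fix is a single set of quotes:
$Conf{BackupFilesExclude} = [
    '/Documents and Settings/*/LocalSettings/Temp/*',
];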
Jeffrey writes:
Can't call method isCached on an undefined value at
/usr/share/BackupPC/lib/BackupPC/Xfer/RsyncFileIO.pm line 165.
That isn't good. This is the case where it is doing a random check
of the cached checksums (based on $Conf{RsyncCsumCacheVerifyProb}).
I am missing an error
Jeffrey writes:
So, ideally, I am looking for a hook to run in advance of any web or
backup operation that needs access to /var/lib/BackupPC.
Does such a hook exist?
If not where would be the best place in the code to hook into?
No, there isn't such a hook - BackupPC assumes the pool
Holger writes:
So if someone with the power to unsubscribe
him reads this, please do. Thank you.
Done.
Craig
Yuriy writes:
What does your browser say what the page encoding is?
ISO-8859-1.
This suggests you are running BackupPC 2.x. Support for
$Conf{ClientCharset} was added in 3.x. In 3.x all the
server-side and CGI encodings are utf8.
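After upgrading, point BackupPC at the client's charset, e.g.:
$Conf{ClientCharset} = 'iso-8859-1';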
Craig
Yazz writes:
I may be able to just do it with $DumpPostUserCmd but I haven't tested
that yet. I just think it would be nicer to have it as a built in option.
/bin/ln -sf $topDir/pc/$host/$(/bin/cat $topDir/pc/$host/backups \
| /bin/grep full | /bin/sort -n | /usr/bin/tail -1 \
| /bin/awk
Fernando writes:
I'm replying to this old thread because I believe that my question/commentary
fits here better than in any other thread regarding this subject.
I'm using BackupPC to back up, among others, a Samba file server that uses
an ISO8859-1 charset. To get the characters displayed
Ski writes:
Running Backuppc 2.1.2 on a debian linux server and backing up a linux
machine, I ran into a problem backing up a file where the entire path
length was 318 characters. Once I shrunk the path length it worked
perfectly. Has this been fixed in version 3.1? If not, could you add
Stephen writes:
When I run BackupPC_archiveStart from the command line it's not
using compression for the backup; the backup file is created raw.
However, if I start the archive job via the web interface it
uses compression (gzip by default).
The command I am running is:
Stephen writes:
Hmm, okay. That sort of worked; I did the first change, and now I'm getting
an error for gzip:
Set it to the full path, eg: /bin/gzip.
Craig
Stephen writes:
Still seems to be creating a .raw archive:
Yes. The archive is now compressed, in spite of the extension:
/bin/csh -cf /usr/share/backuppc/bin/BackupPC_tarCreate -t -h localhost -n
24 -s \* . | /bin/gzip > /data/NAS-Mount-MaxBackups/offsite/localhost.24.tar.raw
The
Mark writes:
I am using backuppc to ssh to a remote host, and use rsync for the
backups. Before the backup, I have backuppc run a script on the remote
host to manipulate some database files. It appears that the script is
executed, but I keep getting a message in the backuppc logs that the
Marcel writes:
- localhost (DumpPostShareCmd returned error status 512)
Look in the XferLOG.bad file for localhost and see what command
was executed.
Craig
Aleksey writes:
Hi. What do these messages in my LOG files, repeating over and over
again, mean? Should I be concerned?
2008-09-25 23:13:07 Botch on admin job for admin : already in use!!
2008-09-25 23:13:07 Botch on admin job for admin1 : already in use!!
2008-09-25 23:18:07 Botch on
Philippe writes:
I went to the SourceForge site and noticed that the last update of the
CVS was 9 months old. Does this mean that development stopped or that the
next version is not on this server?
I don't use the SF CVS site very actively - I tend to do batch updates
closer to the release
Bruno writes:
I double-checked my host.pl file multiple times but still could not find
anything wrong with it. So after my Google searches, I tried putting single
quotes around the values instead of double quotes. Like this:
$Conf{XferMethod} = 'rsyncd';
$Conf{RsyncdUserName} = 'user';
Louis-Marie writes:
I also have a remark about backuppc's pooling feature: I think the server
locally detects file duplicates by hashing them after download. As far
as I know, rsync should also be able to send some kind of hash from the
remote host before download. Wouldn't it be possible to detect
Hendrik writes:
Furthermore: Would it be possible to limit the BackupPC_tarPCCopy command to
one host only?
Yes, this should work, eg:
BackupPC_tarPCCopy /var/lib/backuppc/pc/HOST | (cd /new/backuppc/pc; tar
xPf -)
Note: the extract still starts at the pc directory, not pc/HOST.
Using
James writes:
Manual full and incremental backups work. No backups are running
automatically. I'm using the default schedule options, I changed the
default blackout period to a negative value for testing purposes.
I'm using rsyncd with Windows XP. See attached config.pl
I don't see a
John writes:
We were having a weird problem with nmblookup firing for a host that
was disabled with:
$Conf{BackupsDisable} = '2';
using BackupPC 3.1.0. The same thing happened when set to 1. My claim
is that BackupPC shouldn't be attempting to resolve any host with
BackupsDisabled set
Andrew writes:
A few days ago I noticed that none of my hosts are backing up. All but
two give the error, no ping (ping too slow: 38.94msec (threshold is
35msec)) -- or some similar ping error.
One such host is named shipping in backuppc. The thing is, I can ping
from the BackupPC server with no
Terri writes:
Where is that stored? The reason I ask is that I was reading over the archives
on how to back up different hosts with different directories and found
a post on putting a config.pl in the /pc/host directory. I then
decided to add a second path on that same host and used the edit config
for
Holger writes:
I also read this as all of / except /proc, /sys, /mnt, /opt, plus
/opt/zimbra/backup. I would implement this like BackupPC does, though it's
perhaps not intuitive :-). If you are using rsync(d), that is. It won't work
with tar or smb.
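Roughly, the rsync filter rules for that would be (hand-written rules,
first match wins; not BackupPC config syntax):
+ /opt/
+ /opt/zimbra/
+ /opt/zimbra/backup/***
- /opt/zimbra/*
- /opt/*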
Excellent explanation.
Craig
James writes:
I need to back up a Zimbra mail server, but I can't get the whole thing in
any reasonable amount of time. I want to exclude the usual stuff:
$Conf{BackupFilesExclude} = ['/proc', '/sys', '/mnt'];
But I also want to exclude all of /opt, except /opt/zimbra/backup.
Is this
James writes:
I'm trying to back up some Arm processor console servers which only
have busybox tar available. Busybox tar does not support --totals
and I THINK this is why the backups are failing. I tried writing a
wrapper script that spits out a bogus --totals line, but so far, no
luck.