komodo writes:
bash-3.1$ /usr/local/BackupPC/bin/BackupPC_serverMesg status host
mail.test.com
Got reply: ok
But in the log
2007-09-04 16:07:09 Unknown status request host
2007-09-04 16:07:09 Unknown status request mail.test.com
So where, please, is the problem? I need this feature.
The
Dmitri writes:
Where should I look next?
Look at the XferLOG file for the exact command that is being
executed, and the resulting error.
Craig
-
This SF.net email is sponsored by: Splunk Inc.
Still grepping through log
BackupPC 3.1.0beta0 has been released on SF.net.
3.1.0beta0 is the first beta release of 3.1.0.
3.1.0beta0 has several new features and bug fixes since 3.0.0.
New features include:
* Added new script BackupPC_archiveStart that allows command-line
starting of archives.
* Added Simplified
Nilesh writes:
I think it can be run under a Cygwin install on Windows.
I have not tried it, but lots of other software works that way.
Yes, BackupPC does run under cygwin on WinXP. However, I doubt the
performance is that great. It is certainly convenient for me as a
development environment when
Alessandro writes:
I'm having trouble with BackupPC 2.1.1.
After months of working great, the system renamed my backups file
in the pc directory to backups.old.
Because of this, I'm unable to restore or access the old backups.
How can I avoid this? Is it a bug or a feature?
BackupPC renames
Martin writes:
This looks a lot like an access rights problem, but it shouldn't
be. I have the GUI running on mod_perl and Apache running as user
backuppc. My data directory is also owned by user backuppc, and
BackupPC is configured to run as the same user.
Your instincts are right: this
Joe writes:
The way the precopy script works is that it dumps the whole database to a
directory on the /home partition. This directory is then backed up.
So if the precopy script launches for /home and finishes,
tar starts for /home, then
precopy for /srv/www starts in parallel with the tar of /home.
That is
Nils writes:
You can set $Conf{FullPeriod} to -1 or -2 for this host. See
http://backuppc.sourceforge.net/faq/BackupPC.html#other_installation_topics.
That's right.
One comment: in 3.x there is a new setting $Conf{BackupsDisable}
for disabling backups - set it to 1 or 2. That saves
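A sketch of how that 3.x setting could be applied per host (the config path and host name here are examples, not from the thread; the 1/2 semantics are as documented):

```shell
# Hypothetical per-host override under BackupPC 3.x: append the setting
# to the host's config file. 1 = no automatic backups (manual still
# allowed); 2 = no backups at all.
cat >> /etc/BackupPC/pc/mail.test.com.pl <<'EOF'
$Conf{BackupsDisable} = 1;
EOF
```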
Nok writes:
I am trying to make an archive backup to a tape or directory, but when I
click Start, I get this error:
Error: Can't open/create ${EscHTML($TopDir/pc/$hostDest/$reqFileName)}
Looks like the error message isn't getting interpolated.
Either you have a permissions problem, or
Lee writes:
I get this error when trying to restore a file.
Can't get rsync digests from
/mnt/data/backuppc/pc/swbmail/174/fExchange/fTuesday-Exchange.bkf
I'm concerned that means the file is corrupted in some manner - it
appears to be shorter than it should be (the rsync digests are stored
Joe writes:
No, there is no other process writing these files.
I launch the precopy script at prompt and it doesn't return control
until the whole process is finished, so no backgrounding there.
The only idea I can think of is:
- precopy is launched for /home and finishes
- tar backup
Mark writes:
Dear Craig,
I'm at a dead end. I have googled for more than 5 days: smbclient does not show
all the files, but only approx. 31-36 files per directory.
I'm in a PDC environment (Samba is the primary domain controller). About 10 W2K
PCs with roaming profiles should be backed up each day.
Youlin writes:
Another problem I saw is related to the rsyncd module names or Samba
share names in non-European languages, at least Chinese. The modules
or shares are backed up ok. But when the restore is to be performed
thru the CGI, the restore option page would contain bad module/share
Samuel writes:
I have seen this before on the list and maybe I am misunderstanding
something but this is my question:
I am trying to backup Windows Spanish version with rsyncd transfer
method, after configuring codepage as cp1252 the file names at the web
interface reads ok. (BTW if you
Youlin writes:
The real problem is when trying to restore a file, with the tar and zip
methods the filename comes also garbled.
I have a similar problem with the Chinese support. Files backed up
with rsyncd would yield garbled file names when they are restored
through Samba, or vice
Joe writes:
I have BackupPC 3.0 configured and working but I sometimes get errors
which I think are due to the DumpPreUserCmd overlapping with tar.
I have:
$Conf{TarShareName} = [
'/home',
'/srv/www',
'/usr/local'
];
$Conf{DumpPreUserCmd} = '/home/cpseg/scripts/precopy';
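A defensive sketch, not from the thread: if the worry is two dump steps overlapping, the precopy work can be serialized with flock(1). The lock path and the echoed "precopy" line stand in for the real database dump.

```shell
# Serialize the dump step so two invocations can never overlap,
# whatever order BackupPC starts them in (lock path is an example).
run_precopy() {
    (
        flock 9                        # blocks until any earlier run releases the lock
        echo "precopy for $1 done"     # the real script would dump the database here
    ) 9> /tmp/precopy.lock
}
run_precopy /home
run_precopy /srv/www
```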
Yaakov writes:
Fatal error (bad version): Exec failed for sudo /usr/bin/rsync
You need to use the full local path for sudo.
Craig
Rob writes:
So does the admin summary only happen on errors, too?
Yes.
Is there a way to make BackupPC send an email on completion??
You can use $Conf{DumpPostUserCmd}; see the archives.
Craig
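A minimal sketch of such a hook. The $host and $xferOK substitutions are the documented UserCmd variables; the script path in the config comment below is an example, not an established convention.

```shell
# Hypothetical completion hook; wire it up with something like:
#   $Conf{DumpPostUserCmd} = '/usr/local/bin/backup-done $host $xferOK';
backup_done() {
    host=$1; ok=$2
    echo "Backup of $host finished (xferOK=$ok)"
    # a real hook would pipe this line to mail(1) or sendmail
}
backup_done mail.test.com 1
```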
Rob writes:
I've installed BackupPC and it's working pretty well so far with rsync
(using the rsyncd program).
However, it hasn't as yet sent me an email letting me know whether the job is
done / not done / etc.
Any ideas? (everything but the domain name is real). Just ran an
Arch writes:
If I backup one drive of a computer, all is A-OK. If I try to backup
a computer with multiple drives, I get an error saying tree connect
failed: NT_STATUS_BAD_NETWORK_NAME. I tried separating them like
this 'C$, D$, W$'.
You need to use:
$Conf{SmbShareName} = [ 'C$', 'D$',
Bruno writes:
I have installed BackupPC 2.1.2pl1 on Debian Etch.
Currently I back up 7 servers; on 5 of them everything works fine, but
on two servers I get the following XferLOG:
tarExtract: .
tarExtract: : checksum error at
tarExtract: Can't open
Peter writes:
o Pool is 2.88GB comprising 17366 files and 4359 directories (as of
7/4 01:01),
[snip]
There are 3 hosts that have been backed up, for a total of:
* 6 full backups of total size 11.84GB (prior to pooling and compression),
* 18 incr backups of total size 1.72GB (prior to
James writes:
What version of tar is this?
Tomm:/Users/user root# tar --version
tar (GNU tar) 1.14 +CVE-2006-0300 +CVE-2006-6097
If it is not standard gnu tar, is
there source available to look at? How big is the winxp.hdd
file?
-rw-r--r-- 1 user user 8731151360 Jul 2
Matthias writes:
On Sunday, 1 July 2007 at 18:51, mna.news wrote:
On Sunday, 1 July 2007 at 15:58, Matthias Meyer wrote:
Hello,
Is it possible to delete a backup? In the documentation I only find
how to delete all backups from one host.
But I want to delete the last backup
ilias writes:
I've managed to set up BackupPC and make backups from Windows PCs. I have one
problem though. Some of my files have Greek filenames which cannot be read
properly when browsing backups from the web GUI of BackupPC; however, when I
restore the files, everything is restored properly.
Thomas writes:
I am using BackupPC 2.1.2 on a Debian (Etch) server. BackupPC is
configured to use tar via ssh to back up a /home dir (around 80 GB).
During or after (I don't know which) executing a job, I got the following message
(BackupPC was started manually with 'BackupPC_dump -v -f 192.168.0.5'):
James writes:
Any ideas on this? This Mac was backing up fine, but recently I'm
getting lines like this in the log:
/usr/bin/tar: ./Users/user/Documents/Parallels/Microsoft Windows XP/
winxp.hdd: file changed as we read it
I assume that is a very large file. It appears tar doesn't
Norbert writes:
I created a new share on an existing web server that I am backing up. The
host configuration file was called '1and1-MW-common.pl', consistent with
the case of the directory that I was backing up. I had added
'1and1-MW-common 0 user' to the backuppc 'hosts' file. The
Norbert writes:
I am backing up a directory structure on a Linux server running rsync
version 2.5.6cvs protocol version 26. Most of the files and
subdirectories are symbolic links to a common 'source' directory
structure. It appears that backuppc is backing up the symbolic links to
Keith writes:
I'm using BackupPC version 2.1.2. It backs up a number of Linux systems
via rsync without any problems. There is one Windows system which is an
Exchange server, and a dump of the Exchange database is put into a
specific share each night. BackupPC backs this up and doesn't report
Alessandro writes:
I asked BackupPC to restore these 2 files via smbclient, but I get an
error message like this:
Running: /usr/bin/smbclient 192.168.0.100\\archivio -U ferro -E -N -d
1 -c tarmode\ full -Tx -
Running: /var/BackupPC/bin/BackupPC_tarCreate -h 192.168.0.100 -n 86 -s
Craig writes:
I submitted a bug report to samba in 2003, which is still open.
Here is the bug report:
https://bugzilla.samba.org/show_bug.cgi?id=563
Craig
Stian writes:
[EMAIL PROTECTED]:~$ /usr/share/backuppc/bin/BackupPC_tarCreate -h pontiac
-n -1 -s \* . | /bin/gzip > /dev/nst0
which is what backuppc tries to do when archiving to tape. And this does
not work.
As Dan and Ali mention, you can use buffer or dd to reblock
the stream.
You can
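The reblocking suggestion can be sketched as below. The pipeline in the comment is conceptual (the tape device is from the original post; the block size is an example); the runnable line shows what fixed-size reblocking means on a tiny input.

```shell
# Conceptual fix (do not run as-is):
#   BackupPC_tarCreate -h pontiac -n -1 -s '*' . | gzip | dd of=/dev/nst0 bs=32k
# What reblocking does: dd with conv=sync pads a partial block out to
# a full bs-sized block, which tape drives expect. A 2-byte input
# becomes one full 512-byte block:
printf hi | dd bs=512 conv=sync 2>/dev/null | wc -c
```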
Regis writes:
Some big files can't be backed up by BackupPC.
I have a big file (15707164 bytes). When BackupPC tries to transfer it to
the server, BackupPC breaks with this message:
Error reading file \Mes documents\frontispiece.tif : Call timed out: server
did not respond after 20
Gian writes:
I am using a Linux machine to backup 76 windows computers and 3 Linux
servers. All backup is via rsync - Cygwin on the Windows machines.
Incremental backups are working fine, but full backups of the windows
machines contain files and directories 1 level deep and nothing else.
Nils writes:
No, indeed. You need a filesystem that supports hardlinks. Maybe this
should be explicitly stated in the requirements, Craig?
Yes, it should. I should probably make it a run-time test too - actually
create a test hardlink and quit if it fails. I should also do that in
Tony writes:
That's correct. The code winds up looking either in /etc, or in
$TOPDIR/conf, neither of which is correct. On FreeBSD, it would go in
/usr/local/etc/BackupPC (or /usr/local/etc/backuppc, whichever).
Ok, it's fixed. The makeDist script fails to change the hardcoded
value back
Regis writes:
I tested the size of the new directory:
$ while [ 1 ]; do du -sk new; sleep 30; done
And the size is always the same.
Any ideas?
Could someone help me to solve this problem?
I'd recommend manually running the smbclient command and pipe
the output into tar tvf - so you
David writes:
Is there an easy way to extract information on what files BackupPC has found
to be duplicates? I'd like to eliminate duplicate files on my system, and since
BackupPC has already done the work of identifying them, I'd like to get
some sort of report listing the paths (relative
Garith Dugmore writes:
When using the following configuration command
Conf{DumpPreUserCmd} = 'rsync -az --delete ethleen.saao::backupreadonly
ctfileserver.saao::read_only';
backuppc reports in the log:
2007-06-06 16:58:08 DumpPreUserCmd returned error status 256... exiting
BackupPC
Maikel writes:
I'm trying to configure BackupPC in a client-server setup, so that
the web interface runs on a different host than the actual BackupPC
program.
Now I found the ServerPort setting in the config.pl file. But what do I
have to install or do on the client?
So basically, what I
Andrey writes:
I've got a slightly strange problem when using BackupPC with
xferMethod=rsyncd..
When incremental backup starts, backuppc creates 'new' directory in
per-host spool. It's cool. But this directory contains all folders and
files from the destination host. I'm no newbie with rsync,
Tony writes:
Anyhoo...I tend to think we have at very least a bug in configure.pl,
but moreover we need to be a bit more flexible on being able to define
confDir. As was mentioned before, having the option to pass it on the
command line would clear this right up.
There are two different
Keith writes:
My understanding is that when BackupPC is doing an incremental backup via
rsync, only the files that have changed attributes since the last full
backup are examined.
I have a backup running on a client that has just over 100,000 files
using 24Gb, and monitoring with 'watch
Sean writes:
Greetings friends - I have a batch of Linux servers that I'm responsible
for. I've managed to get all but one of them set up on backuppc. This
particular server has had several failed attempts at full backups. I
ran the rsync command manually with strace to try and figure out
Regis writes:
The backup stops with the message:
Error reading file \Local Settings\Temp\Cover picture.tiff : Call timed
out: server did not respond after 2 milliseconds
If you look at the mail lists you will see that this error is
typically due to anti-virus software running on the client
Murtuza writes:
I was just wondering how I would give users only the right to restore,
as users should not be able to access the config editor in the interface.
Can anyone please help me restrict users from editing their config files
in the interface?
$Conf{CgiUserConfigEditEnable} = 0;
You can
Tony writes:
I've just about put the finishing touches on packaging up BackupPC for
FreeBSD. Works pretty well, but I ran into a snag early on. I've
worked it out, but somehow I don't think the way I went about it is the
right way to do it.
I got the good old problem of:
Francis writes:
On my side, I found that Firefox (PC) handles the direct download correctly
(for the file name).
My results so far are:
On client (Windows): client charset is cp1252, backup method rsyncd
On server (Ubuntu LTS): locale is UTF-8
CGI: OK
ls directly on server: OK
Rodrigo writes:
The only problem seems to be with this index file and status.pl. I
just removed status.pl, and BackupPC re-made it. I don't know if this
was the best thing to do in this situation, but everything else seems
to be fine.
That's perfectly fine. status.pl will be rebuilt and,
Stefan writes:
I've been using BackupPC for ages and have been anticipating this
feature very much. However, my real-world results from BackupPC 3.0.0
don't really show this behaviour. My results look as follows:
15 full yes 0 21/4 02:00 43.5
19 incr no 1
Holger writes:
hmm, that seems to mean that rsync full backups are done relative to the
previous backup (be that full or incremental). Actually, that is quite
brilliant. File contents are checked anyway, so why not start with the last
state, even if that is not fully trusted? It's even known
[EMAIL PROTECTED] writes:
In the attrib file would seem correct...
I deleted 'TEST', did an incremental, and ran BackupPC_attribPrint on
the attrib file located inside the corresponding directory:
'TEST' => {
'uid' => 0,
'mtime' => 0,
'mode' => 0,
'size' => 0,
Joe writes:
I have traced this error down to what seems to be a tar problem.
I have a tar file with one file in it: test.tar
If I execute:
cat test.tar | tar tf -
I get the above error.
tar tf - < test.tar
and
tar tf test.tar
work correctly
The BackupPC_tarCreate also seems to be
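The three invocations above can be reproduced with a scratch archive (the file names below are examples, not the poster's test.tar); with a well-formed archive all three should print the same listing.

```shell
# Build a one-file tarball in a temp directory, then list it three ways.
cd "$(mktemp -d)"
echo hello > test.txt
tar cf test.tar test.txt
cat test.tar | tar tf -     # archive on stdin via a pipe
tar tf - < test.tar         # archive on stdin via redirection
tar tf test.tar             # archive opened by name
```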
Ski writes:
Craig, thank you again for the great software. It has been a life
saver for the Northshore School District. Most recent stats are 1300
clients with about 4TB data across all clients, 8 BackupPC servers
with 3.4TB of data (love the hardlinks and compression) keeping 2 fulls
and
nilesh writes:
full backup started for directory home
Connected to 192.168.2.149:873, remote version 29
Negotiated protocol version 28
Got response: f37028cf3d2f8dc751a16506ab234ddf
Auth: got challenge: f//wNxSOI0+aIZA9uUIjZA, reply: backuppc
83Aozz0vjcdRoWUGqyNN3w
Connected to module home
Samrat writes:
Fatal error (bad version): Xlib: connection to :0.0 refused by server
I already replied to the email you sent me directly (which I prefer
not to do since that's what the user list is for).
I said:
Your ssh is configured to do X port forwarding and gives this
error:
Alex writes:
I've just upgraded to version 3, and all of a sudden my archive went
from 7 hours to under 1 hour.
Do you really mean archive, or do you mean backup?
Did you upgrade using the vanilla tarball and configure.pl?
What XferMethod are you using on this host?
I noticed that the
Les writes:
I have a machine that's been running BackupPC for ages (still on 2.1.2).
All data is in /home/BackupPC, a 200GB drive sitting on Fedora Core 3.
BackupPC runs as apache. /usr/local/BackupPC is the program directory.
I want to install a second instance of backuppc directly onto a
Ben writes:
I want to know if I can add a logo with a link under Links in the left
column of the web interface. Which file(s) do I have to modify?
You can add extra links to the left navigation bar, but not images,
using $Conf{CgiNavBarLinks}.
The BackupPC logo and SF.net links are hardcoded.
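A sketch of adding such a link in config.pl (the URL and label are examples; the link/name keys mirror the default $Conf{CgiNavBarLinks} entries shipped in config.pl):

```shell
# Append a hypothetical extra nav-bar entry; config.pl is Perl, so a
# push after the default assignment works.
cat >> /etc/BackupPC/config.pl <<'EOF'
push @{$Conf{CgiNavBarLinks}}, {
    link => "http://intranet.example.com/wiki",
    name => "Company wiki",
};
EOF
```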
Jamie writes:
I'm testing backing up my home directory on my desktop machine using
BackupPC 3 and tar. The client is a Mac OS X 10.4.9 machine.
No matter what I do, I keep getting this message.
Running: .
full backup started for directory /Users/jamie
Xfer PIDs are now 21700,21699
Exec
Alex writes:
Ok, it's fixed now. I might have lost the IncrLevels setting when I
was trying to fix an admin user problem.
Ok. For 3.1.0 I'll make sure it defaults to something sensible
in case it is missing.
Craig
Brendan writes:
So are you saying that 3.0.0 has no known critical bugs, and that 3.1.0
would have minor bug fixes/improvements and some new features. If so,
then I should be confident in upgrading from 2.1.1 to 3.0.0 and not have
any problems, right?
Has anyone had problems upgrading
Alex writes:
I'm running 2.1.2pl2 on Centos 4. Can I just stop the service, run
configure.pl, and start the service again?
Yes.
Craig
Holger writes:
I'm not sure whether rsyncd authentication is more than a plaintext
password exchange though.
It uses a random challenge/response with md4 digests.
Like any system like this, weak passwords are vulnerable to
dictionary attacks. But if you use a decent (high entropy)
password
Jamie writes:
I have a fresh install of BackupPC 3 up and running on a CentOS box.
We're mainly backing up Apple Xserves.
It's backing up 2 of them successfully. But one of them will do the
full backup no problem, yet when it runs incrementals it keeps
giving me this error.
Backup
Vasan writes:
For the next backup, it always seems to check for any modifications
during the last 24 hours. If, for example, the next day's nightly job
takes only a minute to complete (8:01 PM), then first it will check whether
any backup has been done in the last 24 hours at 8:01 PM itself. In
this
Simon writes:
In my XferLOG I get the following error message for this file:
ibdata1: md4 doesn't match: will retry in phase 1; file removed
Its file size is about 50GB, and it could be that the file is modified while the
backup is running. My version of BackupPC is an Ubuntu package which
Pradeep writes:
I wonder how I can instruct BackupPC to back up only a certain directory.
Attached is the screenshot of my configuration.
Here I have defined the SMB shares as 'C$' and '/NessusDB', but BackupPC,
instead of backing up /NessusDB, backs up the whole of 'C$'.
In the CGI editor
David writes:
I'm using BackupPC to back up several computers and I have trouble with
filenames containing accents: when browsing backups on the BackupPC
server, accents aren't displayed correctly. That's quite annoying.
I tried to set the $Conf{ClientCharset} variable to ISO-8859-1 or
Pradeep writes:
I am trying to make use of the Smbclient perl module for backing up my WinXX
machines. The steps I have taken are as detailed below
1. Installed the perl module
2. Loaded the Perl module SmbClient.pm using the parameter
$Conf{PerlModuleLoad}
After this step I do not
Jorg writes:
(27GiB, 7,0MiB/s))
Backup aborted (Gesamtzahl geschriebener Bytes: 28282449920 (27GiB, 7,0MiB/s))
(the German is tar's "total number of bytes written" message)
Is this a localization issue? Doesn't BackupPC understand what tar is saying?
Yes and yes.
How can I fix this?
This is the standard config setting:
$Conf{TarClientCmd} =
Eric writes:
Does the directory /data/BackupPC/backups/cpool/5/e/e exist? What happens
when you try to manually make the link:
su backuppc
link /data/BackupPC/backups/pc/slacker/2/fEric/fhello.txt
/data/BackupPC/backups/cpool/5/e/e/5ee4aa1b190383553c1a7712ad260358
Various parts of BackupPC spend a lot of time traversing large
trees of files, including BackupPC_dump, BackupPC_trashClean
and BackupPC_nightly.
As many people have observed, over time BackupPC's pooling results
in directories with files that are widely dispersed across the disk.
This makes disk
Simon writes:
So far I've determined that $Conf{TrashCleanSleepSec} = '300' isn't
going to be doing me any favours. I've bumped it up to once an hour (and
might do so to once a day; I'm only backing up three machines and the
data doesn't change at a frantic rate).
Is this safe?
Yes.
Eric writes:
2007-04-08 21:20:02 BackupPC_link got error -4 when calling
MakeFileLink(/data/BackupPC/backups/pc/slacker/2/fEric/fhello.txt,
5ee4aa1b190383553c1a7712ad260358, 1)
The -4 error means that a file cannot be added to the pool
(ie: a new hardlink
Alex writes:
I'm getting an error backing up users' data. Below is the error log:
Error reading file \Local Settings\Application
Data\Microsoft\Outlook\Outlook.pst : NT_STATUS_FILE_LOCK_CONFLICT
Didn't get entire file. size=2186036224, nread=0
NT_STATUS_SHARING_VIOLATION opening remote file
Bernhard writes:
WORK/BMW/zk_zkg_bg31_gj_nonconv_S.odb: md4 doesn't match: will retry in
phase 1; file removed
What versions are you using? There was a bug related to this that
was fixed in BackupPC 3.0.0 (specifically BackupPC 3.0.0beta2).
Craig
David writes:
I installed version 3 to be able to access old archives that had no
entries in the backups file. But when I run the recovery tool I get:
The recovery program by default relies on new meta data that
is stored (below each pc/HOST/nnn directory) when a backup is
made under 3.x.
If you
Gregor writes:
and related messages for background information. In his final mail on
that thread, Craig said
Upon further inspection, it turns out the rsync XferMethod
doesn't check the hardlink limit when it is linking to
an identical file. So there are two cases you have found
John writes:
I'm using BackupPC and find it great.
Backups work fine, but restores over rsync and ssh are failing with an
rsync error:
Running: /usr/bin/ssh -q -x -i /home/backuppc/.ssh/identity -l rbackup
www.myhost.com /usr/bin/rsync --server --numeric-ids --perms --owner
--group -D
Evren writes:
I use rsync with 3.0.0 but it was the same speed with 2.x.x versions.
$Conf{IncrLevels} = [1];
I wonder, since I have:
$Conf{IncrKeepCnt} = 6;
Wouldn't it make more sense to use this?
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
or does this make
Michael writes:
Michael Mansour wrote:
I have servers which have multiple full backups and incrementals, but
neither
is removed.
I don't think they are removed until the replacements are completed.
Is there a way I can just manually delete some of the backups
Winston writes:
I hadn't thought about the file system being full. After checking just
now, this is not the answer. /var/lib has 48G available on my main hard
drive. /var/lib/backuppc, to which the spare hard drive is mounted, has
59G available.
The directory /var/lib/backuppc/log is
Winston writes:
I had been running BackupPC on an Ubuntu computer for several months to
back up the computer to a spare hard drive without problems. About the time
I added a new host (a Windows XP computer using Samba), I started getting
the following behavior:
BackupPC backs up both hosts properly
Nils writes:
The documentation shows I can use ranges to specify the
WakeupSchedule. However, when I enter 1, 8..23 in the GUI editor I
get an error. Not even 8..23 is accepted. It should be accepted, right? I put
in 1, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23
for now,
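If the editor rejects ranges, the expanded list can be generated on the command line and pasted in rather than typed by hand:

```shell
# Expand 1 plus the range 8..23 into the comma-separated form the
# GUI editor accepts.
echo 1 $(seq 8 23) | tr ' ' ','
```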
Craig Barratt writes:
Got checksumSeed 0x69647473
Read EOF:
The checksumSeed doesn't look right. If you don't specify it on
the command line, it should default to unix time().
In your case it comprises the four ASCII characters 'idts'. That
isn't coming from rsync. Hopefully
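That reading of the seed can be checked directly: the bytes 0x69 0x64 0x74 0x73 decode to the ASCII string "idts" (octal escapes are used below for printf portability).

```shell
# 0x69=\151 'i', 0x64=\144 'd', 0x74=\164 't', 0x73=\163 's'
printf '\151\144\164\163\n'
```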
vladimir writes:
I just installed BackupPC on a Debian box with apt-get install backuppc.
It installed version 2.x.x. I tested it through the web interface and it worked.
Then I decided to upgrade to the latest stable release. From there everything
went downhill.
BackupPC does not work.
Did you upgrade
Bernhard writes:
I just discovered that BackupPC (2.1.2pl1) managed to back up an 11.67GB
file via rsyncd-cygwin... how come?
Are the known limitations on
http://backuppc.sourceforge.net/faq/limitations.html#maximum_backup_file_sizes
deprecated?
Yes, it's old. Cygwin now supports large
Aaron writes:
Neither of the archive links on the web site work:
On this page: http://backuppc.sourceforge.net/info.html#lists
Link is broken:
http://sourceforge.net/mailarchive/forum.php?forum_id=503
I've fixed these links; now they are:
Arjun writes:
The server claims that it is unable to ping any of the hosts defined
in the hosts file. However, when I use `nmblookup hostname`, the IP
address that is displayed corresponds to the host, and a subsequent
pinging of that IP address is successful. What, then, could the
problem
Niels writes:
I do not see an error message here, just the aborted backup:
full backup started for directory /hsphere
Running: /usr/bin/ssh -q -x -l root 192.100.100.1 /usr/bin/rsync --server
--sender --numeric-ids --perms --owner --group -D --links --hard-links
--times --block-size=2048
Niels writes:
I installed BackupPC 3.0 and the interface is running.
But when I run a backup I get this error:
unexpected repeated share name
Sounds like $Conf{RsyncShareName} has two entries that
are the same. Remove the repeated entry.
Craig
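The symptom reduces to a one-liner: sort the share list and print any entry that appears more than once (the share names below are examples, not the poster's config).

```shell
# uniq -d prints only duplicated lines, so a repeated share shows up once.
printf '%s\n' /data /home /data | sort | uniq -d
```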
Adam writes:
I must admit I agree with Bruno on this matter, and am personally VERY
interested in seeing such a feature in BackupPC, but I don't think
it's worth my while forgoing the other excellent features of BackupPC
by using another solution such as rdiff-backup to gain it. Under
Martin writes:
Starting BackupPC: Couldn't execute //var/lib/BackupPC//conf/config.pl:
Can't modify single ref constructor in scalar assignment at
//var/lib/BackupPC//conf/config.pl line 390, near 6.97;
BackupPC::Lib->new failed
Looks like there is a typo around or before line 390 of
the
Rick writes:
Someone recently brought up some questions about securely backing up
laptops with backuppc, and it prompted me to ask about some questions
which I've been pondering.
I'm currently backing up several systems including a laptop which
happens to be running ubuntu linux.
Now
Jesse writes:
Fatal error (bad version): stdin: is not a tty
The remote shell says stdin: is not a tty. An rsync version
is expected instead, hence the error.
I suspect you have an stty in bbBackup's .login (or .bashrc etc).
There was some recent discussion on the mail list about how to
stop
David writes:
However, tonight I decided to decrease the max number of backups saved
and got the following messages:
Error: No save due to errors
Error: SplitPath must be a valid executable path
$Conf{SplitPath} doesn't point to a valid executable. That's
used for archive.
Go to
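One way to find a valid value for that setting is to locate split(1) on the server and point $Conf{SplitPath} at the result (the path varies by system, e.g. /usr/bin/split):

```shell
# Print the full path of split(1); use the output as $Conf{SplitPath}.
command -v split
```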
Jim writes:
Thanks for your personal attention on this. The XferLOG is now available in
the images directory:
http://mail.stephanco.com/backuppc/images/
The log from the email you replied to has since been overwritten since it
was only considered a partial backup. The emails yesterday
Schnell writes:
First of all, I'd like to thank the programmers for their great work.
Nice tool, guys!
Thanks.
Most people using BackupPC 3.0 (final) in our company speak German,
but for some it would be nice if we could change the language, e.g. to
English, French, ...