Nicolas MONNET writes:
To save on bandwidth (I'm using backuppc to backup servers from a
datacenter to my office), I want to run incremental backups as much as
possible.
I believe it shouldn't be too hard to write a tool to transform an
incremental backup into a full backup.
KOUAO aketchi writes:
I have a problem when I back up a PC whose IP address has changed.
When one of my Windows PCs has its IP address changed, BackupPC reports:
inet connect: Connection refused. What is the reason
for this failure?
If you mean the IP address change occurs
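One common way to handle clients whose address changes is to give the host a name BackupPC can always resolve, or to mark it as a DHCP host. A sketch based on the standard config.pl options; the hostname is illustrative:

```perl
# per-PC config.pl sketch; hostname is illustrative
$Conf{ClientNameAlias} = 'pc1.example.com';   # always connect to this name
# alternatively, list the host with the dhcp flag set to 1 in the hosts
# file so BackupPC searches for it by NetBIOS name instead of a fixed IP
```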
Khaled Hussain writes:
For one of my XP hosts I don't seem to be generating XferLOG files, only a
LOG file. I am getting a 'child exited prematurely' error in the LOG file an
hour after the backup starts for this host, and that's all it says. I
understand the XferLOG is useful for debugging.
Nick Barton writes:
Sorry if this question has already been answered; I think it is an easy
fix, but I am just not finding it anywhere. I need to be able to back up
multiple host machines from one BackupPC computer on different schedules
throughout the week, a total of about 15 servers.
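Per-host schedules are usually done with a per-PC config.pl that overrides the global schedule. A sketch; the values are illustrative, not a recommendation:

```perl
# pc/HOST/config.pl sketch: this host gets weekly fulls, daily
# incrementals, and no backups during business hours
$Conf{FullPeriod} = 6.97;    # one full per week
$Conf{IncrPeriod} = 0.97;    # one incremental per day
$Conf{BlackoutPeriods} = [
    { hourBegin => 7, hourEnd => 19.5, weekDays => [1, 2, 3, 4, 5] },
];
```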
Sim writes:
Sometimes I get error reports about tar.gz files
(BackupPC downloads them with rsync).
Below is a report with one error:
Connected to srv1.lan:873, remote version 29
Connected to module backup
Sending args: --server --sender --numeric-ids --perms --owner --group
Bryan Penney writes:
On the status page the pool information is reported as
Pool is 227.53GB comprising 1157013 files and 4369 directories (as of
2/28 08:33),
When I run df -h /dev/sda3 (the raid backuppc is on) I get:
Filesystem            Size  Used Avail Use% Mounted on
/dev/sda3
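The two numbers measure different things: the pool stores each unique file once and hardlinks every backup to it, so df sees far less data than the per-backup totals suggest. A throwaway demo of the hardlink effect (temporary files only, not the real pool):

```shell
# identical content stored once, referenced twice -- like the cpool
pool=$(mktemp -d)
head -c 1024 /dev/zero > "$pool/file1"   # 1 KB of data
ln "$pool/file1" "$pool/file2"           # a second "backup" of the same file
stat -c %h "$pool/file1"                 # link count is now 2
du -sk "$pool"                           # the 1 KB is counted once, not twice
```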
Dan Pritts writes:
An idea i had for offsites is to just run rsync against the raw device.
rsync would need to be patched to allow this, and apparently rsync has
some issues with very large files.
This should work well. However, rsync currently doesn't copy
device file contents, it just
[EMAIL PROTECTED] writes:
Hi -- I'm somewhat confused about how I thought incremental backups work
versus what is shown in the XferLOG. Perhaps some kind soul can help me out? I'm
running BackupPC 2.1.2 on FC4 against Linux and WinXP hosts.
(1) When I look in XferLog, every directory is listed
Stephen Vaughan writes:
Is there any way to get BackupPC to continue a backup regardless of
errors? I had it backing up a 2gb db file, and during the transfer the
file was modified and backuppc recognised this and aborted the backup.
Remote[1]: send_files failed to open
Justin Best writes:
Thanks for your help, Craig! I appreciate it so very much.
I've filed a bug report with Samba, as you suggested.
https://bugzilla.samba.org/show_bug.cgi?id=3592
Out of curiosity, though, it seems that you still have an open bug that
references the same problem. In
Vinícius Medina writes:
I am having this problem with BackupPC:
Running: /usr/bin/ssh -q -n -l USER* HOST* /bin/gtar -c -v -f - -C
/arquivos/ --totals .
What happens when you run this:
su backuppc
/usr/bin/ssh -q -n -l USER* HOST* /bin/gtar -c -f - -C /arquivos/ --totals
. | tar
Steve Willoughby writes:
I'm just getting started with BackupPC and, up until the last machine I added,
things were going fairly well. I'm backing up to a 250GB external USB drive,
which is being incredibly slow. It's taken something like 4 days to perform
the BackupPC_dump operation,
Laurent Mazet writes:
To summarize, for a Windows host:
- rsync over ssh doesn't work.
Yes, but I haven't tested it recently.
- rsyncd transfers only diff but you need to connect with a clear password.
Rsyncd doesn't send a clear password over the network. It uses
a digest-based
Matthias Bertschy writes:
We are using Backuppc as our main backup solution for a dozen servers.
One of our servers has a Perforce server, so we need to checkpoint its
database before backing up (it could also be a mysqldump).
Some days ago, we detected a problem on the server preventing
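The usual hook for a pre-backup checkpoint is the pre-dump command. A sketch; the login, host variable expansion, and script path are illustrative:

```perl
# run the checkpoint on the client before each dump; with
# UserCmdCheckStatus set, a non-zero exit aborts the backup
$Conf{DumpPreUserCmd} =
    '$sshPath -q -x -l root $host /usr/local/bin/checkpoint-db';
$Conf{UserCmdCheckStatus} = 1;
```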
Steve Willoughby writes:
Can I assume that if BackupPC_link gets interrupted (say, by a system
reboot), that re-running it will continue linking the backup files into
the pool from where it left off, and not start over or get confused?
Is there any special procedure for doing this correctly
Khaled Hussain writes:
1. How can I clear all pending backups other than by going into each
host being backed up and stopping it?
The only other way is to kill BackupPC. Not very graceful...
2. At every wakeup, does BackupPC reload the per-PC configs? Or do I
need to restart BackupPC
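As I understand the docs, per-PC configs are re-read as each backup starts, while the main config.pl needs a signal or restart; sending SIGHUP to the server is the usual reload. A stand-in demo of the signal pattern (the pid-file path is illustrative, and a looping sh process stands in for the daemon):

```shell
# what "kill -HUP $(cat $TOPDIR/log/BackupPC.pid)" does, demonstrated
# against a throwaway process that treats HUP as "reload"
outfile=$(mktemp)
sh -c 'trap "echo config reloaded" HUP; while :; do sleep 0.2; done' > "$outfile" &
pid=$!
sleep 0.5
kill -HUP "$pid"      # ask for a reload, not a shutdown
sleep 1
kill "$pid" 2>/dev/null
cat "$outfile"
```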
Andy writes:
I have been using BackupPC 2.1.1 (from Debian) to backup a number of
linux hosts over RSync for some time without any difficulties.
But recently I added a Windows 2000 server with several shares to be
backed up over SMB and am getting the following error in the Xfer log:
Tomasz Chmielewski writes:
Tomasz Chmielewski wrote:
Tomasz Chmielewski wrote:
Tomasz Chmielewski wrote:
(...)
Unfortunately, it breaks in the same way:
2006-03-28 13:24:20 full backup started for share S$
2006-03-28 13:33:09 Got fatal error during xfer (Didn't get entire
Tomasz Chmielewski writes:
Does this mean that in 3.x it will be safe to use rsyncd for Windows
hosts (now special utf8 characters are translated to ?)?
Yes.
Craig
Tomasz Chmielewski writes:
Les Stott wrote:
Hi,
I have a client with 50 or so pcs, all doing rsync backups throughout
the day to a backuppc server. Works Great.
We also have a linux box running a Cyrus Imap Store and we were running
rsync backups too, just getting the cyrus
Rein writes:
This is likely due to me not speaking Perl, but what's the best way to
achieve subj?
Here's how it works for me now -- is there a better way?
I'm using an expect script to shut the computer down after the backup
completes, but I don't want the output of that script to end
David Mansfield writes:
I have a file in the cpool with 32000 links. HardLinkMax is set to
31999. Let's assume this is exactly correct, because one link is
reserved for the 'pool'.
However, last night during the second full backup ever for this host, I
got the following kind of error
Lee A. Connell writes:
Which files contains the left navbar, I want to add some more links to
the navigation.
There's a config variable for this: $Conf{CgiNavBarLinks}.
Craig
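For reference, entries are pushed onto the existing list. A sketch; the hash keys follow the stock config.pl, and the URL is illustrative:

```perl
# add a custom link to the CGI navigation bar
push @{$Conf{CgiNavBarLinks}}, {
    link => "http://intranet.example.com/backup-docs",  # illustrative URL
    name => "Site docs",
};
```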
Curtis Preston writes:
My name is W. Curtis Preston, and I'm the author of Unix Backup &
Recovery from O'Reilly. The first edition sold over 40,000 copies, and
we're looking to update it. One of the things we did was add a chapter
on BackupPC.
The chapter was written by Don "Duck" Harper,
Lowe, Bryan writes:
I am trying to install BackupPC for the first time. I have done all the
preliminary work, downloading and installing everything I need to run
BackupPC on my Solaris 9 box. The BackupPC documentation says to next
untar the BackupPC-2.1.2 tarball, then apply any patch that's
Ralf Gross writes:
I was looking at the BackupPC home page and the SourceForge project page
for the bug tracker that BackupPC is using. I couldn't find any info on
how to file a bug; what is the recommended way to do this?
BackupPC doesn't use the SF bug tracker. Just post to the devel
or
Mark Coetser writes:
I am having a little trouble getting this running, I have read the docs etc
Here is my config.pl for the archive host
# Set this client's XferMethod to archive to make it an archive host:
$Conf{XferMethod} = 'archive';
# The path on the local file system where
Bill Hudacek writes:
One small note, if I may...I'm not sure about others' thoughts on this
matter, but I shared your OMG moment, Travis :-)
When I discovered BackupPC, I was very pleased with it. I still am,
lest anyone think otherwise!
However, I was very dismayed to find the word
David Rees writes:
On 6/7/06, Craig Barratt [EMAIL PROTECTED] wrote:
There is a new version of File::RsyncP that is close to release
that you could try. I can email it to you if you want.
Out of curiosity, what's new in File::RsyncP?
Support for hardlinks (also requires BackupPC 3.x
Ambrose Li writes:
On 05/06/06, Víctor A. Rodríguez [EMAIL PROTECTED] wrote:
- copy fadsutil.vbs to a new location, try to bunzip2 it, and if successful
you'll have a fadsutil.vbs with the same length and content as the original
one
This method won't work. Backuppc seems to add a
Michael Zehrer writes:
I'm doing an smb type backup and I receive tons of errors like this in
the XferLOG:
tarExtract: [EMAIL PROTECTED]
tarExtract: i/rd3s452C[2l0['E$].;OW/U+g'\D-U^YA4aePeMD05\8`hEODp#=BE_*lYnJ
tarExtract: VQ;N*A@/IBG,ZKBJpbJMIWBESI#hQ3Xi)d7Vp3u1bH11%BggPpTk0Epth`9D
Paul writes:
I'm doing my backup using the rsyncd service on the PC's.
I'm also quite new to backuppc.
I was wondering why the rsyncd method uses the last full as its base,
and does not take the last incremental into account.
My experience with rsync is mostly on Unix, and there I can mirror
a
Nils Breunese (Lemonbit Internet) writes:
Travis Fraser wrote:
What version of rsync are you using? Later versions need --devices
changed to -D in $Conf{RsyncArgs}.
I guess I need to change this for $Conf{RsyncRestoreArgs} as well?
Yes.
Craig
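One way to make the change in config.pl without retyping the whole list. A sketch; it assumes the stock RsyncArgs/RsyncRestoreArgs defaults have already been loaded:

```perl
# swap --devices for -D in both the backup and restore argument lists,
# as required by recent rsync versions
foreach my $key ( qw(RsyncArgs RsyncRestoreArgs) ) {
    $Conf{$key} = [ map { $_ eq '--devices' ? '-D' : $_ } @{$Conf{$key}} ];
}
```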
Travis writes:
I was trying to restore a folder so first I got the ok message and the
request was sent. However, the process didn't start until 2 hrs later.
Today again, for another restore job, the request was sent at 7am but
the restore did not start. The log shows:
7 success 6/14 08:53
Thomas Maguire writes:
I want most systems on my network to backup overnight but I have
several notebooks that are only attached to the network during
business hours. I reviewed the blackout settings in config.pl and
wanted to know if I was interpreting them correctly.
It seems that if I
I just released a new patch for BackupPC 2.1.2. This changes
the --devices rsync option to -D in config.pl to fix the
"fileListReceive failed" bug with recent rsync versions.
This patch includes the earlier fixes in the prior pl1 patch.
The patches can be applied to a fresh 2.1.2 release
by
Shohan writes:
I want to keep the last 6 incrementals, which run automatically, so my
configuration is as below.
My configuration:
$Conf{FullKeepCnt} = 1;
$Conf{FullKeepCntMin} = 1;
$Conf{FullAgeMax} = 30;
$Conf{IncrKeepCnt} = 6;
$Conf{IncrKeepCntMin} = 1;
$Conf{IncrAgeMax}
shohan writes:
I don't want to run automatic full backups, so
$Conf{FullPeriod}= -1
$Conf{IncrPeriod}= 0.97
That means disable all backups, both full and incremental.
Craig
On 6/23/06, Craig Barratt [EMAIL PROTECTED] wrote:
Shohan writes:
I want to keep 6 last incremental
SAJChurchey writes:
I'm trying to restore a full backup to a newly installed server after we
re-installed the OS. I'm trying to use rsync method. Whenever I try to
restore files I get the following errors on the entire pool.
Remote[2]: skipping non-regular file filename
What could be
Kai Grunau writes:
I'm using BackupPC version 2.1.0pl1 on a RedHat Enterprise 3 server.
Before last weekend I had no problems, but since then no BackupPC
backup has completed on 2 Solaris machines (no problem with the other Linux
computer).
I found the following message in the XferLOG file:
I have released BackupPC 3.0.0beta0 on SF at:
http://backuppc.sourceforge.net/
A new version 0.62 of File::RsyncP that is needed for rsync
hardlink support has also been released on SF and CPAN.
3.0.0beta0 has some substantial new features compared to 2.1.2.
New features include:
*
John writes:
On 7/11/06, John Villalovos [EMAIL PROTECTED] wrote:
Searching through the email archives, it seems to be a known issue
that BackupPC will exceed the hard link count specified as a maximum.
And then I got the impression that there is a yet-unreleased
version which
Ilya Rubinchik writes:
What is this??
2006-07-15 09:05:25 Backup failed on aaa.com (fileListReceive failed)
2006-07-15 09:05:27 aaa.com: overflow: flags=0x63 l1=111 l2=1819243374,
lastname=dev/log
2006-07-15 09:05:27 aaa.com: overflow: flags=0x65 l1=0 l2=-251658240,
lastname=dev/log
Frank writes:
I have just installed BackupPC and have three hosts. When I access the UI to
view the activity of a host I get the following message:
Error: Only privileged users can view information about host store1.
I am not sure why I cannot view this host's info. I set up the host name
Nilesh writes:
I am continuously getting the following entry in the log file, and it keeps growing.
--
2006-07-21 17:09:40 can't exec
/usr/local/BackupPC-2.1.2/bin/BackupPC_trashClean
for trashClean
Does that path exist? Is that file executable? Does the
BackupPC user have permission to
Nicholas writes:
You must remember that by default BackupPC runs as user backuppc with
limited access. You could use sudo over SSH for local backups.
i.e. $Conf{RsyncClientCmd} = '/usr/bin/sudo $rsyncPath $argList+';
...plus drop the + from $argList:
$Conf{RsyncClientCmd} =
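A fuller sketch of the sudo-over-ssh variant (the login name and paths are illustrative; note the plain $argList, since the '+' form adds a layer of shell escaping that gets in the way here):

```perl
# per-PC config.pl sketch: ssh to the client as an unprivileged user,
# then let sudo run rsync as root
$Conf{RsyncClientCmd} =
    '$sshPath -q -x -l backup $host /usr/bin/sudo $rsyncPath $argList';
```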
Matt writes:
I am using the rsyncd transfer method and would like each new backup to
be based on the most recent incremental, thus completely avoiding full
backups after the first initial backup.
Is this now possible with version 3.0.0 beta? Maybe $Conf{IncrLevels}
can be used for this
ken writes:
Anyone having problems with File-RsyncP-0.62? It won't install with cpan.
Linux: 2.6.17-1.2142_FC4
Perl: perl-5.8.6-24
There is a new version File-RsyncP-0.64 on SF and cpan that
fixes this problem.
Craig
Vincent writes:
I am trying to run BackupPC beta 3 on my Ubuntu server.
I installed it and ran the configure.pl script. There were no errors.
But when I try the init script /etc/init.d/backuppc start I get the
following persistent error:
Starting backuppc: No language setting
BackupPC::Lib->new failed
Fred McCann writes:
I'm trying to install BackupPC-2.1.2 on FreeBSD 6. I'm entering the
BackupPC-2.1.2 directory and running as root:
./configure.pl
I get through all the questions and then this happens:
Ok, we're about to:
- install the binaries, lib and docs in
Tony Molloy writes:
I've just upgraded backuppc to 3.0beta1. The upgrade went OK and I did a
small test backup which went OK. I'm having several problems with the
Cgi interface though.
When I look at the Log File I sometimes get the following:
Software error:
Undefined subroutine
Jonathan writes:
I have updated my system to 3.0.0beta1, and when I try to start the
backuppc server I always get this error:
2006-08-10 13:38:09 Another BackupPC is running (pid 3151);
quitting...
BackupPC reads the PID file (eg: $TOPDIR/log/BackupPC.pid)
and checks if that process
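If the server died without cleaning up, the pid in that file points at a process that no longer exists, and removing the stale file lets BackupPC start. A sketch of the check (paths are illustrative; the huge pid simulates a stale entry):

```shell
piddir=$(mktemp -d)                     # stand-in for $TOPDIR/log
echo 99999999 > "$piddir/BackupPC.pid"  # a pid that cannot be running
pid=$(cat "$piddir/BackupPC.pid")
if kill -0 "$pid" 2>/dev/null; then
    echo "BackupPC (pid $pid) really is running"
else
    echo "stale pid file, removing it"
    rm -f "$piddir/BackupPC.pid"
fi
```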
Cameron Dale writes:
I'm backing up several different machines on my local network to my
debian-based server using BackupPC. I'm using rsyncd for all of this,
2.6.8 on the Linux machines, and cygwin-rsyncd-2.6.2_0.zip on the
Windows machines. One of the machines is even dual booted, and the
Cameron Dale writes:
Craig Barratt said the following on 11/08/2006 1:47 AM:
Yes, the entire set of files is being transferred on an
incremental with a linux boot.
But why is this happening? What is the difference between Windows and
Linux that would cause this?
That means some meta
Chris Stone writes:
Had a bit of time to spend on this and I did get the upgrade installed
without having to hack the scripts at all. I DID have to upgrade the
Encode package to 2.18 and that took care of it all and the install
completed and backuppc started successfully.
Good detective
Benjamin Kudria writes:
On Friday, August 11 2006 1:56, Loyd Darby wrote:
Try this (note the extra comma):
$Conf{BackupFilesOnly} = ['/home/bkudria',
'/usr/local/vpopmail/blueboxtech.com',];
I am guessing that it isn't recognizing the end of the string and is
re-using what is in the
Nicolai Nordahl Rasmussen writes:
- I thought of using the $Conf{BlackoutPeriods} to simply define the
whole sunday as a blackout period, but I'm afraid the incremental
backup would then just be pushed to run monday instead?
This should work correctly. Yes, Monday will do an incremental
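A sketch of the Sunday blackout (weekDays uses 0 for Sunday; hours are fractional):

```perl
# per-PC config.pl sketch: no backups at any hour on Sunday
$Conf{BlackoutPeriods} = [
    { hourBegin => 0, hourEnd => 23.99, weekDays => [0] },
];
```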
Ambrose writes:
I am using 3.0.0beta1 and I am seeing that in the Status screen, almost
everything is reported as zero (as copied below). I am wondering if others
are seeing this or if I have done something wrong (maybe a permission
problem?), or if this is just a case of something being not
Ambrose writes:
On 12/08/06, Craig Barratt [EMAIL PROTECTED] wrote:
Just to confirm: this was an upgrade, and it used to work?
This is a new installation. However, I did at first install it to run
as user daemon before managing to figure out how to create
a separate backuppc user
Cameron writes:
I'm thinking of changing perms, owner, group, and maybe times all to
no-OPTIONs. However, I'm concerned about how this will affect the
program as I don't understand how File::RsyncP works, as it says in
the comments. Can I go ahead and do this? Do I need times off or just
the
Nathan writes:
5. Confirmed that backups succeed when I do su - backuppc and start
BackupPC manually with the -d option.
However, backups fail (fileListReceive failed) whenever I reboot and
just let the provided init.d script start backuppc, I presume because
there's never a chance to
Cameron writes:
On 8/14/06, Craig Barratt [EMAIL PROTECTED] wrote:
Cameron writes:
I'm thinking of changing perms, owner, group, and maybe times all to
no-OPTIONs. However, I'm concerned about how this will affect the
program as I don't understand how File::RsyncP works, as it says
Nicolai Nordahl Rasmussen writes:
I only have the local dns in the resolv.conf file
andromeda:~# cat /etc/resolv.conf
search corena.dk
nameserver 10.5.0.3
I've added an entry for the server in the hosts file:
andromeda:~# cat /etc/hosts
127.0.0.1 localhost.localdomain localhost
Nathan Barham writes:
Thanks for the reply. I took the "Last error is fileListReceive
failed" from the Backup Summary page for the host in question. I should
have posted from the actual error log, which has this ...
SNIP ...
Fatal error (bad version): Permission denied
Marc Prewitt writes:
We've been trying to troubleshoot one of our bigger dumps which keeps
failing silently after a few hours. It's actually not that big in terms
of files (9221) but the files are rather large. They are btrieve-like
database files.
I tried running the BackupPC_dump
David writes:
I am having a problem using BackupPC over a slowish VPN link with large
files. If BackupPC is aborted (mainly due to signal=ALRM) when
transferring a large file, BackupPC appears to delete the partially
transferred file, preventing rsync from restarting where it dropped off.
Les Stott writes:
When I edit Email Settings (Dest Domain or others) and save, it clobbers
the config.pl file, and BackupPC cannot start.
It's all to do with the email headers section.
Previously I had this in the config file.
$Conf{EMailHeaders} = <<EOF;
MIME-Version: 1.0
Vinicius writes:
I want to do backups every 2 hours on a single host using
BackupPC. In order to do this, without success, I included these
settings in the PC's config.pl:
$Conf{BlackoutGoodCnt} = -1; #in order to not use blackout
$Conf{WakeupSchedule} = [10,12,14,16,18,20,22]; #to make it
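For comparison, a sketch of a 2-hour cycle (values illustrative; IncrPeriod has to be a little less than the 2-hour wakeup gap, measured in days, for a new incremental to be due at each wakeup):

```perl
$Conf{WakeupSchedule}  = [0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22];
$Conf{IncrPeriod}      = 0.07;  # ~1.7 hours, just under 2/24 of a day
$Conf{BlackoutGoodCnt} = -1;    # disable blackout, as in the post
```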
David writes:
Les Mikesell wrote:
On Thu, 2006-08-17 at 03:23, David Simpson wrote:
$Conf{ClientTimeout} = 3600*6; # 6 Hours!!
Can you crank it much higher and try to get your first run
to complete over a weekend? If you are using rsync over ssh
and
Toby Johnson writes:
I'm a bit confused with the description of multi-level incrementals in
3.0.0b1. Does this mean in plain English that incrementals will only
backup files that have changed since the previous incremental instead of
since the previous full backup?
Yes.
Do these
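A sketch of what the 3.0.0b1 option looks like (the levels chosen are illustrative):

```perl
# each backup in the cycle is done relative to the most recent backup of
# a strictly lower level, not always relative to the last full
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
```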
David Koski writes:
I made some changes to config.pl to reduce the number of backups:
$Conf{FullKeepCnt} = [2,0,0,2,0,0,2];
...whereas before I had [2,2,2,2,2,2,2].
Now I need to recover files from a year ago and they are actually still there
but do not show up in the GUI. The GUI only
Rodrigo Real writes:
Yes, but your home path has some strange spaces, it should be
something like this:
backuppc:x:101:407:added by portage for backuppc:/var/lib/backuppc:/usr/bin/sh
sh usually is in /bin, but I am not sure about Gentoo. If you still
have trouble on this, check your sh
Les Stott writes:
David Wimsey wrote:
Let me start off with sorry if this has been answered and I didn't find
it in the archives, I'm betting someone else has already asked, but here
it is anyway.
I had an instance of BackupPC running for nearly a year and had to stop
for various
naroza writes:
I'm a little confused. For those that don't know me, I'm running a Gentoo
installation, trying to get BackupPC v2.1.2 running so that I might back up some
Linux clients via rsync.
I have the following in /etc/backuppc/:
-rwxr-xr-x 1 root root 64379 Aug 23 02:35 config.pl
R G writes:
I am further testing my BackupPC setup. I just attempted to restore a
temporary directory that I had created the night before in the /etc directory.
I deleted the directory and attempted to do a restore. I get the following
error in the log:
Running: /usr/bin/sudo
Nils writes:
Trasher wrote:
I use BackupPC for my local network backup and all is running very
fine :)
I'd now like to back up a website, which I can only access via the ftp
protocol.
A simple wget command does actually the trick, but I'd prefer using
backuppc capabilities. Is it
Sturla writes:
I restored to the same directory as the backup was taken from and the
symlinks LOOK ok when I stat them, but they just won't work.
If I delete the symlink and create it again with the info from stat it
works, but it's kinda tedious doing this on every symlink.
I tried to write
Johan Ström writes:
I just got BackupPC setup and working, (what a nice piece of
software :D), when I had a disk crash on the system drive of the
system running backuppc... The disk where backuppc has its top-dir is
however not the same and everything there is fine.
After a fresh
Jean-Michel Beuken writes:
Is this serious?
Running: /usr/local/samba/bin/smbclient 192.168.23.11\\PC-Boumal
-U boumal -E -N -d 1 -c tarmode\ full -TcrX - ntuser.dat.LOG \\Local\
Settings\\Temp\\\* \\Local\ Settings\\Temporary\ Internet\ Files\\\*
full backup started for share
Kevin,
Other users have noted that authentication for incrementals doesn't work
in smbclient version 3.0.23.
I don't know if there is a fix or an understanding of the root cause.
Down-revving smbclient is one solution.
Craig
Stian Jordet writes:
On Tue, 2005-09-13 at 22:38 -0700, Craig Barratt wrote:
Carl Wilhelm Soderstrom writes:
I'm running out of disk space on my backup server, and it has run out of
space on a couple of occasions. When it does this, some hosts 'forget' all
their old backups.
Tomasz writes:
I have X hosts, and would like to write a script to retrieve last
backups from these hosts, and save it as host1.tar, host2.tar etc. - to
later write it to a tape.
What is the best method to do that?
Should I just tar the current BackupPC archives like below?
tar -cf
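A plain tar of the pc/ trees captures BackupPC's internal format (mangled file names, compressed contents, attrib files), so the usual approach is BackupPC_tarCreate, which reconstructs a normal tar stream. A sketch that only prints the commands it would run (hosts, share, and paths are illustrative):

```shell
TARCREATE=/usr/share/backuppc/bin/BackupPC_tarCreate
for host in host1 host2; do
    # -n -1 selects the most recent backup; -s / names the share to dump
    echo "$TARCREATE -h $host -n -1 -s / . > /tape-staging/$host.tar"
done
```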
Stian Jordet writes:
Umm, sorry to nag you with this again, but I haven't taken a backup for
almost two weeks, trying to get this fixed first. Was this the only way,
and what am I doing wrong? What info does it need that it doesn't find?
I think I found why it wasn't working. You (seem)
Paul Fox writes:
My backup pool disk is 96% full, and BackupPC has stopped doing
backups. I have no problem with that.
I don't look at the PC status page every day, and only found out
that things were amiss when I got one of the "your machine hasn't
been backed up for a week" messages. It
Jacob writes:
I am running BackupPC 2.1.2pl2. I have a problem: I cannot queue
multiple restores. I request a direct restore of a user's files, and that
job goes into the queue. I then request another restore while the first
one is running. I get a message:
but a job is currently
Tomasz writes:
I want to restore the profile of a user.
I would like to know, how big it is.
How can I do it?
Basically, I would like to check the directory size in BackupPC web
interface - right now it only says the size is 4096, which is perhaps
true from a technical point of view,
Shawn writes:
I hope this is something simple -- I can't get my archive to run via cron
(as user backuppc). Nor will it run manually in a terminal.
My command is:
/usr/share/backuppc/bin/BackupPC_archiveHost
/usr/share/backuppc/bin/BackupPC_tarCreate /usr/bin/split /usr/bin/par2
Paul writes:
I'm pretty confident I'm okay on this, but just to be sure:
I have a host that I've reinstalled with a new OS. I want to
keep its former backups around for quite a while, in case I need
something. The hostname remains the same (stump). In the
BackupPC configs, I've done
Ariyanto writes:
I have implemented BackupPC to back up my servers and
it is doing great. But I have a
silly question in mind: what is the meaning of
$Conf{FullPeriod} = 6.97? Does it
mean a whole week? Can I just put it as 6, without the .97?
The goal here is to keep the schedule at the same time of day: with a
period just under 7 days, a new full is already due when the same weekly
wakeup comes around, instead of drifting one wakeup later each week.
Mikael writes:
I just installed BackupPC 2.1.2 and I can't find how to define per-PC
directories to back up. At the moment it just backs up the same
directories from all hosts.
The web page mentions that this should be possible. Is it only in the
version 3 beta?
You can do this in 2.x
Stephen Hemminger writes:
I am running backuppc server on Ubuntu Dapper package (v2.1.2)
My linux (rsync) clients fail without getting filelist.
XferLog:
Contents of file /var/lib/backuppc/pc/deepthought/XferLOG.bad, modified
2006-11-04 23:12:01
Running: /usr/bin/ssh -q -x -l root
Riaan writes:
I was wondering the same thing myself; also, when will version 3 go out
of beta, i.e. when will a final version of 3 be released?
I will do one more 3.0.0 beta by the end of this month.
That should be very close to the final 3.0.0 release.
Even though the 3.0.0 beta releases are quite
Tomasz writes:
It looks exactly the same in the web interface - one letter folder
names, empty inside.
I started the backup manually, and it produces the following error:
/srv/backuppc/bin/BackupPC_dump -v -f windows_server
(...)
98208 ( 3688.7 kb/s)
Jason writes:
Hi all. Nobody has responded to my other messages requesting help, so
I'm trying again. I'm using the 2.1.2 version.
I have one Windows machine that is backing up flawlessly (other than
NT_SHARING_VIOLATIONs that are unavoidable). I have another that is
failing when it
David writes:
On 11/7/06, Craig Barratt [EMAIL PROTECTED] wrote:
I will do one more 3.0.0 beta by the end of this month.
That should be very close to the final 3.0.0 release.
Even though the 3.0.0 beta releases are quite stable, given the
wide deployment of BackupPC I wanted
Stephen Joyce writes:
Is anyone doing windows backups (including open files) using the volume
shadow copy service?
It seems that it shouldn't be too hard to even do bare-metal restores by
combining VSS with fileacl or setacl to record and restore NTFS ACLs on
files and dirs.
Does anyone
David writes:
The following messages are in my BackupPC log:
2006-11-10 03:58:39 Botch on admin job for admin : already in use!!
2006-11-10 03:58:39 Botch on admin job for admin : already in use!!
2006-11-10 03:58:39 Botch on admin job for admin : already in use!!
2006-11-10 03:58:40
Dale writes:
It does not look like any of the files are changing during the backup.
Also, I think this only started happening since we upgraded BackupPC from 2
to 3. It is happening on all our backup servers as well, not just one.
Anything I can do to help you debug this, let me know.
Jerry writes:
I've been using BackupPC 3.0 beta for a while; it's great to be able to view and
edit right from the web interface. I don't know if anyone else has a
problem with the config.pl file after changing it through the web interface.
The error that comes back is "can't read config.pl file". This is
I just released File-RsyncP-0.66 on CPAN and SF.
This is a bug fix release. Here are the changes:
- Support turning off --perms option, reported by Cameron Dale.
- Applied patches from Mark Weaver to handle skipping duplicate
file names.
- Added FileList/snprintf.c to handle Solaris