nilesh vaghela wrote:
For the other 25% of PCs, backup is dead slow; the data transfer runs at about 20 kbps.
I found a few things that might be causing the problem.
1. Spaces within the directory name. (I am not sure, but it seems so.)
2. Tree structure
3. A single quote (') in a directory name causes problems.
Presently we have
Evren Yurtesen wrote:
I am saying that it is slow. I am not complaining that it is crap. I
think when something is really slow, I should have the right to say so, right?
There is such a thing as tact. Many capable and friendly people have
been patient with you, and you fail to show any form of
Evren Yurtesen wrote:
                  Totals                    Existing Files    New Files
Backup#  Type     #Files  Size/MB  MB/sec   #Files  Size/MB   #Files  Size/MB
245      full     152228  2095.2   0.06     152177  2076.9    108     18.3
246      incr     118     17.3     0.0076
Evren Yurtesen wrote:
I know that the bottleneck is the disk. I am using a single IDE disk to
take the backups, with only 4 machines and 2 backups running at a time (if I
remember right).
I see that it is possible to use RAID to solve this problem to some
extent, but the real solution
Michael Mansour wrote:
I'm wondering why the full backups numbered 2 are not going back
down to 1 to free up some space on the server?
In the global Schedule, I have the following:
FullPeriod: 6.97
FullKeepCnt: 1
FullKeepCntMin: 1
FullAgeMax: 7
and it's my understanding that backuppc should
Frej Eriksson wrote:
I sent an e-mail to the list last week and got good answers, so now I
have tested BackupPC for a short time; the result has been satisfying.
But as always, some new questions have popped up. Let's presume that the
server that runs BackupPC and stores all backed up data crashes.
Peter,
For testing purposes, you may reduce the alarm period, but under
practical circumstances, it must be large enough that it doesn't cut off
backups that would finish, had they been given the time to collect
enough file information. The behavior also depends on the transport
mechanism
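If I recall correctly, the alarm period being discussed here is the client
timeout setting, in seconds. A sketch of what raising it looks like (setting
name from memory; check the BackupPC docs for your version):

```perl
# Default is 72000 (20 hours); raise it for hosts whose full
# backups legitimately take longer than that to enumerate and send.
$Conf{ClientTimeout} = 72000;
```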
John,
IMO, the point behind BackupPC is to use cheap, easily upgradeable disk
media to make backups available and easy. That kind of steers me in the
direction of several low-end backup servers, either with separate
storage or all sharing a big fat fiber channel NAS. Buying a high end
OverlordQ wrote:
The Unicode versions of several functions permit a maximum path length
of approximately 32,000 characters composed of components up to 255
characters in length. To specify that kind of path, use the \\?\ prefix.
http://msdn2.microsoft.com/en-us/library/aa365247.aspx
If the whole share is empty, that is considered indistinguishable from a
general failure. You can control that with
$Conf{BackupZeroFilesIsFatal}. Check the docs for more details.
JH
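For reference, the per-host override is a single line in that host's config
file (the host name here is just an example):

```perl
# pc/myhost.pl -- treat an empty share as non-fatal for this host only
$Conf{BackupZeroFilesIsFatal} = 0;
```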
Brendan Simon wrote:
I'm getting a fatal error when backing up an empty directory.
BackupPC server is
Jim,
Here is a snippet from my exclude list, which works using rsyncd on a
Win2k box:
/Documents and Settings/*/Local Settings/Temporary Internet Files/*
The spaces are not a problem, for me at least. But, I did have
considerable difficulty getting rsyncd.conf to behave when I placed the
All versions of Windows have a limit of 250-ish characters maximum for a
full path, including the filename and extension, regardless of file
system. I'm not aware of a lower limit imposed by the file system or
OS, but it's likely related. Are you running Unicode-16 character set
or UTF-16 on
Tareq,
That error means you logged in, but for some reason (usually permissions
problems), your logged in user cannot see those files.
Have you tried to log into that share manually on your Linux box, using
a similar command line? You can leave off a few flags and just log in
to poke around.
Hey all,
I suddenly had an urge to do some work on a particular configuration
file and wanted to determine all the changes that had occurred to it
over the lifetime of its backups. Is there a simple command line tool
that shows all the revisions that have been transferred to my pool,
either
Jeff Schmidt wrote:
On Fri, 2007-02-23 at 16:47 -0600, Jason Hughes wrote:
Basically, I'm looking for a quick way to find the filenames and/or
backup numbers I should use to get at all versions of a particular file.
how 'bout a commandline browser?
something like:
links
http
Jason B wrote:
However, the transfer always times out with signal=ALRM.
[...]
Somewhat unrelated, but of all these attempts, it hasn't ever kept a
partial - so it transfers the files, fails, and removes them. I have
one partial from 3 weeks ago that was miraculously kept, so it keeps
Nils Breunese (Lemonbit) wrote:
Any ideas on how we can reduce the load? More/less nightly jobs? Less
concurrent backups? Other tips? We used to backup 15 servers onto one
BackupPC server, but now almost all of our backups are failing and the
load is through the roof. Can we just go and
Perl likes this:
$string = 'Hello ' . `executethis` . ' test\n';
You probably want to surround the whole string in single quotes, but use
dot-concatenation to string the pieces of command together. I didn't
try what you have below, but I did notice that backticks weren't being
executed if
Rob Shepherd wrote:
Thanks for the reply.
Forgive my ignorance, but if the files are not in a direct access
format, then how does rsync work?
rsync compares local and remote file trees before sending deltas etc.
Does the rsync perl module do some translation magic or somesuch?
I don't
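For what it's worth, the block-matching idea behind rsync can be sketched in a
few lines. This is a toy illustration in Python, not what File::RsyncP actually
implements; the real protocol uses rolling weak checksums plus strong checksums
so the sender can match blocks at any offset:

```python
def block_sums(data: bytes, size: int) -> dict:
    """Map each fixed-size block of the receiver's old copy to its index."""
    return {data[i:i + size]: i // size
            for i in range(0, len(data), size)}

def delta(old: bytes, new: bytes, size: int = 4):
    """Emit ('copy', block_index) for blocks the receiver already has,
    and ('data', literal_byte) for everything else."""
    sums = block_sums(old, size)
    out, i = [], 0
    while i < len(new):
        chunk = new[i:i + size]
        if chunk in sums:                 # receiver can reuse this block
            out.append(('copy', sums[chunk]))
            i += size
        else:                             # send the literal byte
            out.append(('data', new[i:i + 1]))
            i += 1
    return out

def patch(old: bytes, ops, size: int = 4) -> bytes:
    """Rebuild the new file from the old copy plus the delta ops."""
    out = bytearray()
    for op, arg in ops:
        if op == 'copy':
            out += old[arg * size:(arg + 1) * size]
        else:
            out += arg
    return bytes(out)
```

So the Perl module doesn't need the remote files in any special format; it only
needs block checksums of the local copy to decide what the remote side must send.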
Could be many things:
- Make sure you have added the <Location> ... </Location> block in your
httpd.conf that points to BackupPC.
- Make sure you have restarted httpd so it reads the config.
- Check that your htpasswd file has been created for authorization
purposes. This file will contain all the
Dave Fancella wrote:
I went ahead and did this for now, but it's still not quite the right
solution. The laptop is dhcp because it periodically goes wardriving, so a
solution where I can have it dhcp is best. :) Still, it'll be some months
again before I might need it to leave the house,
Timothy J. Massey wrote:
[EMAIL PROTECTED] wrote on 02/01/2007
12:22:18 AM:
Timothy J. Massey wrote:
rsync: read error: No route to host
This one would concern me most. I thought there was a note somewhere in
the docs that says clients should have reverse DNS set up for them,
When I say 3 different locations, I don't mean 3 different floors of
the same building. I mean three different client sites, miles apart,
with completely different *everything*, including network hardware
brand. Some of them are HP ProCurve switches (our preferred brand)
but nowhere near
James Ward wrote:
it looks like they're going to all get started at the same time again
due to waiting on the nightly process to complete after the longest
of these backups.
Does version 3 get me away from this scenario?
Yes. Version 3 doesn't need nightly processing to be mutually
[EMAIL PROTECTED] wrote:
Maybe I shouldn't chime in, because I've only been half following this
thread, but I can't help wondering if you've looked into all the
firewall/timeout possibilities? Sometimes those settings get hosed during
an upgrade too.
Not a bad thing to look into. I
Willem Viljoen wrote:
I have inserted the username and password required to make backups and
it works, full and incremental. When turning Simple File Sharing
off, incremental backups fail with the message: backup failed (session
setup failed: NT_STATUS_LOGON_FAILURE). My printer
Simon Köstlin wrote:
I think TCP is a safer connection, or does that not matter?
Also, when I click on a PC in the web interface it takes around 20-30 seconds
until the web page appears with the backups that were made. I thought that
would be better with another connection. But that time is
All of my excludes look like this:
$Conf{BackupFilesExclude} = ['/proc', '/var/named/chroot/proc', '/mnt',
'/sys', '/media'];
$Conf{XferMethod} = 'rsyncd';
$Conf{RsyncShareName} = 'wholedrive';
They seem to work fine. I'm using 3.0.0beta3. Is your rsync share name
correct? Shouldn't your
Clemens von Musil wrote:
if version 3 will turn from beta to stable - will it be possible to
migrate an existing system, with filespool etc., towards the newer
version?
Is it already now possible to outline what I need to do the migration?
You mostly just download and install over the existing
Byron Trimble wrote:
All,
All of a sudden, none of my backups (rsync) are working. I'm getting
'Unable to read 4 bytes' for each backup. Any insight?
I had this happen to me when I had an old File::RsyncP version using
protocol 26 trying to connect to rsyncd that was at protocol 29.
As silly as it may sound, I have had some success using VirtualPC or
VMware or similar PC simulators rather than trying to restore a Windows
PC from scratch. The beauty of it is, you can have several sitting
around on the hard drive of the host OS, and when one crashes and burns
(as Windows
Arlequín wrote:
Hello, David.
I use a stand alone rsync + cygrunsrv install.
The service rsync.exe is reported as running by user SYSTEM.
SYSTEM has all the perms activated on directory
C:\Documents and Settings\jdoe\Desktop
But I'm getting 'chdir failed' when rsync'ing.
rsync -av [EMAIL
This was happening to me when I was using rsyncd and File::RsyncP on the
server that ran protocol version 26. Upgrading it to run protocol 28
with CPAN fixed my problem. You said ssh+rsync, not rsyncd tunneled
through SSH right? So maybe this doesn't apply to you.
JH
Randy Barlow wrote:
The BackupPC system is a server-pull model. There is no such thing as a
missed backup because the server keeps the schedule. If the server is
down, the backups will run as soon as they are allowed to run (taking
into account blackout periods and minimum uptime requirements). Making
two or
Joe Casadonte wrote:
Using 3.0.0beta3, backup client is WinXP Pro via rsyncd.
I have an 80 GB USB hard drive that I'd like to back up if it's
connected. If it's not, then I'd like the rest of the laptop backed
up. I have 'BackupZeroFilesIsFatal' unchecked. Here's what I get in
the log:
Unfortunately, yes.
What you might want to do is put some of the larger directories in the
BackupFilesExclude list for that client. Then, do a full backup.
After that backup succeeds, remove one of the excluded folders and
trigger another backup. Rinse, repeat.
This way you will populate
From the documentation:
Other installation topics
*Removing a client*
If there is a machine that no longer needs to be backed up (eg: a
retired machine) you have two choices. First, you can keep the
backups accessible and browsable, but disable all new backups.
Alternatively,
Sorry, I'm not great at deciphering linux diagnostics (I'm relatively
new to it--a year or two), but I did a little poking around to see what
might be causing trouble. Wikipedia had these choice bits to say about
the C3 chip design:
C3
* Because memory performance is the limiting
Timothy J. Massey wrote:
The C3 is slow. I get it. I already *knew* that. However, the
performance numbers I posted demonstrate pretty clearly that the
failure is not in a simple lack of CPU power, but in truly how *much*
CPU power rsync demands. I get triple the performance in
[EMAIL PROTECTED] wrote:
I routinely hit 100% CPU utilization on the Via C3 1GHz Mini-ITX
systems I use as backup servers. I will grant you that the C3 is not
the most efficient processor, but I am definitely CPU-limited. I too
have 512MB RAM, but the machines are not swapping. And that's
You might consider doing a little Perl script rather than shell for the
formatting script. At least that way, you can launch the format command
as a pipe, read its output (the 11/25000 followed by a bunch of ^H
characters to back up over itself), parse it, then output something more
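A sketch of that parsing step (in Python just to show the idea; the
'11/25000' counter format is taken from the message above):

```python
import re

def latest_progress(raw: bytes):
    """Return the most recent (done, total) pair from progress output
    that uses backspace (^H) characters to overwrite itself in place."""
    # Replacing the backspaces with spaces leaves the successive
    # counters in the stream in chronological order.
    text = raw.decode("ascii", "replace").replace("\x08", " ")
    pairs = re.findall(r"(\d+)/(\d+)", text)
    if not pairs:
        return None
    done, total = pairs[-1]
    return int(done), int(total)
```

From there the script can print a cleaner one-line status, or feed a
percentage to whatever front end is watching the format job.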
Holger Parplies wrote:
Paul Harmor wrote on 01.01.2007 at 20:51:43 [[BackupPC-users] OK, how about
changing the server's backuppc process niceness?]:
I have only 2 machines (at the moment) being backed up, but every time
the backups start, the server system slows to an UNUSEABLE crawl,
I'm wondering now how to exclude things like /proc globally and per-PC.
You cannot exclude something globally, then exclude more per-PC. The
per-PC settings simply override whatever was set globally, since it's
just setting a perl variable. I suppose you could actually write perl
code
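To illustrate the override behavior (paths here are examples only): the per-PC
file replaces the global value outright, so global entries must be repeated if
they are still wanted:

```perl
# conf/config.pl (global):
$Conf{BackupFilesExclude} = ['/proc', '/sys'];

# pc/somehost.pl -- this REPLACES the list above for that host;
# it does not merge with it:
$Conf{BackupFilesExclude} = ['/proc', '/sys', '/tmp'];
```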
I had this happen to me. In my case, I had an old version of
File::RsyncP. If you go to cpan and type 'install File::RsyncP', it
will tell you if you are up to date or not. The older protocol (v.66 I
think) had a bug in it. I recall the new version is v.68 or v.69.
Adjusting the timeout
You may need to stop/restart the rsyncd service to make it read the
rsyncd.conf file on windows.
I wanted to mention that there was some bug that I ran into (you're not
seeing it yet) when the backuppc was using protocol version 26 and
windows running rsyncd. You might want to update the
seems to be crap (no
/etc/init.d/rsyncd) and ps aux shows nothing more than bash running,
how can I be certain rsyncd is stopped?
I had updated the server to File-RsyncP-0.68 when I started this last
week. I think that is current enough?
Thanks again,
Jim
On 12/14/06, Jason Hughes [EMAIL
scartomail wrote:
Hi everyone,
Let's say I have this in my rsyncd.conf on my Windows box:
[Foto]
path = E:/FOTO
Is there any way to add more directories to the path variable?
It wouldn't make much sense to do that, because multiple paths would
then need to be merged as a single view
For what it's worth, I started with 2.1.2pl2 stable and had per-machine
configs for each machine working fine. When I installed the 3.0.0beta3
as an upgrade OVER the existing install, it worked fine. Maybe there's
something different about the install script that differs between
upgrade and
I have a Win2k box running the rsyncd package. It is over an 802.11g
link (about 1MB/s throughput when copying via windows shares manually,
but over rsync it's getting closer to 350k). Thus it takes about 40 or
so hours to backup the system. I've taken to excluding tons of stuff
just to get
Fabio,
Usually, when it complains that Apache can't connect to BackupPC, it's
because BackupPC isn't running. You can log in as root, then do
'service backuppc restart' and see what it does. It should shut down
first, then start. I expect it to say 'failed' when trying to shut
down,
You can do this:
cpan
install Compress::Zlib
It should either fetch and compile the perl module, or tell you that it
is already up to date.
JH
Ariyanto Dewo wrote:
Hi all,
Thanks for the response to my last message 'backup
backuppc'; I was able to figure it out and make it work. But now I have a
I had no luck with getting rsyncd on Windows to work with spaces in
filenames through the config file. I resorted to using the 8.3
filenames instead, ie:
secrets file = c:/Progra~1/rsyncd/rsyncd.secrets
The easy way to find them is to do a dir /x and you get both the long
and short names.
Are you missing a double-quote on the AuthName line? That might confuse
the parser, causing who knows what problems.
JH
Krsnendu dasa wrote:
AuthName BackupPC
change the command from
/usr/bin/smbclient gandolf\\C$ -U backup to /usr/bin/smbclient
gandolf\\Eric -U backup it connects to the Eric folder just fine.
I don't know that much about sharing and networking but it's like the
default share on C is messed up.
What next?
Jason Hughes wrote:
Craig Barratt wrote:
Jason writes:
Since I finally got 2.1.2pl2 working, I decided to upgrade to 3.0.0beta2
(a glutton for punishment, I am). Everything went swimmingly until I
tried to look at any logs or view the config files either for clients or
the general system, via the CGI
Since I finally got 2.1.2pl2 working, I decided to upgrade to 3.0.0beta2
(a glutton for punishment, I am). Everything went swimmingly until I
tried to look at any logs or view the config files either for clients or
the general system, via the CGI interface.
Here's what I get from the web
scartomail wrote:
I cannot log in with a normal user.
The only user I can log in with at http://backuppc-server/backuppc/ is
backuppc?
[...]
Still unable to log on to backuppc.
I did notice the file /etc/backuppc/htpasswd, in which only the user
backuppc is mentioned.
Here might be the
Make sure you have set an admin user to be the user name that should
have complete access to BackupPC from the CGI:
$Conf{CgiAdminUsers} = 'panther';
Without this set, anyone you log in as is only a user, and can only see
the machines that the hosts file declares to be associated with that
I did get it to transfer 10gb of the 12gb file manually using
smbclient. For whatever reason, I guess there was a 20 second gap in
the transfer there and it timed out. I had to shut down smbclient,
then open it again to establish a good connection to the server, and
I'm using 'reget' to get
Craig Barratt wrote:
Jason writes:
This took 40 hours to run, and backed up a lot, but when it got to a 12gb file, it choked.
Here's the XferLog Errors:
Error reading file \video\2005\video2005raw.avi : Call timed out: server did not respond after 2 milliseconds
Didn't
Hi all. Nobody has responded to my other messages requesting help, so
I'm trying again. I'm using the 2.1.2 version.
I have one Windows machine that is backing up flawlessly (other than
NT_SHARING_VIOLATIONs that are unavoidable). I have another that is
failing when it gets to a very large
Les Stott wrote:
I thought maybe excluding that particular file would help, but exclusions aren't
working well for me. I tried to exclude like this:
$Conf{BackupFilesExclude} =
['Documents and Settings/Administrator/Local Settings/Temporary Internet Files/*'];
And it backed
I'm having trouble getting backups to work with rsync. I have two
hosts using smb that are working (sort of), and two with rsync that are
not. Here's the log file I get (the machine name is 'sol'):
Contents of file /var/backuproot/pc/sol/LOG, modified
2006-11-01 12:21:42
2006-11-01 12:21:42
Yes. I did:
[EMAIL PROTECTED] ~]$ ssh [EMAIL PROTECTED] echo \$USER
root
JH
Les Mikesell wrote:
On Thu, 2006-11-02 at 15:24, Byron Trimble wrote:
I'm using 'rsync' and I have setup all the ssh keys and tested them.
Did you test them running as the backuppc user on the
Hi all,
I've just got backuppc set up for the first time on a Centos 4.4 box
using the provided RPMs, with mod_perl. It was a real challenge because
it seems somehow to be using mod_perl2, whereas Centos only has
1.999xxx. Very confusing. At any rate, it's working with Apache
running as