On Fri, Dec 15, 2006 at 04:21:11PM +0100, Guus Houtzager wrote:
Usually if it's hardware related, the memory is the culprit. Try a
memtest86+ run and see if that shows any errors.
You might want to use the Dell tools to check the memory since the Dell
guys usually want its results before they
On Mon, Dec 18, 2006 at 09:48:13PM +0200, Ashley Shaw wrote:
The first link on the BackupPC site advises that you test as the BackupPC
user on the server to verify if it works, I did as follows:
---
[EMAIL
On Tue, Dec 19, 2006 at 08:08:46AM -0600, Carl Wilhelm Soderstrom wrote:
BackupPC: Host Summary
This status was generated at 19/12 15:04.
Could we have a setting in there to refresh that browser page every x
seconds?
I would suggest that you should just do it by hand whenever you
On Tue, Dec 19, 2006 at 08:56:57AM -0800, dbp lists wrote:
if I use rsync instead of smb, will new files be backed up during
incremental backups?
Yes, of course. rsync is just the transfer method.
I'm not sure about your second question, though. In fact, a full or
incremental backup doesn't
On Thu, Dec 21, 2006 at 03:37:30PM -0500, Troy Davis wrote:
I've been using BackupPC for quite a while, and recently started
running low on disk space. We've got a 2TB NAS device that I'd like
to use as a destination for BackupPC's data. I've got it setup to
mount on boot via CIFS in
On Wed, Dec 27, 2006 at 11:04:06AM +0100, Diaz Rodriguez, Eduardo wrote:
I always move my pools using this super complex procedure:
stop backuppc
cp -a
start backuppc
Be sure that cp is GNU cp or at least handles hardlinks correctly -
otherwise you'll multiply your storage requirements.
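The hardlink caveat matters because BackupPC's pool is built almost entirely of hardlinks. A minimal sketch to check that your cp preserves them before trusting it with a real pool (all paths here are throwaway temp directories, not actual BackupPC paths):

```shell
# Create a tiny tree containing a hardlink, copy it with cp -a, and
# check that the link count survives -- if it doesn't, your cp would
# duplicate every pooled file.
set -e
src=$(mktemp -d)
dst=$(mktemp -d)

echo "pool data" > "$src/file_a"
ln "$src/file_a" "$src/file_b"      # two names, one inode

cp -a "$src/." "$dst/"              # GNU cp -a preserves hardlinks

links=$(stat -c %h "$dst/file_a")   # expect 2 if links were kept
echo "link count in copy: $links"
rm -rf "$src" "$dst"
```

If the reported link count is 1, the copy broke the links, and the destination would need roughly pool size times the number of backups in space.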
Bye,
On Wed, Dec 27, 2006 at 12:42:38PM +0100, Diaz Rodriguez, Eduardo wrote:
Just check that the new file system shows the same used size as the
old one. Is this correct?
Yes, you would see a significant increase in used space on the new file
system if the copying went wrong and didn't
On Tue, Jan 02, 2007 at 04:35:18PM +0100, Alessandro Ferrari wrote:
Is it possible to back up all PCs of an MS domain with smb-backuppc
mode? (NOT an MS workgroup but a domain!?)
There should be no difference between a workgroup and a domain from
BackupPC's point of view - it needs to be able to
On Fri, Jan 05, 2007 at 03:45:43PM +0100, Matthias Bertschy wrote:
For the last 4 weeks, I have been doing everything I could to try to
move our current backuppc pool (Pool is 79.18GB comprising 1132632 files
and 4369 directories) from a RAID0 to a RAID5 having different
filesystem sizes.
BackupPC at Chemnitzer Linux Tage in March.
http://chemnitzer.linux-tage.de/2007/vortraege/detail.html?idx=614
--
www.quantenfeuerwerk.de
www.spiritualdesign-chemnitz.de
www.lebensraum11.de
Tino Schwarze * Parkstraße 17h * 09120 Chemnitz
and db dumps. You might
have to wait for the next run of BackupPC_nightly so that the files are
removed from the pool.
HTH,
Tino.
anybody done something similar before?
Thanks,
Tino.
--
www.craniosacralzentrum.de
www.spiritualdesign-chemnitz.de
www.forteego.de
Tino Schwarze * Lortzingstraße 21 * 09119 Chemnitz
to
tape from time to time without exceeding tape capacity or backup time
constraints.
Hm, while thinking this through, it looks like it's easiest to
schedule host archiving manually...
Bye,
Tino.
need at least
kernel 2.6.13.
. :-|
HTH,
Tino.
On Mon, Mar 31, 2008 at 03:07:20AM -0400, kanti wrote:
Hi Mike, sorry for the late reply. Now when I run the /usr/bin/ssh -q
-x -l root scn-ws9 command I get this:
The authenticity of host 'scn-ws9 (192.168.1.109)' can't be established.
RSA key fingerprint is
On Mon, Mar 31, 2008 at 10:52:04AM +0200, [EMAIL PROTECTED] wrote:
Hi all,
I've got a question about Sendmail.
I'm trying to configure Sendmail in my company but it doesn't work.
The domain name is valeo.com.
If you want to help me, this is my sendmail.cf:
PLEASE can you check if there
On Mon, Mar 31, 2008 at 03:14:34PM +0200, Ronny Aasen wrote:
I would like to know if it's possible and how to configure BackupPC in
order to use a NAS server instead of the path /datas of BackupPC (for
example).
Sure, it's possible to do an NFS mount of a remote disk, and use that
On Mon, Mar 31, 2008 at 02:15:28PM -0700, Kyle Corupe wrote:
I'm seeing this weird error
Fatal error (bad version): OpenSSH_4.3p2, OpenSSL 0.9.8b 04 May 2006
fileListReceive() failed
I'm at a loss for an explanation; I'm sure the private keys are working
because it logs in just fine. I have
On Mon, Mar 31, 2008 at 02:34:27PM -0700, Kyle Corupe wrote:
oh I put in the -vv for debugging, it still does it.
What does ssh -q -x -l backuppc goldduck -p 22332 /bin/true output?
It should print absolutely nothing.
Does rsync -e ssh [EMAIL PROTECTED]:/tmp/somefile . work (you might
need a
On Mon, Mar 31, 2008 at 12:57:41PM -0500, Carl Wilhelm Soderstrom wrote:
On 03/31 09:29, Vince wrote:
I tried setting EMailNotifyMinDays, EMailNotifyOldBackupDays,
EMailNotifyOldOutlookDays all to 0 or -1 to disable it but it has not
worked. I also could not find anything about this in
On Tue, Apr 01, 2008 at 12:21:39PM +, Gilles Guiot wrote:
I'm using backuppc 3.1.0 on a debian distro.
My backup server has two raid 1 arrays
I have been backing up some servers on the first array
I need to backup other servers but not enough space on the first array
/dev/sda1.
I
On Wed, Apr 02, 2008 at 06:46:14AM -0500, Carl Wilhelm Soderstrom wrote:
On 04/02 12:17, Tino Schwarze wrote:
These are all unix domain sockets and should simply be ignored (why
can't tar save these like the device nodes in /dev?)
Because sockets are created on the fly; and should only
On Wed, Apr 02, 2008 at 01:11:36PM -0400, Ryan Manikowski wrote:
What steps are necessary to completely remove a host's data from
BackupPC? Will setting FullKeepCnt to '0' for a host remove the host's
backup data from disk?
Simply rm -rf the pc/$host directory, then wait for the next
On Thu, Apr 03, 2008 at 12:14:33PM -0700, Vince wrote:
I have disabled the backups for a system and still want to keep the
files around for a bit. I followed the instructions in the
documentation and it worked fine but I also want to disable the no
backup for system warning email that is
Hi there,
since I've updated to 3.1.0, I don't get any pool statistics any more:
Other info:
* 0 pending backup requests from last scheduled wakeup,
* 0 pending user backup requests,
* 0 pending command requests,
* Pool is 0.00GB comprising 0 files and 4369 directories (as of 4/7
Hi Ashley,
On Mon, Apr 07, 2008 at 11:32:49AM -0700, Ashley Paul James wrote:
I am running backupPC on a ESX VM using an attached (via switch) LaCie
NAS drive mounted using SMB. Being a low cost drive this does not
support NFS or Hardlinks or much else really.
Uh-oh. Poor-man's SAN?
On Mon, Apr 07, 2008 at 01:43:02PM -0700, Ashley Paul James wrote:
Thanks for all the answers there, they all worked. I wasn't getting my
hopes up with BackupPC actually running after the changes, and sure
enough it didn't.
I can now see all the hosts via the GUI. However, when I ran
Hi Ashley,
On Mon, Apr 07, 2008 at 02:06:21PM -0700, Ashley Paul James wrote:
Thanks for the quick reply.
.oO(I've been awake for too long already... last mail for today.)
While I was waiting, I played around with
the server a little and saw two mount points for the NAS, so I stopped
Hi,
On Tue, Apr 08, 2008 at 08:50:18AM +0200, [EMAIL PROTECTED] wrote:
I would like to know if it's possible to use BackupPC over SSH with
public/private key with a passphrase (and add this passphrase to the line
command of BackupPC) ?
You gain absolutely nothing by assigning a passphrase to
On Tue, Apr 08, 2008 at 02:43:14PM +0200, Pedro Cambra wrote:
Thanks again Tino, I've stopped the backup.
If I type this command (with --C)
/usr/bin/ssh -q -x -n -l pcambra 192.168.1.37 env LC_ALL=C sudo
/bin/tar -c -v -f - -C /home/pcambra --totals
I get this error:
/bin/tar:
On Tue, Apr 08, 2008 at 02:05:46PM +0200, Pedro Cambra wrote:
Running: /usr/bin/ssh -q -x -n -l pcambra 192.168.1.37 LC_ALL=C sudo
/bin/tar -c -v -f --C /home/pcambra --totals .
This --C switch looks suspicious to me. It should be just -C.
HTH!
Tino.
--
"There is no way to peace. The
On Wed, Apr 09, 2008 at 04:32:13PM +0200, Ludovic Drolez wrote:
I've seen in some commercial backup systems with included
deduplication (which often run under Linux :-), that files are split in
128k or 256k chunks prior to deduplication.
It's nice to improve the deduplication ratio for big
Hi there,
I've got to write about this issue again. Since my upgrade to BackupPC
3.1.0, the BackupPC_nightly job doesn't seem to do anything. The log
looks like this:
2008-04-09 06:00:02 Next wakeup is 2008-04-09 22:00:00
2008-04-09 06:00:11 BackupPC_nightly now running BackupPC_sendEmail
On Thu, Apr 10, 2008 at 10:48:33AM -0400, Rob Owens wrote:
I know I'm coming into this thread way late, but I think this could be
fairly easily managed using a master/slave setup (similar to what MythTV
offers). BackupPC could default to being a master, but you could
optionally set it as
On Fri, Apr 11, 2008 at 11:35:27AM -0400, Rob Morin wrote:
Thanks for that link, but I wanted to know whether just BackupPC
performing the backup of MySQL files can cause an issue with the files
while the server is running, i.e. corrupting .myd files or such
On Fri, Apr 11, 2008 at 04:16:58PM -0500, Les Mikesell wrote:
quote_03:
2008-04-11 01:00:36 Pool nightly clean removed 0 files of size 0.00GB
2008-04-11 01:00:36 Pool is 0.00GB, 0 files (0 repeated, 0 max chain, 0
max links), 1 directories
2008-04-11 01:00:36 Cpool nightly clean
On Sun, Apr 13, 2008 at 11:16:07AM +0200, Bernhard Ott wrote:
PS: I've just started adding debug log messages to BackupPC_nightly. Is
there an easy way to make it run other than changing
$Conf{WakeupSchedule} and waiting an hour?
run as backuppc:
/path/to/bin/BackupPC_serverMesg
On Sun, Apr 13, 2008 at 11:28:37AM +0200, Bernhard Ott wrote:
Daniel Denson wrote:
did you change the $TopDIR entry in config.pl? I have found that the
nope
mechanism for reporting disk usage requires that $TopDIR be
/var/lib/backuppc. its best to mount your target disk onto that
On Sun, Apr 13, 2008 at 12:08:17PM +0200, Tino Schwarze wrote:
BTW, there are certain similarities in our setup:
* 64bit linux
* file system xfs/LVM/RAID
More info from my setup:
* Upgraded from 3.0.0 to 3.1.0 via configure.pl
* Perl version 5.8.8
* IO::Dirent 0.04 (but it's been
On Sun, Apr 13, 2008 at 09:41:29PM +0200, Bernhard Ott wrote:
Wow, I'm deeply impressed. I applied your patch and the main problem
seems to be solved! I just found an opendir error in the logs:
Heh, I'm impressed myself - I'm no Perl hacker. B-)
BTW: Is anybody of the developers listening and
On Mon, Apr 14, 2008 at 12:55:22PM -0400, Alexandre Joly wrote:
Has anyone ever managed to add a functionality to archive in zip format
additionally with encryption?
Maybe a slight modification of the BackupPC_archiveHost would be
necessary or is it too complex?
Zip encryption is useless.
On Mon, Apr 14, 2008 at 10:09:57AM +0200, Ludovic Drolez wrote:
How long are you willing to have your backups and restores take? If
you do more processing on the backed up files, you'll take a greater
Not true :
- working with fixed size chunks may improve speed, because algorithms
On Mon, Apr 14, 2008 at 11:21:02AM -0400, Raman Gupta wrote:
I have three hosts configured to backup to my PC. Here are the speeds
from the host summary:
host 1: 24.77 GB, 14,000 files, 18.78 MB/s (slower WAN link)
host 2: 1.27 GB, 4,000 files, 1.89 MB/s (faster WAN link)
host
On Sun, Apr 13, 2008 at 03:08:12PM +0200, Mario Giammarco wrote:
since backuppc is very handy I would like to use it to keep an accurate
history (like cdp or cvs) of each machine day by day.
So I would like to keep 365 day x 10 years backups.
I do not understand if it is possible, nor how
On Tue, Apr 15, 2008 at 10:18:50AM -0400, Alexandre Joly wrote:
Well, I can see WinZip 11 has support for 128- and 256-bit AES
encryption. PKZIP, on the other hand, is weak and can be cracked within
minutes.
My initial question was, anyone ever modified BackupPC in order to
archive in
On Tue, Apr 15, 2008 at 08:26:42AM -0600, Daniel Denson wrote:
I think that this has been hashed over for some time now. It seems that
windows will only handle a 2GB zip file via the CGI interface for no
apparent reason. You can either do BackupPC_zipCreate on the
backuppc machine
On Wed, Apr 16, 2008 at 08:58:08AM +0100, IMGLE - BackupPC wrote:
I have BackupPC setup and working with Windows and Linux Hosts. I now
want to get the backups offsite and want to setup an archive host. I
want this to be on the same server as the BackupPC server, is this
possible?
On Wed, Apr 16, 2008 at 02:20:36PM +0200, Ludovic Drolez wrote:
Introducing file chunking would introduce a new abstraction layer - a
file would need to be split into chunks and recreated for restore. You
Tino -- thanks for posting this. These issues are exactly what I had
in mind
On Wed, Apr 16, 2008 at 02:52:34PM +0200, Ludovic Drolez wrote:
And what about a mix of the two ?
- keep hard links for files less than the chunk size (filenames begin
with an 'f' as before)
- for files bigger than the chunk size, create a regular file which
contains references to
On Thu, Apr 17, 2008 at 11:53:15AM +0100, Joseph Holland wrote:
Ok, we have BackupPC version 3 installed on a few servers throughout our
company. Recently one of the servers went down. We have been using
rsync to back up the BackupPC data directory onto a USB drive. I'm
just wondering is
On Thu, Apr 17, 2008 at 03:10:13PM -0400, Tony Schreiner wrote:
On one of my BackupPC setups, I back up a lot of data. On occasion
things run for more than 24 hours and I start getting
Botch on admin job for admin : already in use!!
messages in the log file. I'm guessing that this
On Thu, Apr 17, 2008 at 03:41:53PM -0400, Tony Schreiner wrote:
On one of my BackupPC setups, I back up a lot of data. On occasion
things run for more than 24 hours and I start getting
Botch on admin job for admin : already in use!!
messages in the log file. I'm guessing that this
Hi Joseph,
(CC'ing back to the list, BTW: Please don't send HTML mails.)
On Thu, Apr 17, 2008 at 12:13:39PM +0100, Joseph Holland wrote:
We backed up the following:
/etc/BackupPC
/usr/local/share/BackupPC (install directory, default is
/usr/local/BackupPC)
/var/data/BackupPC (data
On Fri, Apr 18, 2008 at 11:32:47AM -0400, Tony Schreiner wrote:
I have asked the users to use certain directories for temporary
files, which I don't backup; but users are users as you may know.
I've educated my users the hard way. We've got home directories with
quotas set up. Then we've got
On Sun, Apr 20, 2008 at 03:48:06PM +0200, Felix Andersen wrote:
So for various reasons I am copying my backuppc stuff from the
directory /mnt/oldb/var/lib/backuppc to /mnt/newb/backuppc. The stuff
will later be mounted in the right places and so on.
I ran sudo cp
On Mon, Apr 21, 2008 at 07:30:54AM -0700, David Koski wrote:
I've run into a frustrating problem that affects only one of my backup
clients. This particular client is configured as though it is two
hosts to backuppc, such that part of its contents go to a compressed
pool and the rest
On Mon, Apr 21, 2008 at 08:01:19PM +0200, shacky wrote:
I'm installing BackupPC 3.1.0 on a SuSE 10.0 system.
After I installed it I copied the init.d/suse-backuppc script from the
BackupPC sources directory to /etc/init.d/backuppc and I chmodded it
to 755.
Now if I start it with
On Mon, Apr 21, 2008 at 08:17:46PM +0200, shacky wrote:
What happens if you try su - backuppc?
Nothing special.
I get the normal shell for the backuppc user.
Could you paste me the output of the following commands on your working
system, please?
BTW: It's an openSUSE 10.2 (X86-64)
ls
On Mon, Apr 21, 2008 at 08:53:27PM -0600, dan wrote:
just for fun, try to suid the backuppc init script.
SUID doesn't work for shell scripts for security reasons (I don't
remember the exact reason, though).
Tino.
--
„What we resist, persists.” (Zen saying)
www.craniosacralzentrum.de
On Tue, Apr 22, 2008 at 12:22:44AM +0200, shacky wrote:
Everything in bin/ is owned by user backuppc and has rights -r-xr-xr-x.
Yes, I have the same permissions.
The strange thing is that I get the same error even if I try to run
BackupPC directly as backuppc user:
On Tue, Apr 22, 2008 at 12:10:27PM +0200, shacky wrote:
What's that suidperl you talked about? Why did you ask me for it?
What is the first line of your bin/BackupPC?
Sorry, I was wrong.
The first line of my bin/BackupPC is
#!/usr/bin/perl
Is that suid?
Try the following perl
On Tue, Apr 22, 2008 at 12:22:05PM +0200, Tino Schwarze wrote:
On Tue, Apr 22, 2008 at 12:10:27PM +0200, shacky wrote:
What's that suidperl you talked about? Why did you ask me for it?
What is the first line of your bin/BackupPC?
Sorry, I was wrong.
The first line of my bin
On Tue, Apr 22, 2008 at 05:07:52PM +0200, shacky wrote:
- If you're using SMB or CIFS I don't think it will support the hardlinks
that BackupPC requires for data pooling.
- It's going to be slow. Just buy a big cheap local disk.
And connecting the external hard drive to a USB 2.0 port
On Wed, Apr 23, 2008 at 11:21:39PM +0200, Felix Andersen wrote:
I've just started using backuppc. Great job people! However I am having
some problems with charsets. My backuppc system deals with filenames
saved with Swedish characters like Å Ä and Ö. When looking at the file
names in the
Hi Felix,
(CC'ing back to the list)
On Thu, Apr 24, 2008 at 11:32:38PM +0200, Felix Andersen wrote:
AH! That actually seems to be the case. Strange... So I should be safe
if I set that option in the apache2 configuration?
Yes, that should do it. Your file system uses UTF-8 as well as your
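For reference, the Apache option being discussed is most likely `AddDefaultCharset` (an assumption on my part, since the thread is truncated here); with a UTF-8 filesystem, the relevant apache2 config fragment would be:

```apache
# Tell browsers to interpret CGI output (including BackupPC's file
# listings) as UTF-8 so characters like Å Ä Ö display correctly.
AddDefaultCharset UTF-8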
On Sat, Apr 26, 2008 at 03:59:51PM +0200, Guido Schmidt wrote:
I logged in manually and managed to sudo rsync. As long as I left out
the options --server and --sender in /etc/sudoers I was able to do some
rsyncs locally on that client, but with these options enabled rsync
seems to just
On Tue, Apr 29, 2008 at 12:11:19AM +0200, [EMAIL PROTECTED] wrote:
since I'm new to this list a short hello to everyone. I'm working for a
small NGO and became something like the backup guy, mainly because I
know more than my co-workers :-/
We have an ubuntu server, mainly for filesharing
On Mon, Apr 28, 2008 at 09:01:05PM -0600, dan wrote:
I am using a product called hamachi that builds a sort of VPN. There is a
version for windows and for linux. the free version allows something like
10 users per private network. they all get an IP address on this network
and you can run
Hi Mauro,
On Tue, Apr 29, 2008 at 08:11:39PM +0200, Mauro Condarelli wrote:
this question should be beaten to death by now, but I'm unable to find a
solution (shame on me!).
I have an ubuntu/linux server with attached storage where I installed
BackupPC via apt.
I used the default
On Wed, Apr 30, 2008 at 11:02:25PM +0800, [EMAIL PROTECTED] wrote:
I have set up BackupPC on Kubuntu as a guest OS sitting on top of
Windows XP using vmWare Server. Thus, my full set up is:
- Windows XP Prof
- vmWare Server
- Kubuntu 8.04 Hardy Heron (as a guest OS in vmWare)
- virtual disk
On Fri, May 02, 2008 at 04:40:04PM -0500, Chris Baker wrote:
I'm considering a new backup server and am wondering what hardware I need
for it. I would like to get one with two 100 Mb network cards so they can be
teamed. (Gb cards are useless as all our switches are 100 Mb.) I'm also
looking
On Fri, May 02, 2008 at 07:46:54AM +1000, Adam Goryachev wrote:
I'm currently using BackupPC to back up a few of my servers. I'm only
having a problem with one of the servers getting backed up. There's an
rsync process that's been running flat out for a long time. It seems
that BackupPC is
Hi dan,
On Sun, May 04, 2008 at 12:14:41AM -0600, dan wrote:
Thanks for the work. This clears things up.
I did a test on ubuntu 8.04
created a samba share at /root/share
I installed smbfs and then mounted that share via
mount -t smbfs //localhost/share /mnt/sharetest
cd
On Mon, May 05, 2008 at 09:12:50AM -0400, Leandro Tracchia wrote:
well it turns out the terastation does not even support NFS, however it does
support Windows shares. I do have another NAS, a ReadyNAS which does support
NFS so I decided to try this one out and started a full backup job to let
On Mon, May 05, 2008 at 01:00:19PM -0500, Les Mikesell wrote:
(NB: Les and I have gone back and forth on this issue before, and I don't
sense any hostility involved; we just have different experiences and
requirements, so we disagree firmly but politely).
BTW: This statement made me laugh.
On Mon, May 05, 2008 at 01:31:10PM -0500, Les Mikesell wrote:
But I also had an SW RAID with some mysterious missing member.
Supposedly, I did something wrong, but after all, I couldn't use mdadm
for monitoring any more because it always complained although there was
no problem.
A
On Mon, May 05, 2008 at 10:48:03AM -0500, Les Mikesell wrote:
Yes, I agree with you. In my case it's not really necessary but I think
it's a good idea.
Will it be in BackupPC 3.1.1 ?
It can't be done when you use basic http authentication. There is no
way to tell a browser to forget
On Mon, May 05, 2008 at 03:11:46PM -0400, Leandro Tracchia wrote:
It's been set at 72000 (20 hrs). Is that really too low?
Depends on your data set. (But it shouldn't have stopped after 5 hours
then.) I've got a rather slow server with about 3 million files. It takes
about 30 hours to back it
Hi Leandro,
On Tue, May 06, 2008 at 08:27:20AM -0400, Leandro Tracchia wrote:
Thanks for your reply. Like I said before, after the backup crashed it
saved the partial dump and continued on the next wakeup schedule. That
third time around the backup finished successfully because the next 2
On Fri, May 02, 2008 at 05:08:42PM +1000, Nick Triantafillou wrote:
The hostname of one of my clients machines has changed, is there a
simple way to rename this in backuppc?
Changing the hosts file is trivial; I'm wondering if I can just rename
the pc/hostname folder to the new host and
Hi Chris,
On Tue, May 06, 2008 at 02:55:27PM -0400, [EMAIL PROTECTED] wrote:
Thanks, but what about weekly full backups? Would that swamp our remote
512mbps connections?
No. The difference between full and incremental backup (using rsync) is
just that with a full backup, some (configurable)
On Wed, May 07, 2008 at 08:36:29AM -0400, Lee A. Connell wrote:
I have a lot of deployed backuppc environments, and I seem to get these
errors quite a bit on large files. How can I get more information out
of this, like is the file still open, lost network connection, etc... I
keep getting
On Wed, May 07, 2008 at 11:11:12PM +0200, Sam Przyswa wrote:
We have to change our BackupPC server to a new machine; how do we copy
the entire BackupPC directory (120Gb) to another machine?
I tried rsync, it crashed after a long, long time; I tried scp but it
doesn't preserve the links and
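One approach that does keep the hardlinks (a sketch, not necessarily what was recommended later in the thread) is a tar pipe, since GNU tar records hardlinks in the archive. A local demonstration with throwaway directories:

```shell
# Build a tree containing a hardlink, copy it through a tar pipe, and
# confirm the link count survives on the destination side.
set -e
old=$(mktemp -d)
new=$(mktemp -d)

echo "data" > "$old/f1"
ln "$old/f1" "$old/f2"              # simulate pool <-> pc/ hardlinks

# Local form; across machines you would splice ssh into the pipe,
# e.g.: tar -C /old -cf - . | ssh newhost 'tar -C /new -xf -'
tar -C "$old" -cf - . | tar -C "$new" -xf -

links=$(stat -c %h "$new/f1")       # expect 2: hardlink preserved
echo "link count after tar copy: $links"
rm -rf "$old" "$new"
```

(rsync with -H can also preserve hardlinks, but it tends to need a lot of memory on trees with millions of links, which may be why it appeared to hang.)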
On Thu, May 08, 2008 at 02:35:50PM +0200, Sam Przyswa wrote:
We have to change our BackupPC server to a new machine; how do we copy
the entire BackupPC directory (120Gb) to another machine?
I tried rsync, it crashed after a long, long time; I tried scp but it
doesn't
On Thu, May 08, 2008 at 09:09:30AM -0400, Leandro Tracchia wrote:
Would I have a problem running backuppc on a 64bit processor with Ubuntu
64bit OS???
IMO the times when 32bit/64bit was an issue are over. The stuff has
been in production for several years now. It is mature.
Bye,
Tino.
On Sun, May 11, 2008 at 01:19:37PM +0200, Sam Przyswa wrote:
It would work like this:
oldserver# dd if=/dev/backuppc-filesystem bs=1M | ssh -c blowfish -C -o
CompressionLevel=9 newserver dd of=/dev/newfilesystem bs=1M
(Of course, the filesystem has to be unmounted.) Later, you
Hi Stuart,
On Thu, May 15, 2008 at 11:18:33AM +0100, Stuart Buckell wrote:
#/etc/init.d/backuppc start (using the debian-backuppc script, it's
configured properly, all paths correct).
Error.
Starting backuppc: 2008-05-15 11:16:21 Can't create a test hardlink between
a file in
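That startup error means BackupPC could not create a hardlink inside its data directory, which usually points at a filesystem (or mount) without hardlink support. A quick manual check, using a temp directory as a stand-in for the real pool path:

```shell
# Try to create a hardlink where the pool lives; failure here is the
# same condition BackupPC's startup test trips over.
set -e
pooldir=$(mktemp -d)                # substitute your real pool mount

echo test > "$pooldir/.hl_src"
if ln "$pooldir/.hl_src" "$pooldir/.hl_dst" 2>/dev/null; then
  result="hardlinks OK"
else
  result="hardlinks NOT supported (common on SMB/FAT mounts)"
fi
echo "$result"
rm -rf "$pooldir"
```

Run it against the directory named in the error message; if it reports failure, the pool needs to live on a filesystem with hardlink support (ext3/ext4, xfs, etc.).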
On Sat, May 17, 2008 at 05:08:32PM +0200, Hendrik Friedel wrote:
The backups I run have a size of 20-30 GB and it takes around 5 hours,
even for an incremental backup (54 Mbit WLAN), which is quite
disappointing in the first place (rsync, client: Centrino 1.7GHz, 1.5
GB RAM, server: Athlon
On Sun, May 18, 2008 at 04:43:42PM +0200, Mester wrote:
Thank you for the fast reply. Can you describe how this command works?
Just look into the file, there is a description at the top (it's a Perl
script).
Example: Create complete tar file of most recent backup of host $host:
On Mon, May 19, 2008 at 09:28:50AM -0400, Leandro Tracchia wrote:
how do I uncompress the xfer log files? I've never uncompressed a .z
file before, and I've tried gunzip and tar with no luck.
Use BackupPC_zcat
Tino.
On Mon, May 19, 2008 at 09:57:03PM +0200, Matthias Meyer wrote:
Hello,
I've renamed my localhost to fileserver.
That means I renamed the directory backuppc/pc/localhost
to backuppc/pc/fileserver and renamed the entry in /etc/backuppc/hosts
too.
Of course I imported the new host key for
Hi,
On Wed, May 21, 2008 at 09:36:44AM +0200, Sebastian Perkins wrote:
We are new to backuppc (migrating from an old DAT tape based system to disk
based backup) and we have a question regarding full and incremental
backups.
Our goal is to maintain 2 months of data off samba shares.
We will be
On Wed, May 21, 2008 at 09:07:16AM -0500, Chris Baker wrote:
Is it possible to backup directly to an eSATA or Firewire drive or drive
array?
BackupPC doesn't care how you attach your storage to your server. It
just uses any filesystem which supports hardlinks.
Tino.
On Wed, May 21, 2008 at 02:38:10PM -0700, James wrote:
I have a laptop client with 200GB of data to back up. The backup
always ends with a:
Got fatal error during xfer (aborted by signal=PIPE)
Backup aborted by user signal
This is not surprising. The abort happens when the user closes
On Tue, May 27, 2008 at 05:04:58PM -0500, Frederick Reeve wrote:
So my question is, is there a command that can be run that will show the
backup properly in the web interface? Both as in progress and make the log
files?
Quoting an older Mail:
BackupPC_serverMesg backup HOSTIP HOST
On Thu, May 29, 2008 at 02:58:14PM +0200, Daniel Dietz wrote:
Hi, I'm planning to install BackupPC but I couldn't figure out if it is
possible to store the backup files on another server, because our main
file server we use for backup is a Windows 2003 R2 server. So my
question: is it possible