Hi Dan,
If you have some hardware lying around, I encourage you to test out ZFS
dedupe. Nexenta Core 3 alpha 2 and OpenSolaris b129 both have it, and it is
very nice.
Hmm, thanks for the report - it's great to hear you were able to get
it working and see some good results.
I've thought about
Hi,
I assume BackupPC_zipCreate reads the files from the numbered dump and writes
them locally into a .zip file. This local .zip file will then be transferred to the
destination.
I can't find out where this local .zip file is located.
Is my assumption wrong?
Does BackupPC_zipCreate write the .zip to
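For what it's worth: as far as I can tell, BackupPC_zipCreate never writes a temporary file at all; it streams the .zip to stdout, so you choose where it lands by redirecting. A sketch (the host name "pc1", backup number 42, and the install path are placeholders; adjust for your setup):

```shell
# Sketch only: host, backup number, share, and install path are assumptions.
/usr/share/backuppc/bin/BackupPC_zipCreate -h pc1 -n 42 -s / . > /tmp/pc1-42.zip
```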
Claude Gélinas wrote:
I'm trying to set up the backup of the localhost with BackupPC. I already
back up several other Linux machines via ssh. I've set them all up by
running the following commands as the backuppc user:
ssh-keygen -t dsa
cd .ssh
ssh-copy-id -i id_dsa.pub
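To spell out the whole sequence (the target host below is a placeholder; also note that recent OpenSSH releases have dropped DSA, so rsa or ed25519 may be needed in place of the dsa from the original commands):

```shell
# Run as the backuppc user. A scratch directory is used here so the sketch
# is safe to try; in practice the key would live under ~backuppc/.ssh/.
keydir=$(mktemp -d)
ssh-keygen -q -t rsa -N '' -f "$keydir/id_rsa"   # no passphrase, for unattended backups
cat "$keydir/id_rsa.pub"                         # this is what ssh-copy-id installs remotely
# ssh-copy-id -i "$keydir/id_rsa.pub" root@client.example.com   # placeholder host
```

After that, `ssh -l root client.example.com whoami` should answer without a password prompt.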
What about $Conf{TarIncrArgs}?
That is more interesting (because this is where the '+' might be missing). To
put it more generally: if you want to avoid this debugging ping-pong, provide
some relevant information (like your configuration settings and log file
extracts, for example). Even if we
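For reference, the stock value shipped in config.pl looks like the following; if yours differs, that's the first place to compare (the exact default may vary between BackupPC versions):

```perl
# Default $Conf{TarIncrArgs}: the trailing '+' on a variable tells BackupPC
# to shell-escape the substituted value; a missing '+' here is a classic
# source of tar transfer problems.
$Conf{TarIncrArgs} = '--newer=$incrDate+ $fileList+';
```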
Mester wrote:
What about $Conf{TarIncrArgs}?
That is more interesting (because this is where the '+' might be missing). To
put it more generally: if you want to avoid this debugging ping-pong, provide
some relevant information (like your configuration settings and log file
extracts, for
Hi,
I've been doing backups with BackupPC for quite a few years, mainly backing up
2 or 3 servers.
I've recently installed BackupPC at the office and am backing up 6 servers
right now, with room to grow. The current backup (Veritas BackupExec) keeps
around a year of data, which is close to
Gerald Brandt wrote:
How many servers and how much data do you back up using BackupPC?
BackupPC: Host Summary
* This status was generated at 12/23 13:27.
* Pool file system was recently at 52% (12/23 13:26), today's max is
53% (12/23 01:00) and yesterday's max was 54%.
Hosts with
Gerald Brandt wrote:
Hi,
I've been doing backups with BackupPC for quite a few years, mainly
backing up 2 or 3 servers.
I've recently installed BackupPC at the office and am backing up 6
servers right now, with room to grow. The current backup (Veritas
BackupExec) keeps around a year
How well does BackupPC do with backing up Macs? All of the Macs are OS 10 or
higher, which is just a variant of BSD. Has anyone done this?
We just bought a small company in another office that could use a BackupPC
there. They have some Macs.
Chris Baker -- cba...@intera.com
systems
I backup well over 3 TB of data from over 30 sources, workstations and
servers. Fortunately, we have gigabit switches now. Most computers come with
gigabit network ports by default. We use an external drive enclosure,
connected on an eSATA port, which we swap out weekly for off-site storage. We also two
Also, if you don't require off-site backup, go with an internal drive. Some
of the external drives out there still have Linux issues. None of the
internal drives have those kinds of issues.
You can now get up to 2 TB in a single drive for under $200. Western
Digital, Seagate, and Hitachi are
We've been running BackupPC for a few years now.
There are 39 hosts that have been backed up, for a total of:
309 full backups of total size 29338.60GB (prior to pooling and compression),
275 incr backups of total size 473.40GB (prior to pooling and compression).
Other info:
0 pending backup
Hi Chris,
The storage will be iSCSI (over gigabit), and offsite backups will be done via
an archive done once a week. Does that sound sane?
Gerald
- Chris Baker cba...@intera.com wrote:
I backup well over 3 TB of data from over 30 sources, workstations and
servers. Fortunately, we
Chris Baker wrote:
Also, if you don't require off-site backup, go with an internal drive.
Some of the external drives out there still have Linux issues. None of
the internal drives have those kinds of issues.
You can now get up to 2 TB in a single drive for under $200. Western
- Les Mikesell lesmikes...@gmail.com wrote:
Chris Baker wrote:
Also, if you don't require off-site backup, go with an internal drive.
Some of the external drives out there still have Linux issues. None of
the internal drives have those kinds of issues.
You can now get up to 2 TB
Gerald Brandt wrote:
Hi Chris,
The storage will be iSCSI (over gigabit), and offsite backups will be
done via an archive done once a week. Does that sound sane?
What kind of archive are you planning? As is frequently discussed here,
it is difficult to copy a large BackupPC system in any
- Les Mikesell lesmikes...@gmail.com wrote:
Gerald Brandt wrote:
Hi Chris,
The storage will be iSCSI (over gigabit), and offsite backups will be
done via an archive done once a week. Does that sound sane?
What kind of archive are you planning? As is frequently discussed here
Chris,
It works great for Macs. I back up over 1500 workstations (about 8 TB of
data total) to 10 BackupPC servers and get about a 50% reduction in
data. The one trick is to use xtar on the Macs to be sure you get the
resource forks.
cheers,
ski
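In case anyone wants to try the same trick: the idea is just to substitute xtar for tar in the client command, something like the following per-host override (the xtar path is a guess; the rest mirrors the stock tar-over-ssh command):

```perl
# Hypothetical per-host config.pl fragment for a Mac client.
# /usr/local/bin/xtar is an assumed install path.
$Conf{XferMethod}   = 'tar';
$Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host'
                    . ' env LC_ALL=C /usr/local/bin/xtar -c -v -f - -C $shareName+'
                    . ' --totals';
```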
On 12/23/2009 10:54 AM, Chris Baker wrote:
I've used it to back up Macs. It works quite well. You need to be aware of
any files that might have resource forks because they won't be backed up
properly with the regular rsync client. Apple's client supports resource
forks, but as far as I know, it would have to go to an OS X server as well.
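For the curious: the Apple-patched rsync shipped with OS X adds an -E flag that carries extended attributes and resource forks, but, as noted, both ends have to understand it. A sketch outside of BackupPC (host and paths are placeholders):

```shell
# Sketch: only works when both sides run Apple's patched rsync.
rsync -aE /Users/ backup@server.example.com:/backups/mac/
```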
Gerald Brandt wrote:
- Les Mikesell lesmikes...@gmail.com wrote:
Gerald Brandt wrote:
Hi Chris,
The storage will be iSCSI (over gigabit), and offsite backups will be
done via an archive done once a week. Does that sound sane?
What kind of archive are you planning?
- Les Mikesell lesmikes...@gmail.com wrote:
Gerald Brandt wrote:
- Les Mikesell lesmikes...@gmail.com wrote:
Gerald Brandt wrote:
Hi Chris,
The storage will be iSCSI (over gigabit), and offsite backups will be
done via an archive done once a week. Does that
Matthias Meyer wrote:
Claude Gélinas wrote:
I'm trying to set up the backup of the localhost with BackupPC. I already
back up several other Linux machines via ssh. I've set them all up by
running the following command as the backuppc user:
ssh-keygen -t dsa
cd .ssh
ssh-copy-id -i
Matthias Meyer wrote:
Hi,
I assume BackupPC_zipCreate reads the files from the numbered dump and writes
them locally into a .zip file. This local .zip file will then be transferred to the
destination.
I can't find out where this local .zip file is located.
Is my assumption wrong?
Does
Gerald Brandt wrote:
The BackupPC archive host setup is (a) somewhat hard to automate, and
(b) gives a complete tar image per host. You not only lose the history
here, but you have to store multiple copies of anything redundant. If
you back up many machines, storing copies of the
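Concretely, what the archive host produces per machine is roughly equivalent to this (the host name and paths are placeholders; -n -1 selects the most recent backup):

```shell
# Sketch: one full tar per host per run, with no pooling across runs.
/usr/share/backuppc/bin/BackupPC_tarCreate -h pc1 -n -1 -s / . \
    | gzip > /offsite/pc1-latest.tar.gz
```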
I find that in my setup the Macs are backed up very slowly compared to the
other Linux systems. It's so bad that there must be something I can
change. I'm using rsync over ssh all around.
Incrementals of MacBook Pros take 4-6 hours, compared to 20 minutes to
1 hour for the Linux systems,
hi,
I am currently backing up 5 Macs using rsync over ssh and not having any
major issues. I only back up /Users and not the complete system.
I find BackupPC very convenient to use compared to EMC Networker that we
used previously.
Mark
On 12/23/2009 1:54 PM, Chris Baker wrote:
How well does
We don't swap the hard drives. We leave the drives in the hard-drive
enclosure and swap out the whole drive enclosure. We have two drive
enclosures. I'm sorry for any confusion.
I power down the system for the swap every time. The whole /home directory
is mapped to the enclosure. I don't think
Chris Baker wrote:
We don't swap the hard drives. We leave the drives in the hard-drive
enclosure and swap out the whole drive enclosure. We have two drive
enclosures. I'm sorry for any confusion.
I power down the system for the swap every time. The whole /home directory
is mapped to the
We backup directly to an external drive unit. We map the /home directory to
it. It works quite well.
Chris Baker -- cba...@intera.com
systems administrator
INTERA -- 512-425-2006
From: Gerald Brandt [mailto:g...@majentis.com]
Sent: Wednesday, December 23, 2009 2:16 PM
To:
I have not used this product. It does look quite nice. It is a unit that
holds four drives. According to their ad, you can also mix and match drives.
It's also one of the few units that acknowledges the existence of Linux.
http://www.drobo.com/products/drobo.php
Chris Baker -- cba...@intera.com
I do not know what resource forks are. We do not have a Mac server and ultimately
do not want one.
Chris Baker -- cba...@intera.com
systems administrator
INTERA -- 512-425-2006
From: Michael Barrow [mailto:mich...@michaelbarrow.name]
Sent: Wednesday, December 23, 2009 1:40 PM
To:
Chris Baker wrote:
I have not used this product. It does look quite nice. It is a unit that
holds four drives. According to their ad, you can also mix and match drives.
It's also one of the few units that acknowledges the existence of Linux.
http://www.drobo.com/products/drobo.php
Chris,
Resource forks are where the Mac OS stores key information about a file.
For example, if a Word file does not have an extension such as .doc,
then the Mac looks in the resource fork to figure out that it is a Word
file. For older Mac programs, other items were stored in the resource
On Wednesday, December 23, 2009 at 15:20:38, Chris Robertson wrote:
Matthias Meyer wrote:
Claude Gélinas wrote:
I'm trying to set up the backup of the localhost with BackupPC. I already
back up several other Linux machines via ssh. I've set them all up by
running the following command as
Claude Gélinas wrote:
On Wednesday, December 23, 2009 at 15:20:38, Chris Robertson wrote:
Matthias Meyer wrote:
Claude Gélinas wrote:
I'm trying to set up the backup of the localhost with BackupPC. I already
back up several other Linux machines via ssh. I've set them all up by
running the
Les Mikesell wrote:
No, it should be the same. Look in the root/.ssh/authorized_keys file to see if
the ssh-copy-id command put the right thing there. And make sure the file and
directories above have the right owner/permissions. I've
On Wednesday, December 23, 2009 at 21:33:50, Adam Goryachev wrote:
Les Mikesell wrote:
No, it should be the same. Look in the root/.ssh/authorized_keys file to
see if the ssh-copy-id command put the right thing there. And make sure
the file and directories above have the right
On 12/23/2009 10:06 PM, Claude Gélinas wrote:
On Wednesday, December 23, 2009 at 21:33:50, Adam Goryachev wrote:
Les Mikesell wrote:
No, it should be the same. Look in the root/.ssh/authorized_keys file to
see if the ssh-copy-id command put the right thing there. And make sure
the
Hi,
I tried to set up DeltaCopy for a client, where the Windows machines have
DeltaCopy installed, and I configured the server on each of them with
different shares (read only) with username/password, thinking that I can use
rsyncd as the method in BackupPC. Now when I try to connect from the