Re: [BackupPC-users] Backing up BackupPC

2016-07-17 Thread Alessandro Polverini
That was one of my goals, but I discovered that the btrfs volume was 
unable to handle the millions of hard links needed by backuppc...

Too bad.

On 17/07/2016 20:14, Falko Trojahn wrote:
> Am 14.07.2016 um 14:10 schrieb Carl Wilhelm Soderstrom:
>> On 07/14 12:04 , Adam Goryachev wrote:
>>> You could also use dd, assuming that you have some method to ensure a
>>> consistent state throughout the tape backup period.
>>>
>>> 1) Store the backuppc volume on LVM, take a snapshot and stream the
>>> snapshot to tape
>> Tried doing this years ago. In that instance it caused a 4x performance hit.
>> It was better to just stop BackupPC, take a tarball of the BackupPC data
>> pool, then restart BackupPC. 12 hours of interruption in backups was better
>> than taking all weekend to write (hitchingly) to tape.
>>
> Has anybody tried having backuppc pool on btrfs subvolume, using
> btrfs snapshot and send/receive feature to copy to other server/disk?
>




Re: [BackupPC-users] Backing up BackupPC

2016-07-17 Thread Falko Trojahn
Am 14.07.2016 um 14:10 schrieb Carl Wilhelm Soderstrom:
> On 07/14 12:04 , Adam Goryachev wrote:
>> You could also use dd, assuming that you have some method to ensure a
>> consistent state throughout the tape backup period.
>>
>> 1) Store the backuppc volume on LVM, take a snapshot and stream the
>> snapshot to tape
>
> Tried doing this years ago. In that instance it caused a 4x performance hit.
> It was better to just stop BackupPC, take a tarball of the BackupPC data
> pool, then restart BackupPC. 12 hours of interruption in backups was better
> than taking all weekend to write (hitchingly) to tape.
>

Has anybody tried having backuppc pool on btrfs subvolume, using
btrfs snapshot and send/receive feature to copy to other server/disk?
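
For reference, the workflow being asked about looks roughly like the
sketch below (pool path, snapshot name and destination host are
assumptions; btrfs send requires a read-only snapshot):

  # take a read-only snapshot of the pool subvolume
  btrfs subvolume snapshot -r /var/lib/backuppc /var/lib/backuppc-snap
  # stream it to a btrfs filesystem on another server
  btrfs send /var/lib/backuppc-snap | ssh otherhost 'btrfs receive /mnt/pool-copy'
  # later runs can send only the differences with -p <previous-snapshot>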




Re: [BackupPC-users] Backing up BackupPC

2016-07-14 Thread Carl Wilhelm Soderstrom
On 07/14 12:04 , Adam Goryachev wrote:
> You could also use dd, assuming that you have some method to ensure a 
> consistent state throughout the tape backup period.
> 
> 1) Store the backuppc volume on LVM, take a snapshot and stream the 
> snapshot to tape

Tried doing this years ago. In that instance it caused a 4x performance hit.
It was better to just stop BackupPC, take a tarball of the BackupPC data
pool, then restart BackupPC. 12 hours of interruption in backups was better
than taking all weekend to write (hitchingly) to tape.

If you're writing to disk elsewhere, and your hardware differs from what I
had back then, the math may be different for you.
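
For the record, the stop/tar/restart approach amounts to something like
this sketch (service name, pool path and tape device are assumptions for
your own setup):

  systemctl stop backuppc          # or: service backuppc stop
  tar -cvf /dev/nst0 -C /var/lib/backuppc .
  systemctl start backuppc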

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Backing up BackupPC

2016-07-13 Thread Adam Goryachev
On 13/07/16 08:23, Falko Trojahn wrote:
> bpb21 wrote on 11.07.2016 at 20:28:
>> I've got BackupPC running on a CentOS server (working fine with Windows 10 
>> PCs, by the way!).  I'd like to back up, on occasion, the data BackupPC 
>> stores.
>>
>> For other servers, network shares, and CCTV footage, I use LTO-5 Ultrium 
>> tapes and just use tar on the CentOS server connected to the tape drive; 
>> nothing proprietary going on.
>>
>> But, where does BackupPC store its data?  (I could probably figure that one 
>> out pretty easily.)  More of a question is, how would backing up the pooled 
>> data to an external source work out?  I have approx. 3.8 TB of data before 
>> pooling and compression, approx 1 TB of data after pooling and compression.
>>
>> So, I'd need to plan on 3.8 TB of external storage were I to back up 
>> BackupPC's data, correct?
>>
>> If I just used the regular tar commands to back up the data directory for 
>> BackupPC, would it be able to preserve the user permissions?  As in, could I 
>> still tell what came from what PC if I just copied the data directory?
>>
>> (I'm probably making this more complex than it is...)
>>
> you have several possibilities:
>
> - use the archive host feature, so the last backup of each host can be
> saved to tape or to a destination directory, e.g. on an external USB
> drive, for offline storage
>
> - use rsync to sync the whole pool (usually /var/lib/backuppc) to another
> hard disk; with the right parameters it also works over ssh.
> We sync about 2 TB in one and a half days over 1 Gb Ethernet to a
> remote location; if possible, do the initial sync locally.
> The backuppc service must be shut down during the sync, or at least
> not running backups, for consistency.
>
> - instead of rsync, if the backuppc pool is on btrfs, one could use the
> btrfs send/receive feature, or e.g. btrbk. I didn't try that out, though.

You could also use dd, assuming that you have some method to ensure a 
consistent state throughout the tape backup period.

1) Store the backuppc volume on LVM, take a snapshot and stream the 
snapshot to tape
2) Unmount the backuppc volume, stream to tape, remount the volume
etc...

Although I'm not sure that really is a smart idea - I've never tried it - 
I just thought I'd suggest it; someone else with more experience of 
tapes might be able to comment.

PS, dd will consume the space on tape equal to the filesystem capacity
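
A minimal sketch of option 1, assuming the pool sits on an LVM logical
volume (volume group, LV name, snapshot size and tape device are all
assumptions):

  lvcreate --snapshot --size 20G --name backuppc-snap /dev/vg0/backuppc
  dd if=/dev/vg0/backuppc-snap of=/dev/nst0 bs=1M
  lvremove -f /dev/vg0/backuppc-snap

As the PS notes, dd writes the whole block device to tape whether the
space is used or not, and the snapshot's copy-on-write overhead is one
likely cause of the slowdown reported elsewhere in this thread.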

Regards,
Adam
-- 
Adam Goryachev Website Managers www.websitemanagers.com.au



[BackupPC-users] Backing up BackupPC

2016-07-13 Thread bpb21
Thanks for the reply - very informative!  After looking it all over, I'm 
firmly in the keep-it-simple camp.  Since I'm running BackupPC on a pretty 
low-spec server, and I've got room for one more, I think option 1 you presented 
is my best bet: 1) Run two BackupPC servers and have both back up the hosts 
directly.

With this method I could continue to use low spec (read: cheap) servers and 
alternate the backup days between the servers, giving time for more thorough 
virus scanning of the backed up files.

Or maybe I could just spin up a virtual machine and back up the vm file like I 
do with the other virtual machines.

Either way, good point on the hard links.  Thanks for the info; that does 
answer my question!



Re: [BackupPC-users] Backing up BackupPC

2016-07-13 Thread Falko Trojahn
bpb21 wrote on 11.07.2016 at 20:28:
> I've got BackupPC running on a CentOS server (working fine with Windows 10 
> PCs, by the way!).  I'd like to back up, on occasion, the data BackupPC 
> stores.
>
> For other servers, network shares, and CCTV footage, I use LTO-5 Ultrium 
> tapes and just use tar on the CentOS server connected to the tape drive; 
> nothing proprietary going on.
>
> But, where does BackupPC store its data?  (I could probably figure that one 
> out pretty easily.)  More of a question is, how would backing up the pooled 
> data to an external source work out?  I have approx. 3.8 TB of data before 
> pooling and compression, approx 1 TB of data after pooling and compression.
>
> So, I'd need to plan on 3.8 TB of external storage were I to back up 
> BackupPC's data, correct?
>
> If I just used the regular tar commands to back up the data directory for 
> BackupPC, would it be able to preserve the user permissions?  As in, could I 
> still tell what came from what PC if I just copied the data directory?
>
> (I'm probably making this more complex than it is...)
>

you have several possibilities:

- use the archive host feature, so the last backup of each host can be
saved to tape or to a destination directory, e.g. on an external USB
drive, for offline storage

- use rsync to sync the whole pool (usually /var/lib/backuppc) to another
hard disk; with the right parameters it also works over ssh
   (a command sketch follows this list).
   We sync about 2 TB in one and a half days over 1 Gb Ethernet to a
   remote location; if possible, do the initial sync locally.
   The backuppc service must be shut down during the sync, or at least
   not running backups, for consistency.

- instead of rsync, if the backuppc pool is on btrfs, one could use the
   btrfs send/receive feature, or e.g. btrbk. I didn't try that out, though.
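
Sketched as commands, the rsync option looks roughly like this (service
name, destination and remote host are assumptions; -H is what preserves
the pool's hard links, and is also what makes the run slow and
memory-hungry on large pools):

  systemctl stop backuppc
  rsync -aH --delete --numeric-ids /var/lib/backuppc/ /mnt/copydisk/backuppc/
  # or, over ssh to a remote machine:
  rsync -aH --delete --numeric-ids -e ssh /var/lib/backuppc/ backup2:/var/lib/backuppc/
  systemctl start backuppc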

HTH
Falko





Re: [BackupPC-users] Backing up BackupPC

2016-07-12 Thread Kris Lou
>
> The problem that you have to deal with is BackupPC's reliance on hard
> links within the pool -- and this is supposedly dealt with in 4.x (?).

To resurrect some old discussions (
https://sourceforge.net/p/backuppc/mailman/message/27105491/):


> > * For most people, rsync does not work to replicate a backup server
> > effectively. Period. I think *no* one would suggest this as a reliable
> > ongoing method of replicating a BackupPC server. Ever.
> >
> > * The best methods for this boil down to two camps:
> > 1) Run two BackupPC servers and have both back up the hosts
> > directly
> > No replication at all: it just works.
> > 2) Use some sort of block-based method of replicating the data
> >
> > * Block-based replication boils down to two methods
> > 1) Use md or dm to create a RAID-1 array and rotate members of
> > this array in and out
> > 2) Use LVM to create snapshots of partitions and dd the partition
> > to a different drive
> > (I guess 3) Stop BackupPC long enough to do a dd of the partition
> > *without* LVM)
> >
> I think there is a 3rd camp:
> 3. Scripts that understand the special structure of the pool and pc
> trees and efficiently create lists of all hard links in pc
> directory.
> a] BackupPC_tarPCCOPY
> Included in standard BackupPC installations. It uses a perl
> script to recurse through the pc directory, calculate (and
> cache if you have enough memory) the file name md5sums and
> then uses that to create a tar-formatted file of the hard
> links that need to be created. This routine has been
> well-tested at least on smaller systems.
> b] BackupPC_copyPcPool
> Perl script that I recently wrote that should be significantly
> faster than [a], particularly on machines with low memory
> and/or slower cpus. This script creates a new temporary
> inode-number indexed pool to allow direct lookup of links and
> avoid having to calculate and check file name md5sums. The
> pool is then rsynced (without hard links -- i.e. no -H flag)
> and then the restore script is run to recreate the hard
> links. I recently used this to successfully copy over a pool of
> almost 1 million files and a pc tree of about 10 million files.
> See the recent archives to retrieve a copy.


Also,
http://backuppc.sourceforge.net/faq/limitations.html#some_tape_backup_systems_aren_t_smart_about_hard_links
:

Some tape backup systems aren't smart about hard links
> If you backup the BackupPC pool to tape you need to make sure that the
> tape backup system is smart about hard links. For example, if you simply
> try to tar the BackupPC pool to tape you will backup a lot more data than
> is necessary.
> Using the example at the start of the installation section, 65 hosts are
> backed up with each full backup averaging 3.2GB. Storing one full backup
> and two incremental backups per laptop is around 240GB of raw data. But
> because of the pooling of identical files, only 87GB is used (with
> compression the total is lower). If you run du or tar on the data
> directory, there will appear to be 240GB of data, plus the size of the pool
> (around 87GB), or 327GB total.
> If your tape backup system is not smart about hard links an alternative is
> to periodically backup just the last successful backup for each host to
> tape. Another alternative is to do a low-level dump of the pool file system
> (ie: /dev/hda1 or similar) using dump(1).
> Supporting more efficient tape backup is an area for further development.


I think that this answers your questions about tape requirements if
directly tar'ing the pool.  You might be better off just scheduling host
archives periodically.
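
A quick way to see the effect the FAQ describes on your own pool (the path
is an assumption): GNU du counts each hard-linked file once by default,
while -l (--count-links) counts every link, which approximates what a
link-unaware tape backup would write.

  du -sh  /var/lib/backuppc    # real on-disk usage, links counted once
  du -shl /var/lib/backuppc    # apparent size if every hard link were copied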

I don't know if Jeffrey Kosowsky still monitors the list, but somebody
might have a copy of his scripts (3b, above).  Unfortunately, these were
part of the original BackupPC Wiki, which is no longer available.


Kris Lou
k...@themusiclink.net


[BackupPC-users] Backing up BackupPC

2016-07-11 Thread bpb21
I've got BackupPC running on a CentOS server (working fine with Windows 10 PCs, 
by the way!).  I'd like to back up, on occasion, the data BackupPC stores.

For other servers, network shares, and CCTV footage, I use LTO-5 Ultrium tapes 
and just use tar on the CentOS server connected to the tape drive; nothing 
proprietary going on.

But, where does BackupPC store its data?  (I could probably figure that one 
out pretty easily.)  More of a question is, how would backing up the pooled 
data to an external source work out?  I have approx. 3.8 TB of data before 
pooling and compression, approx 1 TB of data after pooling and compression.

So, I'd need to plan on 3.8 TB of external storage were I to back up BackupPC's 
data, correct?

If I just used the regular tar commands to back up the data directory for 
BackupPC, would it be able to preserve the user permissions?  As in, could I 
still tell what came from what PC if I just copied the data directory?

(I'm probably making this more complex than it is...)



[BackupPC-users] Backing up BackupPC pool For Archival

2015-01-15 Thread Moeen Chowdhary
Hi,

I have been using this solution for a while (6 months), mainly to back up
Windows file servers and PCs via SMB.  That's working more or less like
it should.

Now I would like to determine the best method of backing up the backups
themselves for long-term archival.  I realize that I can just configure
BackupPC to keep more full backups for a longer period of time, but that's
not ideal, especially since the archives are not changing and only rarely
need to be restored, and I also need a way to store them on removable
media offsite.

Does anyone have any suggestions on the best method to accomplish this?

I would assume that we could just copy the /var/lib/backuppc/ folder and
all of its contents to an external drive, but then what would the restore
method be?  To restore, would I copy the files back to the live backup
server, or to an alternate server?  Is there another means of permanent
archival built into BackupPC itself?

Thanks,

Moeen.



Re: [BackupPC-users] Backing up BackupPC pool For Archival

2015-01-15 Thread Les Mikesell
On Thu, Jan 15, 2015 at 4:09 PM, Moeen Chowdhary
mchowdh...@wellplan.com wrote:

 Now I would like to determine what’s the best method of backing up the
 backups themselves for long-term archival.  I realize that I can just
 configure backuppc to keep more full backups for a longer period of time,
 however that’s not ideal especially since the archives are not changing and
 only rarely need to be restored, and also I need a way to store it on
 removable media offsite.

Note that additional backups of things that don't change consume very
little additional space so you might as well keep them as long as you
think you might need to restore something.

 I would assume that we could just copy the /var/lib/backuppc/ folder and all
 of its contents to an external drive, but then what would the restore method
 be?  To restore, would I copy the files back to the live backup server? Or
 an alternate server?   Is there another means of permanent archival method
 that’s built into backuppc itself?

Look at the 'archive host' concept:
http://backuppc.sourceforge.net/faq/BackupPC.html#Configuring-an-Archive-Host
This lets you pre-configure a directory and some options to create tar
archives that may be compressed/split, and you can subsequently start
runs from the web interface.   Or, you can use the BackupPC_tarCreate
command line tool to generate the archive and do whatever you want
with the output.  For one-off runs you can just do a restore from a
browser, downloading as a tar or zip onto the removable media,
although some browsers may download to some other tmp location first,
requiring a lot of space there.
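
As a rough illustration of the command-line route (install path, host
name, share name and tape device are assumptions; the command must run as
the backuppc user):

  # write the most recent backup (-n -1) of one share of 'somehost' to tape
  sudo -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate \
      -h somehost -n -1 -s /home . > /dev/nst0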

-- 
   Les Mikesell
 lesmikes...@gmail.com



Re: [BackupPC-users] Backing-up BackupPC ?

2010-09-14 Thread Timothy Murphy
Les Mikesell wrote:

 When you are using the web interface to browse the backups you have the
 option to select a directory or collection of files and download it as a
 tar archive. You can do that from the same or a different computer and
 download to that
 external drive.

Thanks as ever for your valuable advice.

Tim

-- 
Timothy Murphy  
e-mail: gayleard /at/ eircom.net
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland



[BackupPC-users] Backing-up BackupPC ?

2010-09-13 Thread Timothy Murphy
I backup (with BackupPC) onto a partition /BackupPC on machine A.
I backup one directory on machine A, and another directory on machine B.

I recently received a smartd warning about machine A:
-
Device: /dev/sdb, 1 Currently unreadable (pending) sectors
-
This disk (sdb) is actually the drive on which I backup,
though there are 7 partitions on the disk,
and I don't know if the problem is in the /BackupPC partition.
(I don't know how to determine with smartctl
exactly where the problem is.)

I have a large (1TB) USB external drive
I would like to use to backup my BackupPC data,
in case drive sdb actually packs up.

I've looked quickly at the BackupPC documentation,
in particular the section on archiving,
but I am not entirely clear of the procedure,
and would be very grateful for precise instructions.

I'm also not entirely clear whether, if I added
a further internal drive /dev/sdc to my machine,
it would be relatively easy
to transfer the BackupPC data from the USB disk to the new disk.

Any suggestions or advice gratefully received.



-- 
Timothy Murphy  
e-mail: gayleard /at/ eircom.net
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland




Re: [BackupPC-users] Backing-up BackupPC ?

2010-09-13 Thread Les Mikesell
On 9/13/2010 7:32 AM, Timothy Murphy wrote:
 I backup (with BackupPC) onto a partition /BackupPC on machine A.
 I backup one directory on machine A, and another directory on machine B.

 I recently received a smartd warning about machine A:
 -
 Device: /dev/sdb, 1 Currently unreadable (pending) sectors
 -
 This disk (sdb) is actually the drive on which I backup,
 though there are 7 partitions on the disk,
 and I don't know if the problem is in the /BackupPC partition.
 (I don't know how to determine with smartctl
 exactly where the problem is.)

 I have a large (1TB) USB external drive
 I would like to use to backup my BackupPC data,
 in case drive sdb actually packs up.

 I've looked quickly at the BackupPC documentation,
 in particular the section on archiving,
 but I am not entirely clear of the procedure,
 and would be very grateful for precise instructions.

 I'm also not entirely clear whether if I added
 a further internal drive /dev/sdc/ to my machine
 whether it would be relatively easy
 to transfer the BackupPC data from the USB disk to the new disk.


If you don't expect to need to restore from your old copies, the easiest 
thing would be to just add a new internal drive mounted wherever the old 
partition was and start over with new backups, keeping the old drive 
around just in case until you have sufficient new history on the new 
drive.  It may take a very long time to copy the old data and there's 
some chance it would fail during the copy anyway.  If there is anything 
particularly valuable in the old backups that doesn't still exist on the 
source directories, you could do a browser download, saving to the 
external drive.

-- 
   Les Mikesell
lesmikes...@gmail.com






Re: [BackupPC-users] Backing-up BackupPC ?

2010-09-13 Thread Timothy Murphy
Les Mikesell wrote:

 I have a large (1TB) USB external drive
 I would like to use to backup my BackupPC data,
 in case drive sdb actually packs up.

 I've looked quickly at the BackupPC documentation,
 in particular the section on archiving,
 but I am not entirely clear of the procedure,
 and would be very grateful for precise instructions.

 If you don't expect to need to restore from your old copies, the easiest
 thing would be to just add a new internal drive mounted wherever the old
 partition was and start over with new backups, keeping the old drive
 around just in case until you have sufficient new history on the new
 drive.  It may take a very long time to copy the old data and there's
 some chance it would fail during the copy anyway.  If there is anything
 particularly valuable in the old backups that doesn't still exist on the
 source directories, you could do a browser download, saving to the
 external drive.

Thanks for your response.
I'm not quite sure what you mean by a browser download.

Actually one of the directories I am backing up with BackupPC
is on the same possibly bad disk.
I guess it is better to back up the source first to another drive,
as I think you are suggesting.

I have ordered a new internal disk, which should arrive in a couple of days,
and I'll follow your suggestion.
I currently have a partition mounted as /BackupPC .
Can I just mount a partition on the new drive as /BackupPC instead?
Doesn't BackupPC expect to find the old backups in place?




-- 
Timothy Murphy  
e-mail: gayleard /at/ eircom.net
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland




Re: [BackupPC-users] Backing-up BackupPC ?

2010-09-13 Thread Les Mikesell
On 9/13/10 8:05 PM, Timothy Murphy wrote:
 Les Mikesell wrote:

 I have a large (1TB) USB external drive
 I would like to use to backup my BackupPC data,
 in case drive sdb actually packs up.

 I've looked quickly at the BackupPC documentation,
 in particular the section on archiving,
 but I am not entirely clear of the procedure,
 and would be very grateful for precise instructions.

 If you don't expect to need to restore from your old copies, the easiest
 thing would be to just add a new internal drive mounted wherever the old
 partition was and start over with new backups, keeping the old drive
 around just in case until you have sufficient new history on the new
 drive.  It may take a very long time to copy the old data and there's
 some chance it would fail during the copy anyway.  If there is anything
 particularly valuable in the old backups that doesn't still exist on the
 source directories, you could do a browser download, saving to the
 external drive.

 Thanks for your response.
 I'm not quite sure what you mean by a browser download.

When you are using the web interface to browse the backups you have the option 
to select a directory or collection of files and download it as a tar archive. 
You can do that from the same or a different computer and download to that 
external drive.  There is a way to do that from the command line too, but it is 
easier to see what you are doing in the browser.

 Actually one of the directories I am backing up with BackupPC
 is on the same possibly bad disk.
 I guess it is better to back up the source first to another drive,
 as I think you are suggesting.

Either that or get a copy back as described above.  Or both.

 I have ordered a new internal disk, which should arrive in a couple of days,
 and I'll follow your suggestion.
 I currently have a partition mounted as /BackupPC .
 Can I just mount a partition on the new drive as /BackupPC instead?

You can if you don't ever expect to need the whole drive for backups. 
Backuppc's archive has to be on a single filesystem due to the way pooling 
works 
so it is usually best to give it as much space as you can.

 Doesn't BackupPC expect to find the old backups in place?

Depending on the layout your distribution had, you might need to make some of 
the subdirectories and copy over any config or perl files that might have been 
there, but if the directories under pc or cpool aren't there it should make new 
ones and start over.  I'd mount the old directory somewhere else so you have 
access to anything that might be missing and watch for error messages when you 
start.

-- 
   Les Mikesell
lesmikes...@gmail.com





[BackupPC-users] backing up BackupPC server

2009-11-03 Thread hga


iarteaga wrote:
 Hello,
 
 I am testing BackupPC in a CentOS box and it is working great ... but now I 
 want to backup the BackupPC server itself. I am using an external drive as 
 backup storage and I want to schedule a backup for the BackupPC system and 
 config files. Is this possible? if so, how can I configure the path to the 
 files to backup if the server and client would be in the same machine
 

Yes. Here's the localhost.pc supplied by the current stable release of 
Debian; it uses tar to back up your /etc as the backuppc user.  For my 
BackupPC server I extended it to call a shell script that runs tar as root 
via sudo, and to call pre and post commands that set up my filesystems as 
LVM snapshots.

#
# Local server backup of /etc as user backuppc
#
$Conf{XferMethod} = 'tar';

$Conf{TarShareName} = ['/etc'];

$Conf{TarClientCmd} = '/usr/bin/env LC_ALL=C $tarPath -c -v -f - -C $shareName'
                      . ' --totals';

# remove extra shell escapes ($fileList+ etc.) that are
# needed for remote backups but may break local ones
$Conf{TarFullArgs} = '$fileList';
$Conf{TarIncrArgs} = '--newer=$incrDate $fileList';

Works quite well. Since I use variations of rsync to do alternate backups 
of my server, I felt it best to use tar here; the extra bytes transmitted 
don't matter much when they all stay on one host.
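
A minimal sketch of that kind of wrapper, not Harold's actual script
(script path, tar path and sudoers entry are assumptions): point
$Conf{TarClientCmd} at a small script that re-runs tar as root via sudo,
and allow that command in sudoers.

  #!/bin/sh
  # /usr/local/bin/backuppc-tar-root -- run tar as root for local backups
  # requires a sudoers entry such as:
  #   backuppc ALL=(root) NOPASSWD: /bin/tar
  exec /usr/bin/sudo /bin/tar "$@"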

- Harold



Re: [BackupPC-users] backing up BackupPC server

2009-11-03 Thread Ivan Arteaga
Mirco Piccin wrote:
 Hi,

  I want to schedule a backup for the BackupPC system and config 
 files. Is this possible?

 default installation of BackupPC provides also localhost backup.
 Localhost backup means of course BackupPC system and config files.
 Or, if it is not available, you can add it.

 Usually BackupPC system files are stored here:
 /usr/share/backuppc
 , and config files are stored here:
 /etc/backuppc

  how can I configure the path to the files to backup if the server 
 and client would be in the same machine

 All backup files are stored in the $Conf{TopDir} variable defined in 
 /etc/backuppc/config.pl file (and also in web gui - 
 server configuration section).

 You will find localhost backup in $Conf{TopDir}/pc/localhost/X (where 
 X is the backup number).

 Hope this helps you
 Regards
 M

 

   
Hello,

I did configure the backup for one share I have on the server using 
samba as the Xfer method, but I am getting this message when the backup starts:

2009-11-04 00:31:21 Got fatal error during xfer (tree connect failed: 
NT_STATUS_BAD_NETWORK_NAME)
2009-11-04 00:31:26 Backup aborted (tree connect failed: 
NT_STATUS_BAD_NETWORK_NAME)

This is the config file for the localhost in /etc/BackupPC/pc/ccnt1.pl 
(ccnt1 is the server name where backuppc is running).  I have no Xfer nor 
SmbShare configuration in the main /etc/BackupPC/config.pl:

[r...@server]# more /etc/BackupPC/pc/ccnt1.pl
$Conf{SmbShareName} = [
  '/cose'
];
$Conf{SmbSharePasswd} = 'xxx';
$Conf{SmbShareUserName} = 'administrator';
[r...@server]#

This is the output from the XferLog.bad.z file:

Running: /usr/bin/smbclient ccnt1\\/cose -U administrator -E -N -d 1 -c 
tarmode\ full -Tc -
full backup started for share /cose
Xfer PIDs are now 21057,21056
Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
This backup will fail because: tree connect failed: NT_STATUS_BAD_NETWORK_NAME
tree connect failed: NT_STATUS_BAD_NETWORK_NAME
Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
tree connect failed: NT_STATUS_BAD_NETWORK_NAME
tarExtract: Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 
filesTotal, 0 sizeTotal
Got fatal error during xfer (tree connect failed: NT_STATUS_BAD_NETWORK_NAME)
Backup aborted (tree connect failed: NT_STATUS_BAD_NETWORK_NAME)


I've been googling about this error and it seems everything is well 
configured but obviously I'm missing something. I've tried this via the 
console:

[r...@server]# smbclient ccnt1\\/cose -U administrator #command in 
XferLog.bad.z
Password:
Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
tree connect failed: NT_STATUS_BAD_NETWORK_NAME

[r...@server]# smbclient //ccnt1/\cose -U administrator #manual 
smbclient command in console
Password:
Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
smb: \ ls
. D 0 Thu Oct 29 17:01:39 2009
.. D 0 Thu Oct 29 12:50:12 2009
smb: \ exit
[r...@server]#

I will appreciate any comment or suggestion.

Regards,

--Ivan.







Re: [BackupPC-users] backing up BackupPC server

2009-11-03 Thread Adam Goryachev
Ivan Arteaga wrote:
 Mirco Piccin wrote:
 Hi,
 I did configure the backup for one share I have in the server using 
 samba as /Xfer method/ but I am getting this message when the backup starts:
 
 2009-11-04 00:31:21 Got fatal error during xfer (tree connect failed: 
 NT_STATUS_BAD_NETWORK_NAME)
 2009-11-04 00:31:26 Backup aborted (tree connect failed: 
 NT_STATUS_BAD_NETWORK_NAME)

[SNIP]

 [r...@server]# smbclient ccnt1\\/cose -U administrator #command in 
 XferLog.bad.z
 Password:
 Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
 tree connect failed: NT_STATUS_BAD_NETWORK_NAME
 
 I will appreciate any comment or suggestion.

This is telling you that it can't find the hostname you have asked
it to connect to. However, I think you are mad/crazy for trying to
back up the local Linux machine using samba.

In my personal preference order you should use:
1) sudo + rsync or rsync over ssh to localhost
2) sudo + tar or tar over ssh to localhost
3) rsync
4) tar

There are no other methods to consider!

The reason I suggest rsync options first is to allow better detection of
deleted/renamed files, though there are no other benefits that I am
aware of.

Configuring localhost backups should be done exactly the same as all
your other hosts, except instead of using ssh you can use sudo. Search
this mailing list as there are plenty of examples on how to do this.
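
For example, option 1 (sudo + rsync for localhost) needs little more than
a sudoers entry plus a client command that calls sudo instead of ssh; the
paths below are assumptions for a typical BackupPC 3.x install:

  # let the backuppc user run rsync as root, unattended
  echo 'backuppc ALL=(root) NOPASSWD: /usr/bin/rsync' > /etc/sudoers.d/backuppc
  chmod 0440 /etc/sudoers.d/backuppc
  # then, in the localhost per-host config, use something like:
  #   $Conf{RsyncClientCmd} = '/usr/bin/sudo $rsyncPath $argList+';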

Regards,
Adam

--
Adam Goryachev
Website Managers
www.websitemanagers.com.au



Re: [BackupPC-users] backing up BackupPC server

2009-11-03 Thread Les Mikesell
Ivan Arteaga wrote:
 Mirco Piccin wrote:
 Hi,

 I want to schedule a backup for the BackupPC system and config 
 files. Is this possible?

 default installation of BackupPC provides also localhost backup.
 Localhost backup means of course BackupPC system and config files.
 Or, if it is not available, you can add it.

 Usually BackupPC system files are stored here:
 /usr/share/backuppc
 , and config files are stored here:
 /etc/backuppc

 how can I configure the path to the files to backup if the server 
 and client would be in the same machine

 All backup files are stored in the $Conf{TopDir} variable defined in 
 /etc/backuppc/config.pl file (and also in web gui - 
 server configuration section).

 You will find localhost backup in $Conf{TopDir}/pc/localhost/X (where 
 X is the backup number).

 Hope this helps you
 Regards
 M

 

   
 Hello,
 
 I did configure the backup for one share I have in the server using 
 samba as /Xfer method/ but I am getting this message when the backup starts:
 
 2009-11-04 00:31:21 Got fatal error during xfer (tree connect failed: 
 NT_STATUS_BAD_NETWORK_NAME)
 2009-11-04 00:31:26 Backup aborted (tree connect failed: 
 NT_STATUS_BAD_NETWORK_NAME)
 
 This is the config file for the localhost in /etc/BackupPC/pc/ccnt1.pl 
 (ccnt1 is the server name where backuppc is running) I have no Xfer nor 
 smbShare configuration in the main /etc/BackupPC/config.pl
 
 [r...@server]# more /etc/BackupPC/pc/ccnt1.pl
 $Conf{SmbShareName} = [
   '/cose'
 ];
 $Conf{SmbSharePasswd} = 'xxx';
 $Conf{SmbShareUserName} = 'administrator';
 [r...@server]#
 
 This is the output from the XferLog.bad.z file:
 
 Running: /usr/bin/smbclient ccnt1\\/cose -U administrator -E -N -d 1 -c 
 tarmode\ full -Tc -
 full backup started for share /cose
 Xfer PIDs are now 21057,21056
 Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
 This backup will fail because: tree connect failed: NT_STATUS_BAD_NETWORK_NAME
 tree connect failed: NT_STATUS_BAD_NETWORK_NAME
 Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
 tree connect failed: NT_STATUS_BAD_NETWORK_NAME
 tarExtract: Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 
 filesTotal, 0 sizeTotal
 Got fatal error during xfer (tree connect failed: NT_STATUS_BAD_NETWORK_NAME)
 Backup aborted (tree connect failed: NT_STATUS_BAD_NETWORK_NAME)
 
 
 I've been googling about this error and it seems everything is well 
 configured but obviously I'm missing something. I've tried this via the 
 console:
 
 [r...@server]# smbclient ccnt1\\/cose -U administrator #command in 
 XferLog.bad.z
 Password:
 Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
 tree connect failed: NT_STATUS_BAD_NETWORK_NAME
 
 [r...@server]# smbclient //ccnt1/\cose -U administrator #manual 
 smbclient command in console
 Password:
 Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.0.28-0.el4.9]
 smb: \ ls
 . D 0 Thu Oct 29 17:01:39 2009
 .. D 0 Thu Oct 29 12:50:12 2009
 smb: \ exit
 [r...@server]#
 
 I will appreciate any comment or suggestion.


Why even consider samba with a linux target? It should work if you've 
configured 
samba to share something with that name (but the share is probably called cose, 
not /cose) and a user named administrator, but rsync over ssh is the normal 
approach.

For the local host you can use tar without ssh for efficiency but you have to 
change some of the commands.
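
A quick way to check that (host name and credentials are assumptions):
list the share names the server actually exports, then use the bare name,
without the leading slash, in $Conf{SmbShareName}.

  smbclient -L ccnt1 -U administrator       # list the exported share names
  smbclient //ccnt1/cose -U administrator   # test a connection with the bare name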

-- 
   Les Mikesell
lesmikes...@gmail.com






[BackupPC-users] backing up BackupPC server

2009-11-02 Thread iarteaga

Hello,

I am testing BackupPC on a CentOS box and it is working great; so far I am 
backing up three different Windows workstations without problems, but now I 
want to back up the BackupPC server itself. I am using an external drive as 
backup storage and I want to schedule a backup of the BackupPC system and 
config files. Is this possible? If so, how can I configure the path to the 
files to back up when the server and client are on the same machine?

I will appreciate your comments.

--Ivan.



[BackupPC-users] backing up BackupPC server

2009-11-02 Thread Ivan Arteaga
Hello,

I am testing BackupPC on a CentOS box and it is working great; so far I 
am backing up three different Windows workstations without problems, but 
now I want to back up the BackupPC server itself. I am using an external 
drive as backup storage and I want to schedule a backup of the BackupPC 
system and config files. Is this possible? If so, how can I configure 
the path to the files to back up when the server and client are on the 
same machine?

I will appreciate your comments.

--Ivan.



Re: [BackupPC-users] backing up BackupPC server

2009-11-02 Thread Mirco Piccin
Hi,

 I want to schedule a backup for the BackupPC system and config files. Is
this possible?

The default installation of BackupPC also provides a localhost backup,
which of course covers the BackupPC system and config files.
If it is not available, you can add it.

Usually BackupPC system files are stored here:
/usr/share/backuppc
, and config files are stored here:
/etc/backuppc

 how can I configure the path to the files to backup if the server and
client would be in the same machine

All backup files are stored under the directory given by $Conf{TopDir},
defined in the /etc/backuppc/config.pl file (and also visible in the web
gui - server configuration section).

You will find localhost backup in $Conf{TopDir}/pc/localhost/X (where X is
the backup number).
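
A quick way to confirm those locations on your own install (the paths are
assumptions; packages differ between distributions):

  grep -n TopDir /etc/backuppc/config.pl   # where the pool and pc trees live
  ls /var/lib/backuppc/pc/localhost/       # the numbered localhost backups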

Hope this helps you
Regards
M


Re: [BackupPC-users] backing up BackupPC server

2009-11-02 Thread Ivan Arteaga
Mirco Piccin wrote:
 Hi,

  I want to schedule a backup for the BackupPC system and config 
 files. Is this possible?

 default installation of BackupPC provides also localhost backup.
 Localhost backup means of course BackupPC system and config files.
 Or, if it is not available, you can add it.

 Usually BackupPC system files are stored here:
 /usr/share/backuppc
 , and config files are stored here:
 /etc/backuppc

  how can I configure the path to the files to backup if the server 
 and client would be in the same machine

 All backup files are stored in the $Conf{TopDir} variable defined in 
 /etc/backuppc/config.pl file (and also in web gui - 
 server configuration section).

 You will find localhost backup in $Conf{TopDir}/pc/localhost/X (where 
 X is the backup number).

 Hope this helps you
 Regards
 M

 

   
Hello Mirco,

Thanks for the reply, I will try it and post results.

Regards,

--Ivan.
