Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-18 Thread Roy Keene (Contractor)
Les Mikesell wrote:
 Timothy J. Massey wrote:
 Again, I ask everyone:  does anyone have a better solution?  I have 
 heard only two solutions to the off-site storage issue.  1) Do an 
 archive to some sort of removable media.  Given the storage 
 requirements, I don't see how it could be anything *other* than a hard 
 drive.  If you were going to spend four figures to buy a big enough tape 
 library, wouldn't you use it for backup directly?  Or 2) create some 
 sort of RAID-1 configuration and break it regularly to swap out a drive 
 and store it off-site.  Someone even recommended a 3-drive RAID-1, so 
 the data stays redundant, but you can still break the array.  It doesn't 
 change the workload on the array, however...
   
 
 If you have a suitable offsite location with enough internet bandwidth, the
 low-maintenance way is to throw a vpn between sites and let an independent
 instance of backuppc run from there, backing up the same targets.  You
 might need to make the initial copy on the local LAN or perhaps add each
 new target on a Friday and let it run through the weekend to get started.
 

When I was using BackupPC, I used file_sync (part of BackupPCd) to do a 
byte-level mirror (rsync-like) every night from the BackupPC machine to the 
offsite mirror.

My BackupPC array was 1TB.
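
(For anyone wanting to replicate this without file_sync: a rough equivalent is
a nightly pool mirror driven by a cron entry like the one below.  The schedule,
paths and host name are illustrative assumptions, with plain rsync standing in
for file_sync.)

# hypothetical /etc/cron.d/backuppc-mirror
# -a preserves ownership/permissions/times, -H preserves the hard links the
# BackupPC pool depends on, --delete keeps stale files off the mirror
30 1 * * * root rsync -aH --delete /var/lib/BackupPC/ mirror.example.com:/var/lib/BackupPC/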

-- 
Roy Keene (Contractor)



Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-18 Thread Roy Keene (Contractor)
Timothy J. Massey wrote:
 Roy Keene (Contractor) [EMAIL PROTECTED] wrote on 01/16/2007 
 02:38:06 PM:
 
   When I was using BackupPC, I used file_sync (part of BackupPCd) to do a
   byte-level mirror (rsync-like) every night from the BackupPC machine to the
   offsite mirror.
 
 Is this the same file_sync from OpenSync (http://www.opensync.org/) or 
 something else?  I did a google for BackupPCd and came up with what I 
 imagine is your site (http://www.rkeene.org/oss/backuppcd/).  I 
 downloaded both listed tars, but could not find file_sync in them.  The 
 only logical reference I could come up with for file_sync was the OpenSync 
 site above.
 
 Is BackupPCd a supported client for BackupPC?
 
 Tim Massey
 

That is my site.

$ tar -ztvf backuppcd-200603161528.tar.gz
...
-rw-r--r-- rkeene/users  10783 2006-03-11 11:11 backuppcd-200603161528/tools/file_sync.c

BackupPCd is still in progress, and not recommended for production use 
(though I have used it for such in the past).

-- 
Roy Keene (Contractor)



Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-16 Thread Timothy J. Massey
[EMAIL PROTECTED] wrote on 01/16/2007 
03:37:09 AM:

  Timothy J. Massey wrote:
 
   BackupPC's pool is stored on a large internal hard drive.  Every
   day at
   a little after 7:00 A.M., the backup server starts an archive of each
   host, which is stored on a second hard drive that is mounted in a
   removable tray.  Once this is complete, the user can shut down the
   server, remove the hard drive, replace it with a different one, and
   turn
   the server back on.  Once the new drive is in place, it is
   repartitioned, reformatted and remounted in place, ready for the next
   archive.
 
  Won't the frequent reformatting et al. wear out your hard drives
  pretty fast?

How is a couple of formats going to wear out a drive?  I did not go into 
further detail, but now I will:

1) As opposed to the article you linked, this is on the *backup* server 
not the *file* server.  It's *supposed* to work real hard, so the file 
server doesn't have to.

2) There are three removable hard drives that are swapped weekly, not 
daily.  Therefore, they are only formatted weekly, not daily.

3) Like any media, the drives are scheduled for replacement.  Every 3 
months, a drive is archived permanently.  That means that no drive lasts 
for more than 9 months in production, of which only 3 months are spent 
spinning.  Assuming 4.33 weeks a month, each drive gets formatted 
exactly 13 times.

4) As much as I would like to do a full surface-scan of each drive when 
I format, there just isn't time to do that on a 500GB drive.  So, these 
are merely doing quick formats.  If a drive can't handle *10* minutes 
of extra drive activity every week, then why exactly are we using it for 
backup in the first place?

5) While doing a daily archive *might* be slightly harder on the 
*internal* hard drive, which does not get swapped, can you think of a 
better way of achieving off-site storage of the backup data?  I'd love 
to hear a better way...

There were *so* many more problems in the article you linked than the 
fact the drive had to rebuild daily:  the fact that a desktop hard drive 
died after working for *years* in very high temperatures doesn't sound 
very unreasonable, does it?  The fact that someone depended upon *that* 
for their data storage is the problem, not the fact that the drive had 
to spend an hour or two a day copying itself, in a nice, linear 
non-seeking way.  It's not like the drive would have stopped spinning 
during that time...


Again, I ask everyone:  does anyone have a better solution?  I have 
heard only two solutions to the off-site storage issue.  1) Do an 
archive to some sort of removable media.  Given the storage 
requirements, I don't see how it could be anything *other* than a hard 
drive.  If you were going to spend four figures to buy a big enough tape 
library, wouldn't you use it for backup directly?  Or 2) create some 
sort of RAID-1 configuration and break it regularly to swap out a drive 
and store it off-site.  Someone even recommended a 3-drive RAID-1, so 
the data stays redundant, but you can still break the array.  It doesn't 
change the workload on the array, however...


  Network administrators who fear the command line... What is the world
  coming to, eh?

A sad, but profitable, conclusion?

Tim Massey



Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-16 Thread Nils Breunese (Lemonbit)

Timothy J. Massey wrote:


Won't the frequent reformatting et al. wear out your hard drives
pretty fast?


How is a couple of formats going to wear out a drive?  I did not go into
further detail, but now I will:

<snip>

There were *so* many more problems in the article you linked than the
fact the drive had to rebuild daily:  the fact that a desktop hard drive
died after working for *years* in very high temperatures doesn't sound
very unreasonable, does it?  The fact that someone depended upon *that*
for their data storage is the problem, not the fact that the drive had
to spend an hour or two a day copying itself, in a nice, linear
non-seeking way.  It's not like the drive would have stopped spinning
during that time...


I'm sorry, reading this it feels like you think I was attacking you  
or your methods. I wasn't really, I was genuinely wondering. And  
well, it was just that I read that extreme story on TheDailyWTF the  
day before that made me think about it. And yes, I was thinking about  
low-level formats, but you're obviously not doing those every day.



Network administrators who fear the command line... What is the world
coming to, eh?


A sad, but profitable, conclusion?


I guess so.

Nils Breunese.




Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-16 Thread Timothy J. Massey
[EMAIL PROTECTED] wrote on 01/16/2007 
12:59:50 PM:

  Timothy J. Massey wrote:
 
   Won't the frequent reformatting et al. wear out your hard drives
   pretty fast?
  
   How is a couple of formats going to wear out a drive?  I did not go
   into
   further detail, but now I will:
  
   snip
  
   There were *so* many more problems in the article you linked than the
   fact the drive had to rebuild daily:  the fact that a desktop hard
   drive
   died after working for *years* in very high temperatures doesn't sound
   very unreasonable, does it?  The fact that someone depended upon
   *that*
   for their data storage is the problem, not the fact that the drive had
   to spend an hour or two a day copying itself, in a nice, linear
   non-seeking way.  It's not like the drive would have stopped spinning
   during that time...
 
  I'm sorry, reading this it feels like you think I was attacking you
  or your methods. I wasn't really, I was genuinely wondering. And
  well, it was just that I read that extreme story on TheDailyWTF the
  day before that made me think about it. And yes, I was thinking about
  low-level formats, but you're obviously not doing those every day.

I'm sorry if I was snippy.  I saw *very* little in common between the 
link you sent and the solution that I've implemented.  Other than the 
fact that they both involved hard drives...  :)

Frankly, I do *not* like my solution.  I think it has a number of downsides:

* My archives have no history associated with them
* I lose any kind of pooling in my archives
* It's an extra step to get the archives in the first place, and extra 
steps are extra places to fail
* My archives are encapsulated into a single large file that I then 
have to unpack somewhere
* For now, my archive jobs are not integrated into the BackupPC GUI

and all kinds of other issues.  However, it's kind of like Winston 
Churchill's quote on democracy:  it's the worst choice, except for all 
the others...

Fortunately, the only time these limitations come into play is in the 
event of total disaster:  if both the e.g. file server *and* the backup 
server are unrecoverable.  Unfortunately, if I'm ever actually in that 
situation, it's the time I want the *least* hassle with my backup!  :)

So, again:  I am *very* open to criticism, if you can suggest something 
better!  I've mentioned the break-RAID-1-array solution.  I understand 
the rationale, but I've decided that this is preferable.  Are there any 
other solutions out there?

Or are you all going without off-site backup?  :)

Tim Massey




Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-16 Thread Les Mikesell
Timothy J. Massey wrote:
 Again, I ask everyone:  does anyone have a better solution?  I have 
 heard only two solutions to the off-site storage issue.  1) Do an 
 archive to some sort of removable media.  Given the storage 
 requirements, I don't see how it could be anything *other* than a hard 
 drive.  If you were going to spend four figures to buy a big enough tape 
 library, wouldn't you use it for backup directly?  Or 2) create some 
 sort of RAID-1 configuration and break it regularly to swap out a drive 
 and store it off-site.  Someone even recommended a 3-drive RAID-1, so 
 the data stays redundant, but you can still break the array.  It doesn't 
 change the workload on the array, however...
   

If you have a suitable offsite location with enough internet bandwidth, the
low-maintenance way is to throw a vpn between sites and let an independent
instance of backuppc run from there, backing up the same targets.  You
might need to make the initial copy on the local LAN or perhaps add each
new target on a Friday and let it run through the weekend to get started.
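
(A minimal sketch of the VPN piece, with placeholder host names, addresses and
key path -- any VPN that lets the offsite backuppc instance reach the same
targets will do; OpenVPN in static-key point-to-point mode is one simple
option:)

# /etc/openvpn/offsite.conf -- illustrative point-to-point tunnel
dev tun
remote main-office.example.com
ifconfig 10.8.0.2 10.8.0.1
secret /etc/openvpn/static.key
keepalive 10 60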

-- 
  Les Mikesell
   [EMAIL PROTECTED]




Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-16 Thread Carl Wilhelm Soderstrom
On 01/16 02:11 , Timothy J. Massey wrote:
 Or are you all going without off-site backup?  :)

the way we've been doing off-site backup is usually:

- get a second backuppc server somewhere offsite, backing up the most
  critical information
OR
- use a tape backup system (which has the advantage of being a redundant
  backup system).

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-16 Thread Timothy J. Massey
Roy Keene (Contractor) [EMAIL PROTECTED] wrote on 01/16/2007 
02:38:06 PM:

  When I was using BackupPC, I used file_sync (part of BackupPCd) to do a
  byte-level mirror (rsync-like) every night from the BackupPC machine to the
  offsite mirror.

Is this the same file_sync from OpenSync (http://www.opensync.org/) or 
something else?  I did a google for BackupPCd and came up with what I 
imagine is your site (http://www.rkeene.org/oss/backuppcd/).  I 
downloaded both listed tars, but could not find file_sync in them.  The 
only logical reference I could come up with for file_sync was the OpenSync 
site above.

Is BackupPCd a supported client for BackupPC?

Tim Massey



Re: [BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-16 Thread Timothy J. Massey
Les Mikesell [EMAIL PROTECTED] wrote on 01/16/2007 02:28:46 PM:

  If you have a suitable offsite location with enough internet bandwidth, the
  low-maintenance way is to throw a vpn between sites and let an independent
  instance of backuppc run from there, backing up the same targets.

We're doing that for a couple of smaller clients.  So far, after the 
initial sync, it's working out well, but this is without dealing with 
large files.  Most of our clients are running something like Domino or 
Exchange, both of which put a greater load on backup (either because 
you're backing up multi-gigabyte files, or because you're trying to back 
up what has got to be the *worst* way of storing mail data ever 
conceived--I'll leave you to decide which is which).

Tim Massey



[BackupPC-users] CGI scripts to manage removable hard drive media.

2007-01-15 Thread Timothy J. Massey
Hello!

Here are the scripts I use to manage the removable hard drive media used 
to store daily archives of my backup servers via the GUI, instead of 
from the command line.  Briefly, the system is set up like this:

BackupPC's pool is stored on a large internal hard drive.  Every day at 
a little after 7:00 A.M., the backup server starts an archive of each 
host, which is stored on a second hard drive that is mounted in a 
removable tray.  Once this is complete, the user can shut down the 
server, remove the hard drive, replace it with a different one, and turn 
the server back on.  Once the new drive is in place, it is 
repartitioned, reformatted and remounted in place, ready for the next 
archive.

There are two scripts that make this happen.  The first one simply shuts 
the server down.  The second one handles the repartitioning, 
reformatting and remounting.
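
One detail the scripts assume but don't show: they mount and unmount the
removable drive by mount point alone, so there needs to be an /etc/fstab entry
along these lines (the filesystem and options here are plausible guesses; only
the device and mount point come from the scripts themselves):

/dev/hdc1   /var/lib/BackupPC/removable   ext3   noauto,noatime   0 0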

There is absolutely no reason why this couldn't be handled by simply 
ssh'ing into the server.  Except that these servers are destined for 
network administrators for whom the command line is a tremendously evil 
thing, and if you try to sell them a solution that contains instructions 
like "Use putty, log into the server and type this command", they will 
say no.  Hence, the CGI scripts...

Because the scripts will be run by the webserver, they will be run with 
its permissions, which likely do not include the ability to shut down 
the server, or other such commands.  The way I have done this is to use 
sudo, with the proper lines in sudoers.  I've tried to make the commands 
as specific as possible, to avoid possible security issues.
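
Something along these lines in /etc/sudoers covers the commands the scripts
use (the "apache" user name and the device names are assumptions; substitute
whatever your web server runs as and whatever your removable drive shows up
as):

apache  ALL = NOPASSWD: /sbin/shutdown -h now, \
                        /bin/umount /var/lib/BackupPC/removable, \
                        /bin/mount /var/lib/BackupPC/removable, \
                        /bin/df /var/lib/BackupPC/removable, \
                        /sbin/fdisk /dev/hdc, \
                        /sbin/mke2fs -j -m 1 -LRemovableData /dev/hdc1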

To partition the drive, I am echoing responses to the fdisk command.  I 
looked into parted, but I could not find a clean way of getting it to 
create a single large partition without knowing how big the partition 
was.  Seeing as this will be used with drives of different sizes, I 
decided to stick with fdisk.

Also, the HTML files that are cat'd to the user are created simply by 
saving any old BackupPC HTML page to a file, and chopping the part 
before the main body DIV into the top file, and the part after the main 
body DIV into the bottom file.
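
If you want to script that chop, something like csplit works, assuming the
saved page is page.html and that the main body starts at the
<div id="Content"> marker (the same div the scripts emit); trimming the
second piece down to just the closing markup is still a hand edit:

csplit page.html '/<div id="Content">/'
mv xx00 bpc_top.html     # everything before the main body div
mv xx01 bpc_bottom.html  # from the main body div onward; trim by hand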

If you have any suggestions as to how to make these scripts better, I 
would be happy to hear them.  Otherwise, I hope they are useful to 
someone else.

Tim Massey




#!/bin/sh
# shutdown.cgi - Shut down server
echo "Content-type: text/html"
echo ""
cat bpc_top.html
echo "
<div id=\"Content\">
<div class=\"h1\">Shut Down Server</div>
<p><h2>The system is being shut down!</h2></p>
<p>This will take approximately 60 seconds. Do not remove the drive before
the system has powered itself off.</p>
"
cat bpc_bottom.html
sudo /sbin/shutdown -h now > /dev/null 2>&1
exit



#!/bin/sh
# instmedia.cgi - Install new media for BackupPC Archive
echo "Content-type: text/html"
echo ""
cat bpc_top.html
echo "
<div id=\"Content\">
<div class=\"h1\">Initialize Removable Media</div>
<p><h2>Initializing Removable Media</h2></p>
<p>This will take approximately 10 minutes to complete, depending upon the
size of the removable drive.  Do not navigate away from this page.</p>
"

echo "<P><B>Unmounting removable drive.</B></P>"
sudo /bin/umount /var/lib/BackupPC/removable 2>&1
if [ `sudo /bin/df /var/lib/BackupPC/removable | grep /var/lib/BackupPC/removable | wc -l` = 1 ]; then
   echo "<P><H2>Error:  drive did not unmount.</H2></P>"
   cat bpc_bottom.html
   exit
fi

echo "<P><B>Creating proper partition on drive.</B></P>
<P>This will take approximately 45 seconds.  Please wait.</P>"
# Pipe responses for the fdisk command via echo.
# This does the following:
#  The first series of lines will delete all partitions on a drive with up
#   to 9 partitions.  It does this by having pairs of delete commands:  d9,
#   d8, etc., until it gets to the end.  When there's just one partition,
#   fdisk doesn't ask for a number, so the last one is just a d.
#   This will actually generate lots of errors in practice:  when there are
#   no more partitions left, the d's will generate an error saying that
#   there are no partitions to delete, and the numbers are interpreted as
#   nonsense commands.  However, this is harmless.
#  It then goes through the sequence to create a new partition:
#   n (New partition)
#   p (Primary partition)
#   1 (First partition)
#   ENTER (Default starting cylinder is the first one)
#   ENTER (Default ending cylinder is the last one)
#   w (Write the changes to disk and exit)
#  Several newlines are added at the end in case something goes wrong.
#  Three newlines in a row are interpreted by the fdisk command as a
#  signal to exit immediately.
echo "

d
9
d
8
d
7
d
6
d
5
d
4
d
3
d
2
d
n
p
1


w




" | sudo /sbin/fdisk /dev/hdc > /dev/null

echo "<P><B>Formatting partition for use.</B></P>
<P>This can take up to 10 minutes.  Please wait.</P>"
sudo /sbin/mke2fs -j -m 1 -LRemovableData /dev/hdc1 > /dev/null

echo "<P><B>Mounting drive.</B></P>"
sudo /bin/mount /var/lib/BackupPC/removable 2>&1
if [ `sudo /bin/df /var/lib/BackupPC/removable | grep /var/lib/BackupPC/removable | wc -l` = 0 ]; then
   echo "<P><H2>Error: