Hi, Ted

I'll answer some points inline below, but there are some things I'd like to
say first:

1- With BackupPC you can say "I want to back up everything from that client"
("/", or "C:\ and D:\", etc.) or instead "I want to back up just this and
this, but not that". The concept is called "shares".

So you could have two (or more) BackupPC servers, if that helps: one backs up
this, the other backs up that and the clients.

BackupPC needs very little maintenance, so I think that's a real option.
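As a sketch of what a share definition looks like in a host's config file
(the share names, excludes, and transfer method here are just examples,
adjust them for your client):

```perl
# Per-host BackupPC config (e.g. the host's .pl file) -- example values.
$Conf{XferMethod}     = 'rsync';
$Conf{RsyncShareName} = ['/home', '/etc'];   # back up just this, and this...
$Conf{BackupFilesExclude} = {
    '/home' => ['/tmp', '*.cache'],          # ...but not that
};
```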

2- BackupPC is not very demanding of the machine. Any machine that can run
Apache will do it. That's apart from the space the pool needs :)

3- About the pool: it can be on the server's disk(s) (RAID1 would be a good
idea) or on any kind of externally mounted partition, an NFS share, for
example. But then you'll be limited by the speed and bandwidth of your LAN;
backup data will travel over it twice, I think.

4- I understood you want to archive to DVD and free up space.
Archiving is fairly straightforward. To delete older backups (manually, that
is, not automatically on a schedule) you can use a script that was shared on
this mailing list (I can send it to you).
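For the archiving part, BackupPC has "archive hosts"; a rough sketch of the
relevant config follows. The destination path and split size are just
examples (I believe the split size is in megabytes, but check the docs):

```perl
# Config for a BackupPC archive host -- example values only.
$Conf{XferMethod}   = 'archive';
$Conf{ArchiveDest}  = '/var/lib/backuppc/archive';  # where tar files land
$Conf{ArchiveComp}  = 'gzip';                       # compress the archives
$Conf{ArchiveSplit} = 4400;                         # split into DVD-sized pieces
```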

Luis Paulo:
>
> Yes, I agree that I next need to deal with the Clients issue.  But I
> still have a concern having BackupPC plus the Server plus the Pool all
> on the same machine 192.168.1.16 having the name "Ubuntu".
>
> Maybe it would better help you help me if I explained a few things.
>
> My application involves the collection of web pages associated with the
> news.  I am talking about thousands of web pages every week.  Once a web
> page (or web tree) is collected it is static forever -- that is to say
> its contents and appearance will never change.  However, I and others
> will need to access these web pages by means of a browser.  This next
> part is very important.  I cannot leave these collections of web pages
> on dynamic media for several reasons.
>
> Reason #1 is the collection process itself.  I use Firefox with add-ons
> which makes it possible to collect and export these collections.  So
> that there is an operation occurring someplace under root "/" and
> accumulates these collections (my data) while still located under "/"
> for Linux and C:\... for MS Windows. (BTW, I am trying to phase out all
> MS Windows machines in favor of Linux machines, but the process of change
> is slow and there are still Windows XP machines involved.) The main thing I am
> here trying to say is that there is data build up right inside the
> processing scheme and if I don't regularly export and move this data out
> to another machine or disk then "/" would be full of data and eventually
> all operations would fail not just BackupPC.  I know this is true
> because I watch the build up of data and then the change to more free
> space once it is exported and moved. Right now BackupPC seems to be
> focused on the Ubuntu Linux OS which is OKAY because that needs to be
> regularly backed up even though there are only small changes.
>
BackupPC is great at backing up similar data from different clients: it can
see that files are the same, store them only once, and use hardlinks. I don't
agree that it is focused on backing up the OS, but that's not entirely false.


> Reason #2 is the BackupPC Pool which also manages the backed up files. I
> have tried to keep "/" clear of excessive data build up, that is I check
> to make sure that it does not build up and cause problems.
>
> BUT as soon as I go to add clients from other machines I'm going to be
> in a lot of trouble and I won't be able to control the build up of data
> in the BackupPC Pool and also handle everything else on 192.168.1.16.
> THEREFORE is it possible to MOVE the backup POOL to another machine
> specifically set up for that purpose?
>

Answered above, I think: move the server and the pool to another machine.

>
> Reason #3 is sort of the same problem.  I need to be able to keep data
> (NOT OS data but NEWS data in the form of web pages) on both DVD as well
> as magnetic (drives hd, sd, etc.) media.  This means that there is a
> duplication occurring.  Unfortunately for me I have experienced failure
> of hard drives and external usb drives which resulted in major data loss
> and even entire machines.  So now I am paranoid.  I lost weeks if not
> months of data.
>
You have the pool(s), and you can archive the share(s) you want to DVD.
I have some experience with DVD failure, I regret to say.

An archive, AFAIK, is a compressed file; store it on DVD or move it to other
media (or both).

Delete the backups afterwards with a script, or wait for the automatic
expiry once you have deleted the files from the client's disk.
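I won't reproduce the list script here, but the selection part of the idea
can be sketched like this. TOPDIR, the host name, and the "keep the newest
two" policy are all assumptions; the actual deletion (and letting
BackupPC_nightly reclaim pool space afterwards) is left to the real script:

```shell
#!/bin/sh
# Sketch: list the numbered backup trees under a host's pc/ directory
# that would be pruned, keeping the newest $keep. Selection only, no rm.
TOPDIR=${TOPDIR:-/var/lib/backuppc}   # assumed install path
host=${host:-myclient}                # assumed host name
keep=${keep:-2}

prune_candidates() {
    # Backups live in $TOPDIR/pc/<host>/<number>; sort numerically and
    # print all but the newest $keep backup numbers.
    ls -d "$TOPDIR/pc/$host"/[0-9]* 2>/dev/null |
        awk -F/ '{print $NF}' | sort -n | head -n -"$keep"
}

prune_candidates
```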


> What these 3 reasons come down to is that I have two kinds of data.  One
> kind is the operating system and the data it generates as it executes
> various processes.  The other kind of data is tons of web pages being
> generated and then exported and then moved as well as copies on DVD.
> This last category means there is a constant build up of NEW data.
>

a) use different shares
b) use two servers

>
> The only incremental data would (I think) be that associated with the
> Operating System as changes occur and temporary data changes get created
> and then moved out of "/".
>
> Since all machines are likely sooner or later (just like the magnetic
> media within them) to die at the worst possible time I need to be able
> to set up something alongside BackupPC (not in place of but in
> addition to) in order to replace one machine with a new machine.  I have
> been working on a GhostLinux "G4L" type of approach. But I want to get
> BackupPC up and running first in order to deal with the CLIENTS. After
> that I guess you cannot help me because BackupPC is not associated with
> "G4L" -- right?
>
Ghost is fine; I used to use it for fast recovery, creating the images on
disk.
Just don't ghost the pool, I guess: make the pool a separate partition so you
can exclude it from Ghost.
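A minimal sketch of that idea in /etc/fstab (the device name, filesystem, and
mount point are assumptions; on Debian/Ubuntu the pool usually lives under
/var/lib/backuppc):

```
# /etc/fstab -- give the pool its own partition, so "/" can't fill up with
# backup data and a Ghost image of the system disk can simply skip it.
/dev/sdb1   /var/lib/backuppc   ext4   defaults,noatime   0   2
```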


> Hope all this makes sense.
>
> I certainly appreciate your help so far. This stuff was just a bit new
> for me to feel confident about.
>
> Thanks -- Ted Hilts
> Looking forward to your response, Ted
>
>
>
I hope I haven't said anything stupid. Anyone, please correct me.
I probably didn't answer all your questions, but I hope to get to them :)

Finally, BackupPC may not be the perfect tool for backing up a bunch of web
pages from different sites, but it will do the job.

Luis
_______________________________________________
BackupPC-users mailing list
[email protected]
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
