Re: [BackupPC-users] BackupPC server on OS X

2009-11-15 Thread Nick Smith
Thanks for the reply. It's just weird that it worked for you, and when I try
to install the dependency by itself I get:

bash-3.2# port install p5-compress-zlib
Error: Port p5-compress-zlib not found

On Sat, Nov 14, 2009 at 10:22 AM, Michael Stowe
mst...@chicago.us.mensa.org wrote:

 Or maybe you're asking questions about macports that have little or
 nothing to do with BackupPC...

 It ran fine on my Snow Leopard, and I didn't have any trouble with
 dependencies, so I don't know what to tell you except that
 p5-compress-zlib installs fine from macports here.

 I suppose you could always install it manually if you can't get macports
 to work.
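If MacPorts keeps refusing the dependency, the Perl module can be pulled straight from CPAN instead. A minimal fallback sketch, assuming a working CPAN configuration; the port and module names are the ones from this thread, and on modern Perls Compress::Zlib also ships as part of the IO-Compress distribution, so it may already be present:

```shell
#!/bin/sh
# Prefer MacPorts when it is available; otherwise fall back to installing
# the Perl module directly from CPAN. Adjust the port name if your MacPorts
# tree has renamed it.
pick_installer() {
  if command -v port >/dev/null 2>&1; then
    echo "port install p5-compress-zlib"
  else
    echo "perl -MCPAN -e 'install Compress::Zlib'"
  fi
}
pick_installer   # prints the chosen command; run it by hand or via eval
```

Either route only needs to satisfy BackupPC's Perl dependency check; once `perl -MCompress::Zlib -e 1` exits cleanly, the port error above should no longer matter.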

 Wow, guess this is uncharted territory.

 On Fri, Nov 13, 2009 at 7:24 AM, Nick Smith nick.smit...@gmail.com
 wrote:
 On Fri, Apr 17, 2009 at 4:39 AM, Thomas von Eyben
 thomasvoney...@gmail.com wrote:

 Hi list,

 Just once more, my question: is anyone on the list actually running
 BackupPC on OS X?

 As described below, I am having some difficulties getting the Perl
 environment satisfactory for BackupPC, and therefore I would like
 to hear from others who are using OS X to host the BackupPC server
 and, hopefully, get some good advice.

 Did you ever get BackupPC working on OS X?

 I'm actually trying to set it up now on Snow Leopard (not Server).

 I've installed MacPorts and updated it, but when I try to install
 BackupPC I get this error:

 bash-3.2# port install backuppc
 --->  Computing dependencies for backuppc
 Error: Dependency 'p5-compress-zlib' not found.

 How did you get past this issue?

 Is it even possible to run the BackupPC server on a Mac?

 I wonder why they would have this port in MacPorts without having all
 the dependencies in place for it.

 Thanks for any help you can give.







--
Let Crystal Reports handle the reporting - Free Crystal Reports 2008 30-Day
trial. Simplify your report design, integration and deployment - and focus on
what you do best, core application coding. Discover what's new with
Crystal Reports now.  http://p.sf.net/sfu/bobj-july
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC server on OS X

2009-11-14 Thread Nick Smith
Wow, guess this is uncharted territory.

On Fri, Nov 13, 2009 at 7:24 AM, Nick Smith nick.smit...@gmail.com wrote:
 On Fri, Apr 17, 2009 at 4:39 AM, Thomas von Eyben
 thomasvoney...@gmail.com wrote:

 Hi list,

 Just once more, my question: is anyone on the list actually running
 BackupPC on OS X?

 As described below, I am having some difficulties getting the Perl
 environment satisfactory for BackupPC, and therefore I would like
 to hear from others who are using OS X to host the BackupPC server
 and, hopefully, get some good advice.

 Did you ever get BackupPC working on OS X?

 I'm actually trying to set it up now on Snow Leopard (not Server).

 I've installed MacPorts and updated it, but when I try to install
 BackupPC I get this error:

 bash-3.2# port install backuppc
 --->  Computing dependencies for backuppc
 Error: Dependency 'p5-compress-zlib' not found.

 How did you get past this issue?

 Is it even possible to run the BackupPC server on a Mac?

 I wonder why they would have this port in MacPorts without having all
 the dependencies in place for it.

 Thanks for any help you can give.




[BackupPC-users] BackupPC server on OS X

2009-11-13 Thread Nick Smith
On Fri, Apr 17, 2009 at 4:39 AM, Thomas von Eyben
thomasvoney...@gmail.com wrote:

 Hi list,

 Just once more, my question: is anyone on the list actually running
 BackupPC on OS X?

 As described below, I am having some difficulties getting the Perl
 environment satisfactory for BackupPC, and therefore I would like
 to hear from others who are using OS X to host the BackupPC server
 and, hopefully, get some good advice.

Did you ever get BackupPC working on OS X?

I'm actually trying to set it up now on Snow Leopard (not Server).

I've installed MacPorts and updated it, but when I try to install
BackupPC I get this error:

bash-3.2# port install backuppc
--->  Computing dependencies for backuppc
Error: Dependency 'p5-compress-zlib' not found.

How did you get past this issue?

Is it even possible to run the BackupPC server on a Mac?

I wonder why they would have this port in MacPorts without having all
the dependencies in place for it.

Thanks for any help you can give.



Re: [BackupPC-users] Returned mail: see transcript for details

2009-06-25 Thread Nick Smith
Why do I keep getting these messages sent to me directly as a reply
from the list?

On Wed, Jun 24, 2009 at 2:55 PM, bpc.20.hypa...@xoxy.net wrote:
 The original message was received at Wed, 24 Jun 2009 18:55:00 GMT
 from j...@localhost

   - The following addresses had permanent fatal errors -
 bpc.20.hypa...@xoxy.net
    (expanded from: bpc.20.hypa...@xoxy.net)

   - Transcript of session follows -
 553 5.0.0 General list for user discussion, questions and support - 
 backuppc-users@lists.sourceforge.net 
 +bpc+h...ists.sourceforge@spamgourmet.com
        questions and support backuppc-users@lists.sourceforge.net... 
 Unbalanced ''

 Final-Recipient: RFC822; bpc.20.hypa...@xoxy.net
 Action: failed
 Status: 5.0.0
 Last-Attempt-Date: Wed, 24 Jun 2009 18:55:02 GMT





[BackupPC-users] email for completed backup

2009-06-24 Thread Nick Smith
I would like to know if BackupPC has the ability to send an email not
only when a backup hasn't completed in a set amount of time, but also
when a backup is completed.
If not, is it possible to submit a feature request?

I would really like to be able to email clients that their computers
were successfully backed up each night.

I have yet to find a good way to go about doing this. I've tried the
pre/post commands, but it seems that they run even when a backup has
failed.

Is there a good way to go about this?  What are others doing to accomplish this?

Thanks for the help.



Re: [BackupPC-users] Putting the pool onto an external drive, such as a USB drive

2009-06-24 Thread Nick Smith
 but switched to sata in a trayless hot-swap carrier
 when I went to larger drives.

Do you have any links to the hardware you used? I too am looking for
such a solution.

Thanks.



Re: [BackupPC-users] problems with linux host - Unable to read 4 bytes

2009-06-24 Thread Nick Smith
 

 It should not change anything, since the BackupPC dæmon is launched by root.
  However, most installations intentionally do not create a password for the
 backuppc user for security reasons.  From root, running su -s /bin/bash
 backuppc should switch to that user without having to change anything.
 Regarding your SSH key problem, what is the output when you run the
 following?
 r...@host# sudo -u backuppc ssh r...@target
 Does it ask anything?  If it fails, what is the message it returns?


From the BackupPC server to client1, it logs in without a password.

From the BackupPC server to client2, it asks for a password.

From client1 to the BackupPC server, it asks for a password.

From client2 to the BackupPC server, I get this error:
# sudo -u backuppc ssh -p 22252 r...@backup-server
sudo: no passwd entry for backuppc!

Which is why I tried to add the backuppc user to the client machine,
to see if that was the problem.

If I understand correctly, the client is acting as the SSH server?
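For what it's worth, the `no passwd entry for backuppc` error only means the account doesn't exist on the machine where `sudo` is being run; `sudo -u` looks the user up locally before SSH is even involved. A small check along these lines (the `getent` lookup assumes a Linux box) shows which side is missing the account:

```shell
#!/bin/sh
# Check whether an account exists locally before invoking sudo -u with it.
# "sudo: no passwd entry for backuppc" means this lookup would fail here.
has_user() {
  getent passwd "$1" >/dev/null 2>&1
}

if has_user backuppc; then
  echo "backuppc account exists on this machine"
else
  echo "backuppc account missing - sudo -u backuppc will fail here"
fi
```

For BackupPC's rsync-over-ssh, the backuppc account normally only needs to exist on the server side; on the client, only the root login target matters.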



Re: [BackupPC-users] email for completed backup

2009-06-24 Thread Nick Smith
OK, I admit it's mainly for me. I'd like to check my email in the morning
and see that the machines backed up, and not have to bother with
logging into the web interface. And I do have some customers
requesting an email letting them know everything is OK, rather
than waiting for days and getting an email that their system hasn't
been backed up in X days.
Some people want to know; others don't care as long as it's working.

On Wed, Jun 24, 2009 at 10:52 AM, Les Mikesell lesmikes...@gmail.com wrote:
 Nick Smith wrote:
 I would like to know if BackupPC has the ability to send an email not
 only when a backup hasn't completed in a set amount of time, but also
 when a backup is completed.
 If not, is it possible to submit a feature request?

 I would really like to be able to email clients that their computers
 were successfully backed up each night.

 I have yet to find a good way to go about doing this. I've tried the
 pre/post commands, but it seems that they run even when a backup has
 failed.

 Is there a good way to go about this?  What are others doing to accomplish
 this?

 Basically, if you send email every day you'll train the users to ignore
 it and they won't read it when they get the one that says that backups
 have failed for three days in a row either.

 You could just email a link to the host summary page if you think they
 need it.

 --
   Les Mikesell
    lesmikes...@gmail.com






Re: [BackupPC-users] email for completed backup

2009-06-24 Thread Nick Smith
Thanks for the response. It looks pretty good, but I have one problem
that maybe you can help me with.
I already use the pre/post commands to launch scripts that do volume
shadow copies on my Windows servers for backup. Every time I try to
have 2 pre or post commands, the latter doesn't work. I've tried the
normal Linux way of  or , but that doesn't work; commas don't work either.

Does anyone know how to get 2 post commands to run in order?

When I first started looking at getting the emails working, I tried
the post command and never could get it to work with 2 commands. And
like I said before, the post command will run whether the backup
actually completed successfully or failed. Your script looks like it
will filter well for me; I just have to figure out how to get it to
run on a post command when there already is one, OR try to integrate
it into my existing setup.

Thanks again for the post.
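DumpPostUserCmd takes exactly one command, so the usual workaround is to point it at a small wrapper script that runs both jobs in sequence. A sketch under stated assumptions: the wrapper path and the two script names are placeholders, and arguments are simple space-free strings:

```shell
#!/bin/sh
# /etc/BackupPC/post_backup.sh -- hypothetical wrapper so DumpPostUserCmd
# can trigger two actions: run each argument string as a command, in order,
# stopping with a nonzero exit as soon as one fails. Commands are split on
# whitespace, so this sketch only handles space-free arguments.
run_in_order() {
  for cmd in "$@"; do
    $cmd || { echo "post command failed: $cmd" >&2; return 1; }
  done
}

# Real usage would look something like (placeholder paths):
#   run_in_order "/etc/BackupPC/stop_shadow.sh $1 $3" \
#                "/etc/BackupPC/backupmail.sh $1 $2 $3 $4"
run_in_order "echo shadow-cleanup" "echo backup-mail"
# prints shadow-cleanup, then backup-mail
```

Then set `DumpPostUserCmd` to the single wrapper, e.g. `/etc/BackupPC/post_backup.sh $user $xferOK $host $type`, and let the wrapper fan out.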

On Wed, Jun 24, 2009 at 11:51 AM, Brian Erickson
brian.erick...@apigroupinc.us wrote:
 Nick,

 Here are the steps that I use on my BackupPC servers to do just what you
 are looking for.

 Under the main server config (Server | Edit Config) and the Backup Settings
 tab:
 DumpPostUserCmd: /etc/BackupPC/backupmail.sh $user $xferOK $host $type
 RestorePostUserCmd: /etc/BackupPC/restoremail.sh $user $xferOK $host
 $fileList

 The two .sh files are attached as a ZIP file.

 NOW: the e-mails will contain a few lines from the log files, but I haven't
 figured out how to make it send the most RECENT log (the ones included are
 the LAST logs)

 So in my e-mail, I have it filter messages containing the words
 "Successful" and "Backup" into a folder for later review.  That way any
 failures show up in my inbox for attention.

 Also: be sure to create the success and failure directories under
 /etc/BackupPC (or comment out those lines in the .sh file)

 Good luck!

 Brian Erickson
 Information Systems Manager
 APi Group, Inc.
 Direct: 651-604-2761
 Main: 651-636-4320
 Fax: 651-604-2781
 Cell: 651-270-3507
 www.apigroupinc.com
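For reference, the `$xferOK` flag passed by DumpPostUserCmd is what lets such a script tell success from failure (it is "1" on a clean transfer). A minimal sketch of the branch a backupmail.sh-style script might use; the subject wording, log path, and `mail` delivery step are assumptions, not Brian's actual script:

```shell
#!/bin/sh
# Build a subject line from the DumpPostUserCmd arguments:
#   $1=user  $2=xferOK  $3=host  $4=type  (order per the config above)
subject_for() {
  xferOK=$2 host=$3 type=$4
  if [ "$xferOK" = "1" ]; then
    echo "Backup Successful: $host ($type backup)"
  else
    echo "Backup FAILED: $host ($type backup)"
  fi
}

# Delivery would be something like (hypothetical log path):
#   mail -s "$(subject_for "$@")" "$1" < /var/log/BackupPC/LOG
subject_for admin 1 fileserver full
# prints: Backup Successful: fileserver (full backup)
```

A fixed "Successful"/"FAILED" prefix like this is also what makes the mailbox filtering described above reliable.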




 On Wed, Jun 24, 2009 at 10:32 AM, Nick Smith nick.smit...@gmail.com wrote:

 OK, I admit it's mainly for me. I'd like to check my email in the morning
 and see that the machines backed up, and not have to bother with
 logging into the web interface. And I do have some customers
 requesting an email letting them know everything is OK, rather
 than waiting for days and getting an email that their system hasn't
 been backed up in X days.
 Some people want to know; others don't care as long as it's working.










Re: [BackupPC-users] problems with linux host - Unable to read 4 bytes

2009-06-23 Thread Nick Smith
 The root setup on the backuppc server side is irrelevant.  You may have
 set it up in addition to the correct backuppc setup.  You don't need it.

 The command that backuppc is running in the config is:
 $sshPath -q -x -p 22200 -l root $host $rsyncPath $argList+
 So I thought I had to have root access to the box.

 You need root access on the target side to be able to access all the
 files.  But, you want the keys for the backuppc user on the backuppc
 server side to give you that access, because that is how the command is run.


OK, new problem: I can't get the backuppc user to work with shared keys.
I've added the /var/lib/backuppc/.ssh/id_rsa.pub contents to the other
Linux box's authorized_keys and authorized_keys2 files, and it always
prompts for a password. The password doesn't work for the backuppc
user. I tried logging into the backup server as the user backuppc, and
it won't let me either. I'm trying the password I use to log into the
web interface. I don't recall setting up a backuppc user or its
password, so I'm assuming that is the problem, but I figured it would
be the same as the web interface.

What do I do at this point? Reset the password for the backuppc user
to something that I know? I'm assuming this will break something with
the backups if BackupPC set up the account on installation, set the
password to something, and is using that password when it does its
thing, right? Is there a better way to go about this? I don't want to
screw up this box.

There is no /home/backuppc directory; I think it's in
/var/lib/backuppc, at least that's where the .ssh folder got created
when I ran ssh-keygen. Does the backuppc user even have rights to log
in to the box?

Starting to get confused.

Thanks for the help.
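The backuppc account is typically created with no usable password at all, so the route is to become it from root rather than log in as it, and then make sure its public key lands in the client's authorized_keys exactly once. A sketch; the /var/lib/backuppc home and the locked-account behaviour are assumptions matching the Debian/Ubuntu-style packaging described in this thread:

```shell
#!/bin/sh
# From root on the BackupPC server: become backuppc without any password,
# bypassing its intentionally locked password and login shell:
#   su -s /bin/bash backuppc
#   ssh-keygen -t rsa -N '' -f /var/lib/backuppc/.ssh/id_rsa

# Helper to append a public key to an authorized_keys file exactly once,
# so repeated setup attempts don't stack duplicate lines:
append_key() {  # append_key <pubkey-file> <authorized_keys-file>
  grep -qxF "$(cat "$1")" "$2" 2>/dev/null || cat "$1" >> "$2"
}
```

On the client, run `append_key id_rsa.pub /root/.ssh/authorized_keys` with the server's backuppc public key; after that, `sudo -u backuppc ssh -l root client whoami` from the server should print `root` without prompting, and no password ever needs to be set on the backuppc account.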



Re: [BackupPC-users] problems with linux host - Unable to read 4 bytes

2009-06-23 Thread Nick Smith
On Tue, Jun 23, 2009 at 2:46 PM, Matthias Meyer matthias.me...@gmx.li wrote:
 Nick Smith wrote:

 OK, new problem: I can't get the backuppc user to work with shared keys.
 I've added the /var/lib/backuppc/.ssh/id_rsa.pub contents to the other
 Linux box's
 in /root/.ssh I assume.

Yes, into authorized_keys and authorized_keys2, whichever one it decides to use.


 The password for user backuppc in the web interface can differ from the
 password for the Linux user.
 Apache compares the password against the htpasswd or htdigest file.
 Linux, on the other hand, compares it against /etc/passwd.


If I don't know the backuppc user password, is it safe to change it to
something else?
Or will this screw up some other internal workings of BackupPC?

Thanks for the help.



Re: [BackupPC-users] problems with linux host - Unable to read 4 bytes

2009-06-23 Thread Nick Smith
Does anyone know if I can change the password of the backuppc user in Linux
and not have any adverse effects on the BackupPC system?





[BackupPC-users] problems with linux host - Unable to read 4 bytes

2009-06-22 Thread Nick Smith
I'm in the process of adding some Linux clients to the BackupPC server.

I've successfully set up one server on the backup server.

The second one is giving me problems.

I set it up the same way; I can ssh from one box to the other as root
with no password. The shared keys seem to be working correctly.

I've looked at the other box that works, and the settings are identical.

I've upgraded all the boxes via yum and apt, as per a suggestion I
found on Google. That didn't do anything for my problem, but now I'm
running the latest version of OpenSSL. After upgrading, I did recreate
the shared SSH keys and put the new keys into the authorized_keys2
file on both machines.

The error message, as I'm sure you all know:

full backup started for directory /
Running: /usr/bin/ssh -q -x -p 22200 -l root webserver /usr/bin/rsync
--server --sender --numeric-ids --perms --owner --group -D --links
--hard-links --times --block-size=2048 --recursive --ignore-times . /
Xfer PIDs are now 16974
Rsync command pid is 16974
Read EOF: Connection reset by peer
Tried again: got 0 bytes
Done: 0 files, 0 bytes
Got fatal error during xfer (Unable to read 4 bytes)
Backup aborted (Unable to read 4 bytes)
Not saving this as a partial backup since it has fewer files than the
prior one (got 0 and 0 files versus 0)

Communication working both directions.
r...@backup-server:/root/.ssh# ssh -p 22200 -l root webserver whoami
root

r...@server:~# ssh -p 22252 -l root backup-server whoami
root

The strange thing is that the first server I set up, which runs
CentOS, works fine. The one that has been giving me fits all day is
running Ubuntu, just like the BackupPC box.

I've been working on this for the last 4 hours, and everything on
Google seems to point to bad SSH keys, which seem to be working, just
not from BackupPC.

Thanks for any help you can give.
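One thing worth checking when the manual root-to-root test works but BackupPC doesn't: BackupPC launches that ssh command as the backuppc user, whose keys and known_hosts are separate from root's, and the remote $rsyncPath must actually exist. A debugging sketch; the host, port, and paths are the ones from this thread, so adjust to taste:

```shell
#!/bin/sh
# 1) Reproduce BackupPC's transfer command under the identity it really uses.
#    A hang, password prompt, or host-key question here (but not as root)
#    points at the backuppc user's SSH setup rather than at rsync:
#   sudo -u backuppc /usr/bin/ssh -q -x -p 22200 -l root webserver \
#       /usr/bin/rsync --version

# 2) Confirm a configured binary path is present and executable; "Unable
#    to read 4 bytes" is also what tends to appear when the remote
#    $rsyncPath doesn't exist. (Demonstrated locally on /bin/sh.)
path_ok() {
  [ -x "$1" ]
}
path_ok /bin/sh && echo "path ok"
```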



Re: [BackupPC-users] how does backuppc handle DBs

2009-02-07 Thread Nick Smith
OK, I use volume shadow copies, so I don't need to dump.
What does it do for the backup after you have dumped the DB? Does it
only take what has changed since the last dump, or does it take the
entire dump?

On Fri, Feb 6, 2009 at 10:26 PM,  m...@ehle.homelinux.org wrote:
 I make my mysql servers dump their DB's to a file before running BackupPC.





Re: [BackupPC-users] how does backuppc handle DBs

2009-02-07 Thread Nick Smith
Well, please correct me if my thinking is not correct.

I'm using volume shadow copies, which are supposed to be a way to back
up in-use files without unmounting or corrupting them.

The shadow copies and backups are done after hours, at a time when no
one is logged on and making changes to the database.

So from what I've read and understand, this is a safe way to back up
SQL and Exchange.

I could edit the pre/post scripts that launch the volume shadow copy
to dismount the SQL and Exchange DBs, if you guys think that would be
a safer way. It just adds a bit of complexity to a relatively easy
script, and I am no programmer; I thought I was lucky to get this far.

I do understand that an untested backup is no backup at all; I just
need to figure out a good time to bring down a doctor's office that's
two hours away to do the testing.

I am doing only full backups with rsyncd over a site-to-site VPN
connection. I think what you are saying is that rsync will do a
compare on the original file and only transfer what's changed,
correct? Someone else on here told me that repeated full backups do
pretty much the same as an incremental, with a little more CPU
overhead and slightly increased disk usage; does that still stand as
correct?

If it is actually only transmitting the changes in the DB file, then I
could see how the backup was only taking 40 minutes. I just don't
understand why it tells me I'm backing up 10 GB, unless it's just
telling me the size of the file is 10 GB and that it's backed up.

Thanks for the posts; any other input would be much appreciated.

On Sat, Feb 7, 2009 at 10:53 AM, David Morton morto...@dgrmm.net wrote:

 Nick Smith wrote:
 ok, i use volume shadow so i dont need to dump.
 what does it do for the backup after you have dumped the DB?

 That doesn't necessarily do the job - the database may be in the middle
 of writing a change and the shadow copy could pick it up in an
 inconsistent state.

 To back up a database, you have to read the documentation for that
 database. For example, a google search for mysql backup gets this for
 a first hit:

 http://dev.mysql.com/doc/refman/5.1/en/backup.html

 YMMV for other databases.

 At the very least, you may need to issue commands to flush tables and do
 a read lock while you start the shadow snapshot, and then release the locks.

 OTOH, doing a dump (such as mysqldump for MySQL) usually means the dump
 itself is in a consistent state, and backing up that file means you have
 a consistent backup.  While your BackupPC backup may pick that up as well
 as the binary database files, you don't know for sure whether the database
 files are consistent.

 The other good thing about a dump is that it can be loaded on newer
 versions of the database in case of a rebuild - I've heard horror
 stories of when someone tried to restore a binary database file after a
 crash, only to have it fail because the database versions didn't match.

 Ultimately, the question you want answered is, "Can I restore the
 database files and have them work?"  Well, the answer to that is, of
 course, "Test it!"  An untested backup is no backup at all.

 As for the amount of data transferred: if you are doing a full backup, it
 should transmit the entire file, but if you are using rsync and doing an
 incremental, it should only transmit what has changed within a file.
 tar and smb have to transmit a changed file in its entirety, I believe.
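Wired into BackupPC, the dump described above usually hangs off DumpPreUserCmd, so a consistent SQL file exists before the file-level backup starts. A sketch under stated assumptions: MySQL with InnoDB (hence `--single-transaction`), hypothetical credentials and paths, and a dated filename so a stale dump is easy to spot:

```shell
#!/bin/sh
# Hypothetical DumpPreUserCmd target on the client, e.g.:
#   DumpPreUserCmd: $sshPath -l root $host /usr/local/bin/predump.sh
dump_name() {           # dated file name, e.g. nightly-20090207.sql
  echo "nightly-$(date +%Y%m%d).sql"
}

# The actual dump (commented out here; credentials/paths are placeholders):
#   mysqldump --single-transaction --all-databases \
#     -u backupuser -pSECRET > "/var/backups/$(dump_name)"
dump_name
```

With the dump landing under a directory BackupPC already backs up, the consistency question reduces to whether mysqldump itself succeeded, which the script's exit status can report back to BackupPC.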




Re: [BackupPC-users] how does backuppc handle DBs

2009-02-06 Thread Nick Smith
On Fri, Feb 6, 2009 at 9:09 PM, Nick Smith nick.smit...@gmail.com wrote:
 I have a couple of clients that back up nightly over the internet.
 I'm doubting whether the backups are actually completing successfully.
 BackupPC tells me they are, but the reason I'm questioning it is that
 it is supposed to back up 10 GB a night (their SQL DB changes daily),
 and the backup completes in 40 minutes over a connection with a
 maximum of 600 kb/s.  The backup results do say it's getting about 50%
 compression and the backup is only 5 GB when it completes.

 How does BackupPC handle databases?  Does it transfer just what has
 changed, or if the DB changes does it have to back up the entire thing
 again?

 I've gotten 14 complete full backups (one a day), and they all have
 the same results: completed successfully, about 10 GB before
 compression and pooling.

 Is there a way I can make sure that things are working as planned -
 I mean, without doing a restore and testing what it's backing up?
 I just don't want to think I'm getting good backups when I'm really not.

 I'm using scripts that use Volume Shadow Copy to get backups of their
 Windows servers after hours.

 It just seems like the backups are going way too fast for the speed of
 the connection and the size of the data.

 Thanks for any light you can shed on the subject.

To add to my own post: these backups go through a pfSense firewall on
both sides, and according to the firewall stats for the last 4 hours
(when the 10 GB backup was supposedly performed), only about 50 MB was
transmitted on the WAN.  So something isn't working correctly here.



[BackupPC-users] how does backuppc handle DBs

2009-02-06 Thread Nick Smith
I have a couple of clients that back up nightly over the internet.
I'm doubting whether the backups are actually completing successfully.
BackupPC tells me they are, but the reason I'm questioning it is that
it is supposed to back up 10 GB a night (their SQL DB changes daily),
and the backup completes in 40 minutes over a connection with a
maximum of 600 kb/s.  The backup results do say it's getting about 50%
compression and the backup is only 5 GB when it completes.

How does BackupPC handle databases?  Does it transfer just what has
changed, or if the DB changes does it have to back up the entire thing
again?

I've gotten 14 complete full backups (one a day), and they all have
the same results: completed successfully, about 10 GB before
compression and pooling.

Is there a way I can make sure that things are working as planned -
I mean, without doing a restore and testing what it's backing up?
I just don't want to think I'm getting good backups when I'm really not.

I'm using scripts that use Volume Shadow Copy to get backups of their
Windows servers after hours.

It just seems like the backups are going way too fast for the speed of
the connection and the size of the data.

Thanks for any light you can shed on the subject.



[BackupPC-users] multiple pre/post commands

2009-02-02 Thread Nick Smith
Is it possible to use more than one pre/post command per host?
Would they need to be separated by a semicolon, a comma, or similar?

Thanks for any help; I couldn't find the info in the docs or via Google.



Re: [BackupPC-users] multiple pre/post commands

2009-02-02 Thread Nick Smith
What would the script look like?

The two commands are:
/etc/backuppc/preusercmd-host1.sh $host
/etc/backuppc/startbkupemail.sh $user $xferOK $host $type $cmdType

The latter, from what I've seen playing with it, has to be run from
the pre-dump command, or the variables BackupPC defines don't get
picked up for some reason.

Any ideas?

On Mon, Feb 2, 2009 at 7:16 PM, Rob Owens row...@ptd.net wrote:
 On Mon, Feb 02, 2009 at 06:47:35PM -0500, Nick Smith wrote:
 Is it possible to use more than one pre/post command per host?
 Would they need to be separated by a semicolon, a comma, or similar?

 Thanks for any help; I couldn't find the info in the docs or via Google.

 I'm not sure, but couldn't you just make a script that runs all your 
 commands, and call that script as a single pre/post command?

 -Rob
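A wrapper along the lines Rob suggests might look like the sketch below.
The echo lines are stand-ins for the two real scripts from this thread,
so the sketch is self-contained and runnable; the config.pl line is an
assumption about how you would wire it up, not a tested configuration.

```shell
# Sketch of a single wrapper script: BackupPC runs ONE command, so chain the
# steps inside the script and pass BackupPC's substitution variables ($host,
# $user, ...) in as positional arguments from config.pl, e.g.:
#   $Conf{DumpPreUserCmd} = '/etc/backuppc/predump-wrapper.sh $host $user $xferOK $type $cmdType';
cat > /tmp/predump-wrapper.sh <<'EOF'
#!/bin/sh
host="$1"; user="$2"; xferOK="$3"; type="$4"; cmdType="$5"
# stand-ins for /etc/backuppc/preusercmd-host1.sh and startbkupemail.sh
echo "shadow-copy step for $host"
echo "notify $user: host=$host type=$type ok=$xferOK cmd=$cmdType"
EOF
chmod +x /tmp/predump-wrapper.sh
/tmp/predump-wrapper.sh host1 backuppc 1 full dump
```

Because BackupPC expands $host and friends on the one command line it
runs, passing them through as arguments gets them into both scripts.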





Re: [BackupPC-users] multiple pre/post commands

2009-02-02 Thread Nick Smith
I've tried putting ; between the two commands; I've tried && and & as
well.  With && in there it seems to run both commands, but the
variables aren't being pulled from BackupPC, so the email doesn't work
correctly - and the script that runs my vshadow commands doesn't seem
to be working correctly either, now that I've checked it.  I've also
tried hand-editing the config.pl file and putting quotes around each
command, and that doesn't work either, nor does using && and ;.

I'm kind of stumped here.
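For what it's worth, my understanding (worth verifying against the
BackupPC documentation for your version) is that BackupPC executes
DumpPreUserCmd directly rather than through a shell, which would
explain why ;, && and & separators never behave: they reach the first
program as literal arguments.  If that's right, the options look
roughly like this untested sketch:

```
# Explicitly hand the whole line to a shell (quoting gets fragile quickly):
$Conf{DumpPreUserCmd} = '/bin/sh -c "/etc/backuppc/preusercmd-host1.sh $host; /etc/backuppc/startbkupemail.sh $user $xferOK $host $type $cmdType"';

# Or keep config.pl simple and chain the commands inside one wrapper script:
$Conf{DumpPreUserCmd} = '/etc/backuppc/predump-wrapper.sh $host $user $xferOK $host $type $cmdType';
```

The wrapper-script route is the less error-prone of the two.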

On Mon, Feb 2, 2009 at 7:53 PM, Rob Owens row...@ptd.net wrote:
 In bash, you can make your script accept arguments.  See this link:  
 http://www.ibm.com/developerworks/library/l-bash2.html

 I'm not sure if this works or not for the backuppc user, because his shell is 
 sh.  Maybe someone else can advise.

 -Rob

 On Mon, Feb 02, 2009 at 07:33:07PM -0500, Nick Smith wrote:
 What would the script look like?

 The two commands are:
 /etc/backuppc/preusercmd-host1.sh $host
 /etc/backuppc/startbkupemail.sh $user $xferOK $host $type $cmdType

 the latter one has from what ive played with it, has to be run from
 the predump command or the variables backuppc defines dont get picked
 up for some reason.

 any ideas?

 On Mon, Feb 2, 2009 at 7:16 PM, Rob Owens row...@ptd.net wrote:
  On Mon, Feb 02, 2009 at 06:47:35PM -0500, Nick Smith wrote:
  Is it possible to use more than on pre/post command per host?
  Would they need to be separated by a semi-colon or comma or similar?
 
  Thanks for any help, i couldnt find the info in the docs or from google.
 
  I'm not sure, but couldn't you just make a script that runs all your 
  commands, and call that script as a single pre/post command?
 
  -Rob
 
 





[BackupPC-users] backup size confusion

2009-01-22 Thread Nick Smith
I recently changed backup servers, so I'm starting from scratch on the
pool with a new machine.  I've configured it just like the old one, and
I still have the old one around to compare against.

The problem I am having is that the backups on the new machine are a
LOT smaller than on the first.  The client has some large 7 GB+ SQL
databases that take forever to back up over the WAN, yet they finished
in one night (compared to almost a week on the old machine).

On the old server the host summary shows 13.21 GB; on the new server
the same full backup says it's only 4.52 GB.  So I went to the console
on each machine and did a du -sh on /var/lib/backuppc/pc/machine: on
the old server the directory was 9.5 GB, and on the new one it's 5.2 GB.

Then I checked the size of the DB on the client machine itself: 5.6 GB
for one DB.  When I browse the backups in the web interface to where
the DB is (/mssql/data/db.mdf) and try to download/restore that file,
it saves a 768 KB file - not the 5.6 GB DB file.  The name is correct,
but the size is drastically off.

This has me really concerned that the backups aren't working the way
they are supposed to.  It's a new server, there are no files in the
pool, the backups are taking WAY less time than they should, and when I
try to restore a file the size is WAY off.

In the host summary under that backup client, the new files line says:
size 4427.8, comp/MB 777.0, comp 82.5%.  That's a heck of a compression
ratio.  I know that SQL DBs can compress down nicely, but comparing to
the old server it's not the same: the first full backup on the old
server was size 13119.9, comp/MB 6248.3, comp 52.4%, and nothing has
changed on the client machine.

So something isn't right - especially the fact that when I restore one
of their large DB files it's only 768 KB instead of the 5.6 GB it's
supposed to be.

Where can I start looking to see what the problem is?  I've checked the
logs on the server and client and there are no errors; the backup says
it completed successfully.  What else can I do or check?  This has
really got me worried.

Thanks for any help you can give.

--
This SF.net email is sponsored by:
SourceForge Community
SourceForge wants to tell your story.
http://p.sf.net/sfu/sf-spreadtheword
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] backup size confusion

2009-01-22 Thread Nick Smith
On Thu, Jan 22, 2009 at 9:12 AM, Les Mikesell lesmikes...@gmail.com wrote:
 Nick Smith wrote:

 in the host summary under that backup client, the new files says
 size 4427.8   comp/mb 777.0   comp 82.5%
 thats a heck of a compression ratio.  i know that sql dbs can compress
 down nicely, but im comparing to the old
 server and its not the same. the first full backup on the old server
 was size 13119.9  comp/mb 6248.3  comp 52.4%
 and nothing has changed on the client machine.

 so something isnt doing right.  and the fact that when i restore one
 of their large db files its only 768kb instead of the 5.6gb its
 suppose to be.

 Where can i start to look and see what the problem was? ive checked
 the logs on the server and client and there are no errors,
 the backup says it completed successfully.  what else can i do/check?
 this has really got me worried.

 Is this marked as a partial - perhaps it didn't really complete.  Look
 through the errorlog to see if the database file is mentioned.  Also, is
 this a live, changing database file or a dumped copy?  If it is live, a
 restored copy isn't likely to work anyway.

 --
   Les Mikesell
lesmikes...@gmail.com
It looks like it actually did fail, but the web interface said it
completed and was green:

MSSQL7/Data/sw_exch_Log.LDF got digests
c586c1b3df5ef4140a5b1c248765e05c vs c586c1b3df5ef4140a5b1c248765e05c
  create   644   400/401 1048576 MSSQL7/Data/sw_exch_Log.LDF
attribSet: dir=fthddb/MSSQL7/Data exists
attribSet(dir=fthddb/MSSQL7/Data, file=sw_exch_Log.LDF, size=1048576,
placeholder=)
Read EOF:
Tried again: got 0 bytes
Got done from child
Got stats: 769 1678 0 0 ('errorCnt' => 0, 'ExistFileSize' =>
93365724, 'ExistFileCnt' => 25, 'TotalFileCnt' => 31, 'ExistFileCompSize'
=> 16058575, 'TotalFileSize' => '4736263644')
finish: removing in-process file MSSQL7/Data/sw_image_Data.MDF
attribWrite(dir=fthddb/MSSQL7/Data) ->
/var/lib/backuppc/pc/thddb/new/fthddb/fMSSQL7/fData/attrib
attribWrite(dir=fthddb/MSSQL7) ->
/var/lib/backuppc/pc/thddb/new/fthddb/fMSSQL7/attrib
attribWrite(dir=fthddb) -> /var/lib/backuppc/pc/thddb/new/fthddb/attrib
attribWrite(dir=) -> /var/lib/backuppc/pc/thddb/new//attrib
Child is aborting
Got exit from child
Done: 31 files, 4736263644 bytes
Executing DumpPostUserCmd: /etc/backuppc/postusercmd-thddb.sh thddb
Rsync and shadow copy unloaded

I'm using volume shadow copies to back up the databases, so they are
not live (or in use).

I restarted another full backup to see if it completes this time.  I've
just never seen it say it completed when it really didn't.

The sw_image_Data.MDF referenced is the large DB file that had the much
smaller size - which was because the backup didn't complete.

Hopefully the next full will work.



Re: [BackupPC-users] Creating shadow copies in windows

2009-01-22 Thread Nick Smith
On Wed, Jan 14, 2009 at 9:09 AM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:
 Kevin Kimani wrote at about 11:21:44 +0300 on Wednesday, January 14, 2009:
   Hi all,
  
   Have been trying to use the script posted for copying open files in windows
   but have not been able to fully make it run. Could someone who has used the
   script help me and let me know how to make it work. I will really
   appreciate.
  
   Kind Regards
  
   Kevin
  
 I think I'm the author of the script but (unfortunately) your email
 doesn't provide any details. You say only that you have not been
 able to fully make it run and that you want to know how to make it
 work. I do not know how to answer such an open-ended  question.


I don't know about the scripts you are using, but I found this blog:

http://www.goodjobsucking.com/?p=62

and I've been able to successfully back up shadow copies using his
method.

Might be worth a look.



Re: [BackupPC-users] Excluding Temporary Internet Files to avoid logerrors

2009-01-22 Thread Nick Smith
I had that problem with a couple of client servers that I back up.  I
added an exclude in the web interface that skips them: for the key I
used *, and for BackupFilesExclude I added /Documents and
Settings/*/Local Settings/Temporary Internet Files.  I also added
another one for /Documents and Settings/*/Local Settings/Temp.

That seemed to take care of the problem.
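In config.pl terms, the settings described above would look roughly
like this (the '*' key applies the excludes to every share; paths are
relative to the share root):

```
$Conf{BackupFilesExclude} = {
    '*' => [
        '/Documents and Settings/*/Local Settings/Temporary Internet Files',
        '/Documents and Settings/*/Local Settings/Temp',
    ],
};
```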

On Fri, Jan 16, 2009 at 11:32 AM, Chris Baker cba...@intera.com wrote:

 The temporary Internet files are hardly that. They are permanent files and
 are simply a cache of every single web page you visit. They accumulate over
 time and are a real pain.

 It is a good idea to delete all these files on a regular basis in every
 profile on your PC. I usually go into the content.ie5 folder and delete all
 the folders in there. I leave the index file there.

 I once worked on a computer that had about 5 GB of Internet cache. This
 large storage of cache is also a hideout for spyware and other things that
 simply bog down your PC. After I deleted all this cache, the PC started to
 perform much better.


 Chris Baker -- cba...@intera.com
 systems administrator
 INTERA -- 512-425-2006

 -Original Message-
 From: Juergen Harms [mailto:juergen.ha...@unige.ch]
 Sent: Friday, January 16, 2009 10:13 AM
 To: General list for user discussion, questions and support
 Subject: [BackupPC-users] Excluding Temporary Internet Files to avoid
 logerrors

 I just included an additional share to be included into the backups on an XP
 machine (c/Documents and Settings/Juergen - my own directory on my wife's PC,
 backups are done by the user Juergen) and got flooded with messages in the
 error log of the type:

 Remote[1]: rsync: readlink_stat(Local Settings/Temporary Internet
 Files/Content.IE5/1F5UL...snip...SQWQ.jpg (in docs-juergen)) failed:
 File name too long (91)

 Checking some time later, I did not find these files. I assume that these
 files are temporarily created during the backup process.

 Question 1: is this right?

 I then did some googling and found a reference to the BackupPC wiki which
 gives valuable suggestions for exclude files:

 http://backuppc.wiki.sourceforge.net/Common_backup_excludes

 - but I had not found the corresponding item in the index of the wiki (left
 frame).

 Question 2: is this a problem of my browser (I had tried with firefox and
 with konqueror) or is there a problem in the wiki, for instance that it cuts
 off the lower part of the index in the left frame?

 I had planned to add exclude lists some time later - that is apparently an
 important and urgent issue.

 




Re: [BackupPC-users] backup size confusion

2009-01-22 Thread Nick Smith
On Thu, Jan 22, 2009 at 11:52 AM, Les Mikesell lesmikes...@gmail.com wrote:
 Nick Smith wrote:
 
 Im using volume shadow to back up the databases, so they are not live.
 (or in use)

 Is there some reason to think the database is in a consistent state when
 the volume shadow is made?


From what I know about VSS, it makes a copy of the live DB so the DB
can be copied safely and you don't get in-use errors.

Do you have other info on this that I need to know?



Re: [BackupPC-users] backup size confusion

2009-01-22 Thread Nick Smith
On Thu, Jan 22, 2009 at 12:53 PM, Nick Smith nick.smit...@gmail.com wrote:
 On Thu, Jan 22, 2009 at 11:52 AM, Les Mikesell lesmikes...@gmail.com wrote:
 Nick Smith wrote:
 
 Im using volume shadow to back up the databases, so they are not live.
 (or in use)

 Is there some reason to think the database is in a consistent state when
 the volume shadow is made?


 From what i know about VSS, the makes a copy of the live DB so it can be
 copied safely, so you dont get in use errors.

 Do you have other info on this that i need to know?


Also, the backups don't start until 7 PM, two hours after the office
closes, so no one is there working and changing the DB (if that's what
you mean by consistent).



[BackupPC-users] changing backup order

2009-01-21 Thread Nick Smith
Is there a way to set the order in which BackupPC backs up machines?

I need to make sure that two particular machines never start backups
at the same time.

Is this possible?

Thanks for the help.
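As far as I know BackupPC doesn't offer an explicit ordering knob, but
two settings can keep a pair of hosts from overlapping.  A sketch
(values are illustrative, not tested against this setup):

```
# Option 1: serialize everything - only one backup runs at a time.
$Conf{MaxBackups} = 1;

# Option 2: a per-host blackout window (e.g. in pc/host2.pl) so host2
# never starts during the hours host1 normally runs.
$Conf{BlackoutPeriods} = [
    { hourBegin => 19, hourEnd => 23, weekDays => [0, 1, 2, 3, 4, 5, 6] },
];
```

Option 1 is the blunt instrument; option 2 keeps parallelism for the
other hosts.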



[BackupPC-users] restore questions

2009-01-19 Thread Nick Smith
I use BackupPC to back up several Windows servers across the internet,
some of which are rather large and on slow internet connections.  With
frequent disconnects and other random errors it has taken almost a
month to make a full backup of one client, but it finally happened
(40 GB of data, Exchange, etc.).

My question: since it takes so long to make a full backup and the
incrementals go quickly, is it safe to do a full backup every 6 months
or so and incrementals every day?  How would this impact restoring
files?  If I'm doing incremental backups every day and they need to
restore their Exchange DB for some reason, would I still be able to get
it with only one full backup and the rest incrementals?  These are all
level-1 backups; from what I've read, a level-1 backup covers what has
changed since the last level-0 full backup.  I'm currently keeping
2 weeks of incrementals.  What if I wanted to do a full only every
12 months and just incrementals every day - am I still covered?

I just don't want to get into a situation where I think I have a good
backup, and then when they actually need to restore something I don't
have adequate data to restore.

Is there a better option to accomplish this?  I'm using rsyncd on the
Windows servers and have pre/post scripts that launch a volume shadow
copy of the drives to back up; then I use include/exclude settings in
BackupPC to tell it what to back up.  I've slimmed the backup down as
much as possible to only the important files.  They are in a rural
area, and their internet speed is as fast as they can get it.

Thanks for any help and/or advice.
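For reference, the schedule described above maps onto config.pl roughly
like this (numbers are illustrative; the periods are in days, set
slightly under the target so runs don't slip later each cycle):

```
$Conf{FullPeriod}  = 179.97;  # a full roughly every 6 months
$Conf{IncrPeriod}  = 0.97;    # incrementals daily
$Conf{FullKeepCnt} = 1;       # keep one full
$Conf{IncrKeepCnt} = 14;      # keep two weeks of incrementals
```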



Re: [BackupPC-users] restore questions

2009-01-19 Thread Nick Smith
So is the way I'm doing it now going to work if I need to do a restore
down the road?  Would doing full backups be better than incrementals?

From what you're saying, it sounds like the full backups are effectively
doing incremental transfers - or is it because most of the files are
already in the pool, so it just creates a hardlink instead?  If I tell
it to keep only one full backup and do a full only every 6 months, once
it does the next full does it change the hardlinks to point to the
latest full and remove the first, or does it keep the initial full just
to keep the hardlinks in working order?

Thanks

On Mon, Jan 19, 2009 at 12:42 PM, Jon Craig cannedspam.c...@gmail.com wrote:
 I think you will find that subsequent full backups take just a little
 longer than the incrementals.  New full backups will only transfer
 changed files rather than making a complete new copy.  The difference
 is in how the client identifies files to back up.  With an
 incremental, files are first checked based on last modification date
 and then based on file signature.  This avoids a lot of signature work
 but can lead to missed files due to unarchiving activities (e.g.
 extracting the contents of zip files) that create files with
 modification times earlier than the last full.  A full backup selects
 every file but then subjects it to signature checks to see whether the
 file has actually changed.  Only files that fail the signature match
 are actually transferred.  The signature is checked against the pool,
 so it will only fail when the file has never been backed up by this
 particular BackupPC instance.

 On 1/19/09, Nick Smith nick.smit...@gmail.com wrote:
 I use backuppc to backup several windows servers across the internet.
 Some of which are rather large and on slow internet connections.
 With frequent disconnects or other random errors it has taken almost a
 month to make a full backup of one client, but it finally happened
 (40gigs of data, exchange etc).

 My question is that,  since it takes so long to make a full backup and
 the incrementals go rather quickly, is it safe to do a full backup
 every
 6 months or so and do incrementals every day?  How would this impact
 restoring files?  If im doing incremental backups everyday and they
 need to restore their exchange DB for some reason, would i still be
 able to get it with only 1 full backup and the rest incrementals?
 these are
 all level 1 backups. From what ive read level one backups backup
 changed from the last level 0 full backup.  Im currently keeping 2
 weeks of
 incrementals.  What if i wanted to only get a full every 12 months and
 just use incrementals every day?  Am i still covered?

 I just dont want to get into a situation where i think i have a good
 backup and then they actually need to restore something and i dont
 have
 adequate data to restore.

 Is there a better option to accomplish this?  Im using rsyncd on the
 windows servers and have pre/post scripts that launch a volume shadow
 of the drives to backup, then i use exclude/include in backuppc to
 tell it what to backup.  Ive slimmed the backup down as much as
 possible
 to only get important files.  They are in a rural area and the
 internet speed is as fast as they can get it.

 Thanks for any help and/or advice.



 --
 Sent from my mobile device

 Jonathan Craig





-- 
Linux, because I'd rather own a free OS than steal one that's not
worth paying for.



Re: [BackupPC-users] restore questions

2009-01-19 Thread Nick Smith

 Other people have pretty much answered your questions already, but I'll
 pass on my experience with backuppc as well as an extra data point for you.

 I am using the older backuppc 2.1.2pl1 from Debian stable in a few places.

 Originally I was going to do a single full and then forever
 incremental backups (since we didn't want to delete any backups ever
 anyway).  However, I found the backup took longer and longer each
 time, because as new files were added, the incremental backup was
 transferring all the data of every new or changed file compared to the
 original full.  Of course, the backup server only stored a single copy
 of each file, even though it was transferred multiple times.

 This was when I changed to doing a full every 3 days to reduce the
 bandwidth requirement, even though it increased the CPU, disk I/O, and
 time to complete the backup.

 If you want to minimise bandwidth, every backup would be a full (if
 using rsync/rsyncd).

 BTW, from what I know, in newer versions each incremental is compared
 to the most recent backup of any lower level.  So you could do a level
 0, level 1, level 2, level 3 ... level 9, level 0, level 1, etc.

 That way, you only transfer a small amount of data each day, but still
 get a quick backup time with the benefits of the incremental.

 You might also want to consider that you don't really lose anything by
 having lots of levels; you still only restore the latest version of
 all files.  It isn't like restoring a level 0 tape, then a level 1
 tape, etc.; you simply restore what you want and BackupPC
 automatically restores the latest version of each file.

 Regards,
 Adam
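
A sketch of that rotation in config.pl terms (parameter names from BackupPC 3.x; a full is level 0, and $Conf{IncrLevels} drives the rest; check your version's documentation):

```perl
$Conf{FullPeriod} = 9.97;                         # a level-0 full roughly every 10 days
$Conf{IncrPeriod} = 0.97;                         # an incremental roughly every day
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6, 7, 8, 9];  # each day's incr diffs against the day before
```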

Thanks for taking the time to respond; I did learn something from your
post.  I didn't know that, no matter how many levels you have, BackupPC
would automatically restore the latest version.  I was under the
assumption that I would have to do it like tape, starting with level 0
and working forward, which is why I was doing only level 0 and 1
backups, to try and prevent that.

What do you think would be the harm in just doing full backups,
instead of a full and then forever incrementals?  If a full backup
really only transfers files changed since the last full, I really
wouldn't need to do incrementals at all, and from what I understand I
would actually get shorter backups and use less bandwidth with full
backups, at the cost of more CPU/memory overhead.  It's a dual Xeon
3.2 GHz machine with 2 GB of RAM, so I think that will be just fine;
maybe in the future I'll add more RAM.

Thanks again for your post.



Re: [BackupPC-users] Another (different) timeout question for rsyncd

2008-12-09 Thread Nick Smith
On Tue, Dec 9, 2008 at 11:01 AM, Jeffrey J. Kosowsky
[EMAIL PROTECTED] wrote:
 Jeffrey J. Kosowsky wrote at about 15:50:45 -0500 on Sunday, December 7, 2008:
   Some of my WinXP backups occasionally fail with the generic LOG error
   message:
12:00:30 Started incr backup on mymachine (pid=4983, share=c)
13:03:54 Backup failed on mymachine (Child exited prematurely)
   and the more helpful XferLOG.bad.z message:
...
...
Sent exclude: /WINDOWS/system32/config/systempprofile/NTUSER.DAT
Sent exclude: /WINDOWS/system32/config/systempprofile/NTUSER.DAT.LOG
Xfer PIDs are now 10851
Remote[1]: rsync error: timeout in data send/receive (code 30) at
/home/lapo/packaging/rsync-3.0.4-1/src/rsync-3.0.4/io.c(237) 
 [sender=3.0.4]
Read EOF:
Tried again: got 0 bytes
Child is aborting
Parent read EOF from child: fatal error!
Done: 0 files, 0 bytes
  
   From googling, it seems like the problem is due to an rsyncd
   timeout. This timeout seems to have occurred 63 minutes into the
   backup but before any (new) files were written.
  
   Currently I have timeout=600 set in my rsyncd.conf file.
   I could increase the timeout value, but I wanted to know what is the
   best way to do that?
   - Is it enough to just increase the timeout on the rsyncd server?
   - Is it better to set the timeout on the BackupPC server side?
   - Or would it be better to leave out timeout altogether from
 rsyncd.conf and have it set elsewhere or by default?
   - Is there a way to handle timeouts more dynamically rather than just
 picking a fixed number based on worst case or something?
   - What values are other people using?
  

 Removing the timeout line from rsyncd.conf seemed to solve my problem.


What's going to happen if you get a stalled backup?  Will it just sit
there forever, holding up your other backups?
I too get those errors, but my timeout is about 6 days (worst case),
and the error happens way before the timeout is hit.
Do you think your timeout was just too small a window?
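
For anyone comparing notes, there are two separate timeout knobs in play here (a sketch; the values are examples, not recommendations):

```perl
# Client side, in rsyncd.conf: 0 (or omitting the line) disables rsyncd's idle timeout.
#   timeout = 0
#
# Server side, in BackupPC's config.pl: BackupPC's own per-backup timeout, in seconds:
$Conf{ClientTimeout} = 72000;   # 20 hours, the shipped default
```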



[BackupPC-users] scheduling question

2008-12-08 Thread Nick Smith
I have some clients that take three days or better to do a full
backup.  I would like to do one full backup a month and one
incremental backup every day, and keep one week's worth of
incrementals and only one full backup.  What happens when a full
backup is still running and an incremental is scheduled every day?
Will it wait until the full is done, or will it try to do an
incremental with a full backup still in progress?  I would assume the
incremental would fail if it doesn't have a full backup to work off
of, correct?  Is there a certain way I'm supposed to schedule the
backups to accomplish what I want to do?
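
The schedule described above might be sketched in config.pl roughly as follows (BackupPC 3.x parameter names; tune to taste):

```perl
$Conf{FullPeriod}  = 29.97;  # one full roughly every 30 days
$Conf{IncrPeriod}  = 0.97;   # one incremental per day
$Conf{FullKeepCnt} = 1;      # keep only the most recent full
$Conf{IncrKeepCnt} = 7;      # keep a week's worth of incrementals
```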
Also, I'm working with a machine with 2 GB of RAM and an 800 MHz P3
CPU.  The machines I'm backing up have pretty large SQL DBs, 7 to 12
GB; will this machine be able to handle doing all the compression with
rsync over SSH?  From some of the other posts I've read on this list,
you can run into problems if the machine runs out of RAM, even though
it has a swap partition.
Thanks for any input.



[BackupPC-users] removing all backups

2008-12-07 Thread Nick Smith
Is there a way to clear out all backups on the system?  I've been
testing with BackupPC and want to start from scratch.  I want to keep
the hosts and scripts I've set up, but clear out all backed-up files
and start over.  If I remove the pc and /var/lib/backuppc/cpool
directories, will they be recreated when a new backup is started?
What would be the easiest way to do this?  And I don't want to wait
for the nightly process to run.

Thanks for your help.
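
A minimal sketch of such a reset, exercised here against a throwaway directory so it can be tried safely.  On a real install, TOPDIR would be BackupPC's data directory (/var/lib/backuppc on Debian/Ubuntu), and the backuppc daemon should be stopped first:

```shell
# Stand-in for the real data directory; swap in /var/lib/backuppc for real use.
TOPDIR=$(mktemp -d)
mkdir -p "$TOPDIR/pool" "$TOPDIR/cpool" "$TOPDIR/pc/myhost/0"
touch "$TOPDIR/cpool/stale-pool-file" "$TOPDIR/pc/myhost/0/attrib"

# The reset itself: drop the pools and each host's numbered backup trees,
# leaving the host directories and any per-host files in place.
rm -rf "$TOPDIR/pool" "$TOPDIR/cpool"
rm -rf "$TOPDIR"/pc/*/[0-9]*
mkdir -p "$TOPDIR/pool" "$TOPDIR/cpool"   # recreate the now-empty pool dirs
```

Run it as the backuppc user so ownership stays correct; recreating the pool directories by hand avoids having to find out whether your BackupPC version will do it on its own.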




Re: [BackupPC-users] help with BackupFilesExclude

2008-12-05 Thread Nick Smith

 I declare the exclude list within the GUI of backuppc.

 Declare the RsyncShareName entries in the same manner as they are
 declared in the rsyncd.conf on your Windows client.
 Define BackupFilesExclude:
 NewKey = '*' if it should apply to all RsyncShareName entries, or the
 RsyncShareName to which it should apply.
 Within the key you can add the directories, relative to the
 RsyncShareName, in Linux syntax.
 e.g.:
 $Conf{RsyncShareName} = [
  'D',
  'C'
 ];
 $Conf{BackupFilesExclude} = {
  'C' => [
'/WINDOWS/Downloaded Program Files',
'/WINDOWS/Offline Web Pages',
'/WINDOWS/Temp',
'/WINDOWS*.log',
'/proc'
  ],
  '*' => [
'pagefile.sys',
'hiberfil.sys',
'/System Volume Information',
'/RECYCLER',
'/$Recycle.bin',
'/$RECYCLE.BIN',
'/proc',
'/Windows',
'Temporary*',
'/Program Files'
  ]
 };

 br
 Matthias
 --
 Don't panic

Can you add files this way, the same as you can if you create an
--exclude-from=filter.txt in RsyncArgs?  For instance, if you create
the file filter.txt with the lines:
+ *.jpg
+ *.JPG
+ */
- *
it would include the *.jpg files rather than exclude them (since
$Conf{BackupFilesOnly} has limitations), and then it would exclude
everything else (the final - *).
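
For reference, wiring such a filter file into BackupPC might look roughly like this (a sketch; the /etc/backuppc/filter.txt path is hypothetical, and the stock rsync arguments are elided):

```perl
$Conf{RsyncArgs} = [
    # ... BackupPC's standard rsync arguments here ...
    '--exclude-from=/etc/backuppc/filter.txt',
];
```

rsync itself accepts '+' and '-' prefixed rules in files read via --exclude-from, so includes and excludes can be mixed in one file; whether BackupPC's File::RsyncP transport honors every rsync filter feature is worth verifying against your version.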

So in your example, would it work if I put:
+ '/Program Files'
to include the Program Files directory rather than exclude it?

Thanks for your help



Re: [BackupPC-users] Child Exited Prematurely

2008-12-04 Thread Nick Smith
2008/11/18 James Sefton [EMAIL PROTECTED]:
 Hi,



 Please excuse me if I am using this wrong, in all my years in IT, it seems
 this is the first time I have used a mailing list for support.  (I'm usually
 pretty good at the whole RTFM thing)



Did you ever get this resolved?  I'm having the same problem: now all
of my backups are failing with the same errors you were getting.  I'm
using rsync 2.6.9, protocol version 29, and Ubuntu doesn't seem to
have a newer version available yet.  The clients are all on fiber or
fast cable internet that is reliable, with different firewalls at each
location.  I could never find any info on whether pfSense or m0n0wall
close inactive connections.
Thanks for any input.



Re: [BackupPC-users] Child Exited Prematurely

2008-12-04 Thread Nick Smith
On Thu, Nov 20, 2008 at 8:28 PM, Craig Barratt
[EMAIL PROTECTED] wrote:
 James writes:

 The problem we are seeing is that Backups are randomly failing.
 The log file on BackupPC showing something like this:

 This is most likely a TCP timeout or other network problem.

 Rsync added a TCP keep-alive option in protocol version 29
 (if I recall correctly), which is not currently supported by the
 File::RsyncP module that BackupPC uses.

 Craig


Is there a workaround for that?  If it really is a timeout issue,
what can be done to fix it?



Re: [BackupPC-users] Child Exited Prematurely

2008-12-04 Thread Nick Smith
On Thu, Dec 4, 2008 at 10:55 PM, Adam Goryachev
[EMAIL PROTECTED] wrote:
 -BEGIN PGP SIGNED MESSAGE-
 Hash: SHA1

 Rsync added a TCP keep-alive option in protocol version 29
 (if I recall correctly), which is not currently supported by the
 File::RsyncP module that BackupPC uses.

 Is there a work around for that?  If it really is a timeout issue,
 what can be done to fix it?

 AFAIK, ssh offers its own keepalive, which would work.

 Regards,
 Adam

 - --
 Adam Goryachev
 Website Managers
 www.websitemanagers.com.au

I have "TCPKeepAlive yes", which was set by default in my sshd_config
on the BackupPC server (Ubuntu); is this what you are talking about?
Since it was on by default, maybe there is another keep-alive setting
I'm missing?
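
TCPKeepAlive only enables kernel-level TCP keepalives, which by default fire only after roughly two hours of idle time.  The keepalive Adam presumably means is OpenSSH's application-level ServerAlive probing, configured on the ssh client side, i.e. on the BackupPC server.  A sketch (the options are standard OpenSSH; the RsyncClientCmd wiring is illustrative):

```perl
# In ~backuppc/.ssh/config (plain OpenSSH syntax, not Perl):
#   Host *
#       ServerAliveInterval 60   # probe after 60s of silence
#       ServerAliveCountMax 3    # drop the link after 3 unanswered probes
#
# Or pass it inline via config.pl:
$Conf{RsyncClientCmd} = '$sshPath -q -x -o ServerAliveInterval=60'
                      . ' -l root $host $rsyncPath $argList+';
```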



Re: [BackupPC-users] Backup PC and backup progress

2008-10-09 Thread Nick Smith
On Tue, Oct 7, 2008 at 12:54 PM, Nick Smith [EMAIL PROTECTED] wrote:
 One thing I use that I recently figured out is the lsof command on
 Linux; it lists all the files the system currently has open.  I run a
 command like lsof | grep /storage/backuppc, which lists all the
 currently open files, filtered by the path my backups go to.  From
 what I've found there is no progress meter of any kind and all I see
 is network activity, so I run that command and it shows me which file
 it is currently working on.  Large files like SQL databases can take
 forever to back up (over the internet, anyway).

 HTH

 On Tue, Oct 7, 2008 at 11:10 AM, Seann Clark [EMAIL PROTECTED] wrote:
 All,

I am new to the list and rather new to BackupPC. So far I have had
 good experience with the program, and like what I see so far with using
 it. The backups are easy, and I have in mind to tie it in with AMANDA
 to tape-backup the old archives, but I haven't gotten that far with that
 portion yet, since I am still working out the total size I need for my
 network. I am wondering if anyone has ideas on a physical storage
 medium to use. I am sure DVDs, CDs, and maybe Blu-ray would work
 (still expensive, but 25 GB a single-layer disc isn't bad for a backup).

Another question I have is: is there any good way to get a backup
 progress estimate, so I know what the status of the backup is? This would
 be handy for systems (like my laptop) that don't seem to back up, but
 where the backup stays marked as 'in progress' for days, even if the
 laptop is powered off. I haven't read anything about fail thresholding,
 and since I am using SMB for the laptops, I am sure I could get a file
 count, and then base a basic progress estimate off the file count.
 Has anyone tried this with the program?

I have scoured the wiki for tricks and tips like these, but alas,
 haven't found them yet. Any feedback would be appreciated.


There is also another program I found, BackupPC_zcat, which will let
you look through the XferLOG.z files under /path/to/storage/pc/, and
might let you know what's going on as well.  On Ubuntu it's at
/usr/share/backuppc/bin/BackupPC_zcat; regular zcat will not work on
these files.

HTH



[BackupPC-users] backing up large SQL DB's

2008-10-09 Thread Nick Smith
I am using Volume Shadow Copy to back up large (12 GB+) SQL DBs.
After the first full backup, when things are changed or added in the
DB, is it going to pull down the entire DB again, or will it just
download the changes (if that's possible)?

Thanks for the input.



Re: [BackupPC-users] backing up large SQL DB's

2008-10-09 Thread Nick Smith
Do you know how it does that?  I thought it just compared checksums
to see if files were different.  How can it download only the changes
to a large DB like that if the whole DB is contained in a single file?

Thanks

On Thu, Oct 9, 2008 at 3:38 PM, David Rees [EMAIL PROTECTED] wrote:
 On Thu, Oct 9, 2008 at 11:06 AM, Nick Smith [EMAIL PROTECTED] wrote:
 I am using Volume Shadow Copy to back up large (12 GB+) SQL DBs.
 After the first full backup, when things are changed or added in the
 DB, is it going to pull down the entire DB again, or will it just
 download the changes (if that's possible)?

 As long as the file names remain the same and you are using rsync,
 only the changes will be downloaded.

 -Dave








Re: [BackupPC-users] Backup PC and backup progress

2008-10-07 Thread Nick Smith
One thing I use that I recently figured out is the lsof command on
Linux; it lists all the files the system currently has open.  I run a
command like lsof | grep /storage/backuppc, which lists all the
currently open files, filtered by the path my backups go to.  From
what I've found there is no progress meter of any kind and all I see
is network activity, so I run that command and it shows me which file
it is currently working on.  Large files like SQL databases can take
forever to back up (over the internet, anyway).

HTH

On Tue, Oct 7, 2008 at 11:10 AM, Seann Clark [EMAIL PROTECTED] wrote:
 All,

I am new to the list and rather new to BackupPC. So far I have had
 good experience with the program, and like what I see so far with using
 it. The backups are easy, and I have in mind to tie it in with AMANDA
 to tape-backup the old archives, but I haven't gotten that far with that
 portion yet, since I am still working out the total size I need for my
 network. I am wondering if anyone has ideas on a physical storage
 medium to use. I am sure DVDs, CDs, and maybe Blu-ray would work
 (still expensive, but 25 GB a single-layer disc isn't bad for a backup).

Another question I have is: is there any good way to get a backup
 progress estimate, so I know what the status of the backup is? This would
 be handy for systems (like my laptop) that don't seem to back up, but
 where the backup stays marked as 'in progress' for days, even if the
 laptop is powered off. I haven't read anything about fail thresholding,
 and since I am using SMB for the laptops, I am sure I could get a file
 count, and then base a basic progress estimate off the file count.
 Has anyone tried this with the program?

I have scoured the wiki for tricks and tips like these, but alas,
 haven't found them yet. Any feedback would be appreciated.



 ~Seann








[BackupPC-users] backuppc questions

2008-10-07 Thread Nick Smith
I've been running BackupPC for a couple of weeks now with mixed
results, mainly network speed issues, which really aren't BackupPC's
fault, but I've come up with some questions along the way.

What happens when a backup hasn't finished by the blackout period of
the next day?
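
For context on that question, blackout windows are declared like this in config.pl (the shape of the shipped default; the values are examples):

```perl
$Conf{BlackoutPeriods} = [
    {
        hourBegin =>  7.0,              # no new backups started from 07:00...
        hourEnd   => 19.5,              # ...until 19:30
        weekDays  => [1, 2, 3, 4, 5],   # Monday through Friday
    },
];
```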

Why, when I start a full backup on a machine that runs overnight and
hasn't finished by morning, does it say I have one incremental backup
when it hasn't finished the first full backup yet?  This is the first
time the machine has been backed up; no prior full backup has
completed.  How does it get an incremental backup with no full backup?
It has nothing to compare to.

$Conf{BackupPCNightlyPeriod} = 1 is described as how many days it
takes to traverse the entire pool.  What does this mean: how many days
it takes to complete a backup, or is it for file verification
purposes?  I run scripts to create a shadow copy of an entire drive,
which I back up instead of the actual drive; would splitting the
traversal into days screw things up?  This one particular client has a
large 12 GB SQL DB that is taking forever to back up; I've had to bump
up the timeout period and start over, and I'm hoping it completes in
the next couple of days (backing up over the internet).

Does anyone here use Hamachi?  What is the free version's bandwidth
limit?  I seem to be getting only about 90 KB/s incoming, and we are
both on fast cable connections.  Does anyone know what the speed is on
the commercial version, and is it faster?  I've got a ticket submitted
to LogMeIn right now, but I'd like some info from people who use it
for backups.

I am backing up over the internet, and the big backups are taking a
lot longer than I expected.  Am I going about it the correct way?
What is the recommended way to back up large amounts of data over the
internet?  Currently I use this method: http://www.goodjobsucking.com/?p=62
It works great: you don't have to install rsync, it launches VSS to
make a shadow of the drive, backs it up, and cleans up when it's
complete.  It's just that the backups are getting close to hundreds of
gigabytes, so it's taking an extreme amount of time.  I'm assuming
that once I get the first full backup, the incrementals will go much
faster.  I'm hoping, anyway.

It doesn't seem to affect the client's server performance to have the
backup still going during the day, but at this rate it seems like it's
going to take over a week per client to back up, which seems a bit
long to me.  I'm only doing one backup at a time to try to make it go
faster; it just seems like there is a bottleneck somewhere.

Thanks for any advice you can give.





Re: [BackupPC-users] suffering from the backups have stopped syndrome

2008-10-07 Thread Nick Smith
I'm running into the same thing on Ubuntu.  From what I've read it
could be a timeout issue; I've set my timeout extremely high to see if
I can get it to work.  I hit this error while trying to back up a 12 GB
DB over the internet (actually the entire machine, which includes the
DB, so about 30 GB total).  I'm hoping it's just a timeout issue and
that once I get a full backup, the incrementals won't take nearly as
long.  The backup ran for a couple of days before I got that error,
though.

2008/10/6 Terri Kelley [EMAIL PROTECTED]:
 On Oct 4, 2008, at 6:57 PM, Terri Kelley wrote:

 On Oct 4, 2008, at 9:58 AM, Martin Leben wrote:

 Terri Kelley wrote:

 List,

 I have had a backup of a server running successfully for some time.
 It uses automysqlbackup. Looking at the directory where that is
 stored on the server to be backed up, that is still being executed by
 BackupPC. However, the backup itself just stays running in the host
 summary. Looking at the log file in BackupPC, I see the following:

 2008-10-01 20:00:03 full backup started for directory /home/backuppc/test (baseline backup #54)
 2008-10-02 11:11:55 Aborting backup up after signal INT
 2008-10-02 11:11:56 Got fatal error during xfer (fileListReceive failed)
 2008-10-02 20:00:03 full backup started for directory /home/backuppc/test (baseline backup #54)
 2008-10-03 16:00:04 Aborting backup up after signal ALRM
 2008-10-03 16:00:05 Got fatal error during xfer (fileListReceive failed)
 2008-10-03 20:00:03 full backup started for directory /home/backuppc/test (baseline backup #54)

 There have been no changes to this server so I don't understand why
 it would stop backing up. Anyone have a clue or pointers for things
 to look at?

 Terri Kelley
 Network Engineer

 Hi,

 On the backed up machine, are you really sure that someone hasn't
 changed or removed the private part of the key which backuppc uses
 when connecting?

 /Martin

 Martin,

 I just tried ssh from the backuppc server to the two servers being
 backed up as the backup user. Both worked without password etc.

 Terri

 Does no one have any further ideas on this? For further info, all
 are current versions of CentOS, on a gigabit ethernet network.
 Terri









[BackupPC-users] backing up remote servers over wan

2008-09-30 Thread Nick Smith
I've been playing with different options for using BackupPC to back
up Windows servers remotely over the internet.  I've found a very good
how-to on a blog that seems like a very nice solution, but I'm having
problems at every step of the way, and I'm hoping you can answer a
couple of questions for me.

First, can BackupPC back up across the internet?  Is the rsync port
the only port that needs to be open for the backup?  What do you do if
there are multiple servers at one location behind one IP address?
Port forwarding?
If you care to check out the method I'm trying to get working, it's at:
http://www.goodjobsucking.com/?p=62
It uses winexe to execute remote commands on a Windows machine to
initiate backups and shadow copies, and it looks really slick, if I
could only get it working.

Second question: has anyone used winexe with any of their backup
routines?  I'm having problems with it connecting across the internet.
I can get it working locally on the LAN, but I have yet to get it
working over the WAN, even with no firewalls involved.

The reason I liked this method is that you don't have to install
rsync as a service: you just drop some files in a folder, make a user
with the right permissions on that folder, and winexe executes rsync
remotely, does the volume shadow, and starts the backup.  It would be
wonderful, if I could actually get it working.

If there is a simpler solution that will work, I'm up for that too.
The scripts aren't too complicated; I just keep hitting road blocks at
every turn, the latest being that I can't get winexe to work over the
internet.  Has anyone else done this?

If I'm stuck with using a service, I guess I'll have to go that
route, but I'd like to have the least impact on the server and the
easiest deployment possible, as I plan on configuring this on many
remote servers.

Thanks for any help and advice you can give.

