Re: [BackupPC-users] rsync clients run out of memory

2009-11-20 Thread Andrew Libby

Hi Richard,


Richard Hansen wrote:
 Les Mikesell wrote:
 Richard Hansen wrote:
 Apparently not -- both of my clients already have rsync 3.0.5 installed, 
 yet rsync is causing them to run out of memory.  The clients have 4 to 6 
 million (largely redundant) files each.  It appears that I need the 
 incremental-recursion feature of protocol 30 to back up this many files.
 If they are grouped in several subdirectories, you could break the 
 backups into separate runs.  If they are all in one directory, even 
 protocol 30 probably won't help.
 
 The files are in several subdirectories.  I was contemplating breaking 
 up the run, but there's a complication:  The set of subdirectories 
 (underneath the only directory where it makes sense to split up the run) 
 changes over time.  It's a slow change, so maybe I can keep a sharp eye 
 out and manually adjust RsyncShareName as subdirectories are added and 
 removed (blech).

I've encountered a similar situation with a mail server.
It uses Maildir, which stores each email in its own file,
so there are lots of files: about 7 million on that host.

I bind mounted the user home directories under
/var/bind_mounts/a/[a_username] to distribute the files
across several top-level folders.  In reality we use
buckets like

0-9, a-c, d-h, etc.

Each of those /var/bind_mounts/[foldername] directories is
then set up as a volume to be backed up on the client, and
we've excluded /var/bind_mounts from the backup of the /
volume.
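
Roughly speaking, the per-host config ends up looking
something like this (a sketch; the bucket names here are
illustrative rather than our exact production values):

  # Hypothetical per-host config, e.g. conf/mailhost.pl
  $Conf{RsyncShareName} = [
      '/',
      '/var/bind_mounts/0-9',
      '/var/bind_mounts/a-c',
      '/var/bind_mounts/d-h',
  ];
  # Keep the bind-mount tree out of the / volume so the
  # files aren't transferred twice.
  $Conf{BackupFilesExclude} = {
      '/' => ['/var/bind_mounts'],
  };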

All of this is accomplished with a script that's invoked
before and after the rsync call in RsyncClientCmd and
RsyncClientRestoreCmd to build up and tear down the mounts.

We did it this way so it'd be hands off: we don't need to
worry about users being added or removed, because the bind
mounts are set up just before each backup is taken.
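
As a sketch, the client-side script looks something like the
following (assume it's installed at /usr/local/sbin/bind-mounts.sh;
the path and the bucket rules are hypothetical):

  #!/bin/sh
  # bind-mounts.sh -- hypothetical pre/post backup hook;
  # called with "start" before the backup, "stop" after.
  BASE=/var/bind_mounts

  case "$1" in
    start)
      for home in /home/*; do
        user=$(basename "$home")
        # Pick a bucket from the first character of the username.
        case "$user" in
          [0-9]*) bucket=0-9 ;;
          [a-c]*) bucket=a-c ;;
          [d-h]*) bucket=d-h ;;
          *)      bucket=other ;;
        esac
        mkdir -p "$BASE/$bucket/$user"
        mountpoint -q "$BASE/$bucket/$user" || \
          mount --bind "$home" "$BASE/$bucket/$user"
      done
      ;;
    stop)
      # Tear down everything mounted under $BASE.
      for dir in "$BASE"/*/*; do
        mountpoint -q "$dir" && umount "$dir"
      done
      ;;
  esac

The "start" call gets chained ahead of the rsync invocation
in RsyncClientCmd (likewise for RsyncClientRestoreCmd), and
the teardown can run afterwards, e.g. from DumpPostUserCmd;
the exact wiring will depend on your setup.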

Does this make sense?


 
 Using tar as the xfer method would avoid the issue with the tradeoff 
 that you use more bandwidth for full runs and don't reflect changes 
 quite as accurately in increments.

Ultimately, we decided against this for the reasons you
mention.

 
 I've switched to tar for now, and I'm hoping that it will prove to be an 
 adequate solution.
 
 Thanks for your help,
 Richard
 

Best,

Andy

-- 

===
xforty technologies
Andrew Libby
ali...@xforty.com
http://xforty.com
===




[BackupPC-users] change zimbra host name internal to zimbra

2009-07-14 Thread Andrew Libby





[BackupPC-users] [tangeis.com #3120] [Fwd: QuickBooks Data Services - Request ID 122212]

2009-07-14 Thread Andrew Libby


-------- Original Message --------
Subject:QuickBooks Data Services - Request ID 122212
Date:   Tue, 14 Jul 2009 13:21:31 -0400
From:   quickbooksdataservi...@isupport.intuit.com,
icu_sta...@intuit.com
To: ali...@xforty.com



*PLEASE NOTE:* This e-mail was sent from an
auto-notification system
that cannot accept incoming e-mail. Please do not reply to
this message.
To send a response, log in to your account at
https://dataservices.intuit.com/sdcxuser/asp/login.asp or
email icu_sta...@intuit.com.  Please reference your Request
ID number noted below.

Tuesday, July 14, 2009

Request ID 122212

Dear Jack,

Thank you for your request for service. We have set up an
account for
you on our secure Intuit Data Services Support Site. Our
Support Center
will allow you to:

* Securely attach your data file to your service request.
* Monitor the status of your request.
* Enter comments or additional information about your
request.
* Cancel or close your request.
    * Securely download the repaired file
      (https://dataservices.intuit.com/sdcxuser/asp/login.asp).



_Logging In to Intuit Data Services_

   1. Log in to your Intuit Data Services online account
      (https://dataservices.intuit.com/sdcxuser/asp/login.asp).

   2. Enter the following login information:

Username: ali...@xforty.com



Please note that both the user name and password are
case-sensitive.

To view your request click *My Requests* at the top of the
page. Click
view to the right of your request ID. *Fill out the form*,
then click
*update.*

To upload your file, go to the bottom of your request form,
click *Click here for Attachments*, and follow the on-screen
instructions.

To read instructions on using our Support Center click
*Instructions* at
the top of the page. You may want to print the instructions
for future
reference.

*Note: Any bookkeeping work done in your data file will have
to be manually reentered into the file that we return to
you.  It is not possible to merge files or move transactions
from one file to another.*

Thank you,

Intuit Data Services
*Request ID 122212*
2800 E. Commerce Center
Tucson, AZ 85706
Fax: (520) 844-6477
Monday through Friday, 6:00 A.M. to 4:00 P.M. Pacific time




[BackupPC-users] rename RT stuff to be xforty instead of Tangeis

2009-07-10 Thread Andrew Libby
o RT Email subject
o Server
o Customer
o Other stuff?



[BackupPC-users] https for zimbra

2009-07-07 Thread Andrew Libby

It's not up; we should get it up and running.




Re: [BackupPC-users] Backup failures, how to troubleshoot

2009-03-12 Thread Andrew Libby

Greetings,

Continuing down the path of diagnosing what's going on here,
I monitored the memory consumption of the backup at issue.
It was hovering around 2.6G of RAM.  About 4 hours later it
failed.  2.6G seems big to me.  I've read in the archives
that there's extra overhead because the BackupPC dump
process forks and doubles the memory consumption.  In the
state the system was in last night, it'd have had to reach
about 3.1G to exhaust the memory on the system.

So I've got a few questions:

Is this rate of consumption normal?

If so, about how much RAM should I plan on having for a
backup of a system that has 10M files?

If not, what could be the culprit?  Perhaps a memory leak in
Perl or one of the underlying libraries?
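
My own back-of-envelope, for what it's worth (this assumes
the ~100 bytes/file figure from the rsync FAQ, which the
Perl implementation may well exceed):

  10,000,000 files * 100 bytes/file  ~= 1.0G per process
  * 2 for the forked dump process    ~= 2.0G
  + Perl/library overhead            => 3G or more seems possible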

Thanks again for any input you might have.

Andy


-- 

===
Tangeis, LLC
Andrew Libby
ali...@tangeis.com
www.tangeis.com
610-761-1991
===




Re: [BackupPC-users] Backup failures, how to troubleshoot

2009-03-11 Thread Andrew Libby
Andrew Libby wrote:
 Les Mikesell wrote:
 Andrew Libby wrote:
 There are a lot of files on the system and backups take
 quite a while, so I upped the timeout to 40 hrs; but backups
 now seem to be failing in under 20 hrs, when before they
 were failing at the 20 hr mark.

 Any hints or suggestions are greatly appreciated.  Other
 info that might be helpful:

 Both client and server are Ubuntu 8.04.
 Transport is rsync.  The client has about 6.5 million files.
 One possibility is a corrupted file system. Have you done an fsck 
 recently?  That's also a lot of files to process in one run.  Rsync 
 loads the whole directory tree in RAM and becomes extremely slow if you 
 go into swap.

 
 Hi Les,
 
 So we calculated it and the RAM needed comes to under a gig.
 We've got 4G of RAM.  The system does not seem to swap when
 processing a backup, and fsck also comes back clean.
 
 Thanks for the suggestions,
 
 Andy

Okay, so I've got some more information.  It seems that the
log file LOG in /var/lib/backuppc/log has

2009-03-11 10:12:23 shyamalan: Out of memory!
2009-03-11 10:12:35 Backup failed on shyamalan (Child exited
prematurely)


I thought out-of-memory errors were supposed to happen on
the client rather than the server.  I don't know why I
thought that, though.

The server here has 4G and, as I said, the RAM required for
the 6.5M files should be under a gig (6512257 * 100 bytes ==
621M), unless my math stinks (and it frequently does).

My sar stats clearly show the backuppc server running out
of memory, but swap utilization remains low.

Thanks for any further insight.

Andy



Re: [BackupPC-users] Backup failures, how to troubleshoot

2009-03-11 Thread Andrew Libby


Les Mikesell wrote:
 Andrew Libby wrote:


 I thought out-of-memory errors were supposed to happen on
 the client rather than the server.  I don't know why I
 thought that, though.
 
 The client sends the directory list to the server, then the server walks 
 through it comparing against the previous full.

Good to know, thanks.

 
 The server here has 4G and, as I said, the RAM required for
 the 6.5M files should be under a gig (6512257 * 100 bytes ==
 621M), unless my math stinks (and it frequently does).
 
 Where did you find the '100' factor?  I'd have guessed bigger - and I 
 think some versions of 64 bit perl have bugs that make it consume more 
 memory.  It might work better to stay 32-bit on a 4 gig machine - or add 
 more RAM if you are 64-bit.

My *100 figure came from the rsync FAQ:

http://samba.anu.edu.au/rsync/FAQ.html#4

I don't know if the overhead associated with Perl is higher
or lower.  I did see a reference in the archives to some
critical parts of the rsync Perl libraries being native
(compiled C), which might suggest overhead comparable to
that of rsync itself.

The OS is 32 bit.

 
 My sar stats clearly show the backuppc server running out
 of memory, but swap utilization remains low.
 
 Sar takes snapshots at 10 minute intervals - you might easily miss 
 peaks.  If you are running more than one backup concurrently it might 
 help to cut back.
 

Good point about the 10-minute interval.  The sar memory
usage creeps up to 50% until the last sample before the
backup fails, then drops to 9%.
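
To avoid missing the peak again, I'm now sampling memory at
a finer grain with a quick loop on the server (a rough
sketch; the process-name match may need adjusting for your
install):

  #!/bin/sh
  # Log RSS/VSZ of the dump and rsync processes every 30s.
  while true; do
      date
      ps -eo pid,rss,vsz,args | grep -E '[B]ackupPC_dump|[r]sync'
      sleep 30
  done >> /tmp/backuppc-mem.log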

Thanks much. This helps a lot.

Andy

-- 

===
Tangeis, LLC
Andrew Libby
ali...@tangeis.com
www.tangeis.com
610-761-1991
===




[BackupPC-users] Backup failures, how to troubleshoot

2009-03-10 Thread Andrew Libby

Greetings,

I've got a problem with a backup that seems to have started
with one failed backup back in January.  Since then each
attempt at a new backup fails.  The log shows

2009-02-24 17:28:55 Saved partial dump 73
2009-02-25 05:02:47 full backup started for directory /;
updating partial #73
2009-02-26 19:30:34 full backup started for directory /;
updating partial #73
2009-02-28 22:22:06 full backup started for directory /
(baseline backup #72)
2009-03-01 19:54:38 Got fatal error during xfer (Child
exited prematurely)
2009-03-01 19:54:44 Backup aborted (Child exited prematurely)
2009-03-01 19:54:46 Saved partial dump 73

I don't seem to be getting any of the additional information
that this thread from early February 2009 suggests might be
there:

http://www.mail-archive.com/backuppc-users@lists.sourceforge.net/msg13380.html
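
For completeness, I've been reading the per-backup transfer
logs with something like this (paths are from the Ubuntu
package and may differ on other installs; "myhost" is a
placeholder):

  # XferLOG files use BackupPC's own compression, so plain
  # zcat won't read them; use BackupPC_zcat instead.
  sudo -u backuppc /usr/share/backuppc/bin/BackupPC_zcat \
      /var/lib/backuppc/pc/myhost/XferLOG.bad.z | tail -100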

There are a lot of files on the system and backups take
quite a while, so I upped the timeout to 40 hrs; but backups
now seem to be failing in under 20 hrs, when before they
were failing at the 20 hr mark.

Any hints or suggestions are greatly appreciated.  Other
info that might be helpful:

Both client and server are Ubuntu 8.04.
Transport is rsync.  The client has about 6.5 million files.

Thanks in advance for any help.

Andy


-- 

===
Tangeis, LLC
Andrew Libby
ali...@tangeis.com
www.tangeis.com
===




Re: [BackupPC-users] Backup failures, how to troubleshoot

2009-03-10 Thread Andrew Libby
Les Mikesell wrote:
 Andrew Libby wrote:
 There are a lot of files on the system and backups take
 quite a while, so I upped the timeout to 40 hrs; but backups
 now seem to be failing in under 20 hrs, when before they
 were failing at the 20 hr mark.

 Any hints or suggestions are greatly appreciated.  Other
 info that might be helpful:

 Both client and server are Ubuntu 8.04.
 Transport is rsync.  The client has about 6.5 million files.
 
 One possibility is a corrupted file system. Have you done an fsck 
 recently?  That's also a lot of files to process in one run.  Rsync 
 loads the whole directory tree in RAM and becomes extremely slow if you 
 go into swap.
 

Hi Les,

So we calculated it and the RAM needed comes to under a gig.
We've got 4G of RAM.  The system does not seem to swap when
processing a backup, and fsck also comes back clean.

Thanks for the suggestions,

Andy



-- 

===
Tangeis, LLC
Andrew Libby
ali...@tangeis.com
www.tangeis.com
610-761-1991
===




Re: [BackupPC-users] backuppc quotas

2009-01-07 Thread Andrew Libby


cedric briner wrote:
 hello,
 
 Ahh, the world is not infinite, and neither are my resources
 (hardware/brain) :(
 
 Could we implement such a feature?
 
 I'm thinking of putting quotas on my users, telling them how much
 data they can back up.  The backup would proceed like this:
 
1 - first do a disk usage (du) run with the exclude paths
2 - if the result is not bigger than the quota, start the backup
3 - else
3.1 - send an email asking the user to remove some folders or file
 extensions to make the data to be saved smaller
3.2 - the user launches a du Java WebStart application (GUI) that
 helps him easily calculate the size of what backuppc lets him back up
3.3 - when done, the du application tells backuppc about the new
 configuration and the new size of the data to be saved
3.4 - backuppc starts the backup
 
 
 The Java application should do something like jDiskReport or JDU:
 - calculate the disk usage and display it
 - show the tree to be backed up
 - set/unset directories to exclude
 - display a small panel of file extensions to remove (.jpg, .mp3, ...)
 - accept the excluded directories and file extensions as command
 line options
 - report the data (excluded directories, excluded file extensions,
 disk usage) back to the backuppc server
 - run as a GUI when launched in step 3.2
 - run without a GUI when launched in step 1
 
 
 What do you think of this idea?
 
 cEd
 
 

Hi Cedric,

Unless I'm missing something, why wouldn't you implement
quotas on the users' data before backups?  Most systems have
this capability already.  It'd be much simpler than trying
to get users to prioritize which things they want backed up.
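
For example, on Linux with the quota tools and a
usrquota-enabled mount, it's one command per user (a sketch;
the username and limits are placeholders):

  # Give user 'jdoe' a 5G soft / 6G hard block limit
  # (1K blocks) on the filesystem mounted at /home.
  setquota -u jdoe 5000000 6000000 0 0 /home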

Andy


-- 

===
xforty technologies
Andrew Libby
ali...@xforty.com
www.xforty.com
===




Re: [BackupPC-users] scheduling question

2008-12-16 Thread Andrew Libby

Awesome, thanks so much!

Andy


Holger Parplies wrote:
 Hi,
 
 Andrew Libby wrote on 2008-12-15 12:16:49 -0500 [Re: [BackupPC-users] 
 scheduling question]:
 Rob Owens wrote:
 Nick Smith wrote:
 I have some clients that take 3 days or better to do a full backup, i
 would like to do 1 full backup a month, and 1 incr backup every day,
 and keep 1 week worth of incr and only 1 full backup.
 
 for one week every month you will have 2 full backups, because the older
 incrementals depend on the older full backup. That is usually not a problem
 due to the savings you get from pooling. If it *is* a problem, that would
 indicate that you have large amounts of changing data. In this case, you might
 want to rethink your incremental backup strategy, because every (level 1)
 incremental transfers all changes since its reference backup (the full).
 
 I have long-running full backups as well.  I'm wondering if
 it's possible to have fulls done only on weekends, but
 incrementals happen daily?
 
 Yes. http://backuppc.wiki.sourceforge.net/Schedule_Backups_With_Cron ... you
 should probably set FullPeriod to a large enough value (14.1, 28.1, or even
 365) that automatic scheduling will not interfere, even if your cron job is
 skipped or does not result in a successful backup for some reason.
 
 Regards,
 Holger
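
For the archives: the cron approach from that wiki page
comes out looking roughly like this (a sketch; double-check
the BackupPC_serverMesg argument order and install paths
against your version, and note "myhost" is a placeholder):

  # backuppc user's crontab; args: <hostIP> <host> <user> <doFull>
  # Full backup early Saturday morning:
  30 1 * * 6    /usr/share/backuppc/bin/BackupPC_serverMesg backup myhost myhost backuppc 1
  # Incrementals the other nights:
  30 1 * * 0-5  /usr/share/backuppc/bin/BackupPC_serverMesg backup myhost myhost backuppc 0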
 

-- 

===
Tangeis, LLC
Andrew Libby
ali...@tangeis.com
www.tangeis.com
610-761-1991
===




[BackupPC-users] BackupPC on solaris or linux

2008-11-24 Thread Andrew Libby

Greetings,

I'm contemplating running BackupPC on Solaris 10.
Is this advisable?  If so, are there any packages
available, or am I installing manually?  I saw in
the documentation that it's a manual install, so
I guess I'm hoping that someone has built packages.

Is running on Solaris a good or bad idea?

Thanks in advance.

Andy
-- 

===
xforty technologies
Andrew Libby
[EMAIL PROTECTED]
www.xforty.com
610-761-1991
===




[BackupPC-users] BackupPC DR Replication

2008-11-19 Thread Andrew Libby

Greetings,

I've been running BackupPC for a few months and I'm now
thinking about a disaster recovery strategy.  My initial
thought is to replicate (via rsync) the BackupPC configs and
data to another server.  This server would not actively run
BackupPC jobs, but I would like to be able to restore from
it in the event of a catastrophe.
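
Concretely, the naive command I have in mind is something
like this ("drserver" and the paths are placeholders):

  # -H preserves hardlinks; the BackupPC pool is built almost
  # entirely of hardlinks, so without -H the copy would balloon
  # to many times the pool size.  Note that -H itself can use
  # a lot of memory on a tree this large.
  rsync -aH --delete /var/lib/backuppc/ drserver:/var/lib/backuppc/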

Is this a reasonable approach?  Are others doing something
like this, or something else entirely, to meet the same
objectives?

Thanks.

Andy
-- 

===
xforty technologies
Andrew Libby
[EMAIL PROTECTED]
www.xforty.com
610-761-1991
===

