Re: [BackupPC-users] BackupPC misconfiguration Rsync network usage

2009-01-07 Thread Vasan
William,

Is the guest machine multi-homed, i.e. does it have multiple network interface
cards? Linux binds an IP address to the entire OS rather than to a specific
interface, unlike some other UNIX flavors that bind it only to the interface.
If it really is multi-homed, you might get a clue by looking at the ifconfig
output of all the eth? interfaces. There may be another eth? interface showing
the corresponding increase in packet counts that you are expecting...
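
For example, here is a quick Perl sketch (assuming a Linux /proc/net/dev is
available; the script is only an illustration) that prints the per-interface
RX/TX byte counters so you can spot which interface is actually carrying the
traffic:

    #!/usr/bin/perl
    # Print RX/TX byte counters for every interface listed in /proc/net/dev.
    use strict;
    use warnings;
    open my $fh, '<', '/proc/net/dev' or die "cannot read /proc/net/dev: $!";
    while (<$fh>) {
        next unless /^\s*(\S+):\s*(.*)$/;        # skips the two header lines
        my ($iface, @f) = ($1, split ' ', $2);   # f[0]=rx_bytes ... f[8]=tx_bytes
        printf "%-8s RX %15d bytes  TX %15d bytes\n", $iface, $f[0], $f[8];
    }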

HTH

Vasan

On Wed, Jan 7, 2009 at 12:56 PM, William McKee will...@knowmad.com wrote:

 Hi all,

 This evening I tracked down a configuration error in BackupPC (v2.1.2) that
 was causing a bandwidth spike. I had set IncrPeriod to 0.00, thinking that no
 incrementals would get run. Boy, was that wrong! Instead, it ran incrementals
 one after another during off-peak hours. That spiked my bandwidth with my
 hosting provider, which sent me searching for the culprit.

 Because of the holidays, I had forgotten about the edit to IncrPeriod, so I
 wasn't sure what was causing the spike. Thus I went digging through my logs
 and such to try to identify the culprit.

 I use VMware on a co-lo server which has three guests that all get backed up
 by BackupPC. I could see that the host was transmitting massive amounts of
 data (130 GB), which appeared to be coming from one of the three guests.
 However, I couldn't figure out which guest was pushing out the excessive data.

 I went through the usual log files without much luck. I then checked the
 ifconfig output, which all looked normal inside the guests. Once I finally
 looked at the BackupPC logs for the guest server, I realized what was
 happening and corrected the issue by removing my bad entry. I also added
 --bwlimit to the RsyncArgs setting in config.pl to keep backups from maxing
 out my bandwidth.
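
 (For illustration, a rough config.pl sketch of this kind of change; the
 IncrPeriod value and the --bwlimit figure, which rsync takes in KB/s, are
 only examples, not the exact settings used here:)

     $Conf{IncrPeriod} = 0.97;              # back to roughly one incremental per day
     $Conf{RsyncArgs}  = [
         @{$Conf{RsyncArgs} || []},         # keep whatever arguments are already set
         '--bwlimit=256',                   # throttle rsync to ~256 KB/s
     ];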

 However, this all took longer than I'd have liked. I'm stumped as to why the
 data transmitted off of the guest did not show up in the ifconfig output. I
 know that the guest is sending data via rsync based on the logs. However, it's
 not showing up in the ifconfig stats (see below). Is this due to the way that
 rsync works? I was sending about 450 MB of data every 1-2 hours from 8 PM to
 6 AM (I can send the logs if that would be of any help). I've included below
 the ifconfig output for the host (massive TX bytes) and the guest (normal TX
 bytes). I would have expected a corresponding amount of TX bytes on the guest.
 Thanks for any insight.


 Cheers,
 William



 Output of ifconfig on host (atlas)
 eth0  Link encap:Ethernet  HWaddr 00:17:A4:3F:C3:B5
  inet addr:64.132.42.194  Bcast:64.132.42.207  Mask:255.255.255.240
  inet6 addr: fe80::217:a4ff:fe3f:c3b5/64 Scope:Link
  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
  RX packets:67509721 errors:0 dropped:0 overruns:0 frame:3
  TX packets:102403892 errors:0 dropped:0 overruns:0 carrier:0
  collisions:0 txqueuelen:1000
  RX bytes:6915124969 (6.4 GiB)  TX bytes:139582421865 (129.9 GiB)
  Interrupt:16


 Output of ifconfig on guest (wg75)
 eth0  Link encap:Ethernet  HWaddr 00:0c:29:2a:5f:cd
  inet addr:192.168.233.25  Bcast:192.168.233.255
  Mask:255.255.255.0
  inet6 addr: fe80::20c:29ff:fe2a:5fcd/64 Scope:Link
  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
  RX packets:26307738 errors:0 dropped:0 overruns:0 frame:0
  TX packets:42720081 errors:0 dropped:0 overruns:0 carrier:0
  collisions:0 txqueuelen:1000
  RX bytes:2111065627 (1.9 GB)  TX bytes:2438535854 (2.2 GB)
  Interrupt:17 Base address:0x1400


 --
 Knowmad Technologies - Open Source Web Solutions
 W: http://www.knowmad.com | E: will...@knowmad.com | P: 704.343.9330





Re: [BackupPC-users] Tar Backup aborts (Connection Timed out)

2007-05-04 Thread Vasan
If you are using DHCP instead of static IP addressing, the connection might be
getting dropped when the IP address comes up for lease renewal from the DHCP
server. Depending on how long the renewal takes, that could result in a loss
of network connectivity.

Are your other hosts also holding millions of files, or is this the only host?

vasan

On 5/4/07, Simon Köstlin [EMAIL PROTECTED] wrote:
 
  Hi,
 
  Simon Köstlin wrote on 30.04.2007 at 12:25:18 [[BackupPC-users] Tar Backup
  aborts (Connection Timed out)]:
 
  I can see the following error log, included at the end of this mail, for one
  of the hosts I back up.
 
 
  yes, I can see it too :-).
 
 
  My timeout is set to $Conf{ClientTimeout} = 576000, which should be about
  160 hours, but the backup already aborted at about 41 hours.
 
 
  You're not hit by the timeout (that would be a SIGALRM) but rather by a
  network error. 41 hours is quite a long backup run. If you were using rsync,
  I'd suggest splitting up your backup by first excluding a large part of it
  and gradually reducing the exclude list. That way you'd get shorter backup
  times and things would be more likely to complete (or at least wouldn't
  restart from scratch). Millions of files might or might not make that a
  bad idea though.
 
  You won't like the suggestion, but you *could* split the share up into
  several shares to also reduce backup run time ...
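
  (A per-host config.pl sketch of what that staged approach could look like;
  the share name and the excluded paths here are made up for illustration:)

      $Conf{TarShareName}       = [ '/data' ];
      $Conf{BackupFilesExclude} = {
          '/data' => [ '/archive', '/scratch' ],  # exclude the biggest subtrees first,
                                                  # then shorten this list run by run
      };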
 

 What do you mean by gradually reducing the exclude list? To exclude some
 files of a share at first, and then define the same share again with the
 files that were excluded before?
 
  Why is tar aborting?
 
 
  It does not look like a tar problem but rather like a problem in the
  underlying network. What does the network topology look like? Is there a
  firewall between the BackupPC server and the client host? A DSL link (that
  perhaps gets disconnected once every 24 hours by your provider)? A flaky
  switch? Someone shutting down the client (which happens to be a SuSE system
  and therefore shuts down networking before killing processes ...)?
 
  True, the address does not *look* very remote ...
 
 The network topology consists of some switches. I don't think that the
 network is the problem. All the other hosts are in the same network and they
 are backed up correctly.
 There is no firewall either.
 The server should always be on; nobody is shutting it down. The backup
 failed every time it ran, but it did not fail at the same position each time.
 
  All other hosts backed up without any problem. It is normal that the
  backup takes so long because there are millions of files to back up.
 
 
  Does that mean that this host normally works? Or only other similar hosts?
  Did it fail just once or repeatedly? What's the exact difference to the
  other hosts? Distributions? Kernel versions?
 
 No, only the other hosts work correctly. This one fails repeatedly.
 It is a Debian server with kernel 2.6.10. The other servers are Debian
 or Ubuntu servers, too. There shouldn't be much difference.
  Regards,
  Holger
 
 





Re: [BackupPC-users] NT_STATUS_NO_SUCH_FILE ??

2007-04-21 Thread Vasan
If you have configured backuppc to use Samba, this might mean either
the file OR the share does not have FULL shared permissions. On the
Windows client, enable Full permissions for the specific share and
retry.

Hope this helps.

Srini

On 4/20/07, Alessandro Ferrari [EMAIL PROTECTED] wrote:


 BackupPC ver. 3.0.0beta3

 Hi!!

 I see the following line in the XferLOG:

 NT_STATUS_NO_SUCH_FILE opening remote file
 \www\progetti-elle\clericimarmi.it\sound\CHEAP~8A.IDE
 (\www\progetti-elle\clericimarmi.it\sound\)

 So I opened the \www\progetti-elle\clericimarmi.it\sound directory path and
 looked for the CHEAP~8A.IDE file, but I only found
 Cheap_Ga-Osnoff-8749_hifi.mp3 and
 Cheap_Ga-Osnoff-8749_hifi.mp3:Zone.Identifier:$DATA

 What does it mean?

 Thanks, Alessandro






[BackupPC-users] Backup start time and subsequent backups

2007-04-16 Thread Vasan
Hi all,

We have scheduled backups to run every day at 8:00 PM for one server host that
has around 300 GB to back up in full. Thanks to the pooling mechanism, the
entire 300 GB is of course not transferred every day, which is in line with
our expectations. We have scheduled full backups every alternate day and
incremental backups every day.

However, what we are observing is that backups are not being taken every
day... This is what is happening according to the status table:

Backup 0    Start time: 8:00 PM    Day 1
Backup 1    Start time: 8:03 PM    Day 2
Backup 2    Start time: 8:01 PM    Day 4

In between, it was supposed to take a full backup on Day 3, which it did not.
After going through the BackupPC code base, we observe that the first task
done at the first scheduled wakeup is the nightly job (pool compression,
deletion of old backups, etc.). Since this is a time-based backup
configuration, the nightly job runs as the first task at that time.

Depending on the amount of work that needs to be done, the time taken for the
nightly job to complete appears to vary from day to day. On Day 2 it took
approximately 3 minutes, after which the backup started; hence the backup
start time was recorded as 8:03 PM.

Before the next backup, BackupPC always seems to check whether a backup has
already been done within the last 24 hours. If, for example, the next day's
nightly job takes only a minute to complete (finishing at 8:01 PM), it will
check at 8:01 PM whether any backup has run in the last 24 hours. In this
scenario it concludes that one has, since the last backup started at 8:03 PM
the previous day, which is less than 24 hours before the current time. Because
of this, it skips that day and goes on to the next day's backup.
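
(A rough Perl sketch of the comparison we believe is happening; this is only
an illustration, not BackupPC's actual scheduler code:)

    # Illustration: decide whether a host is due for another backup.
    my $period    = 1.0;                             # configured days between backups
    my $lastStart = time() - 23.97 * 3600;           # e.g. last backup began 23h58m ago
    my $elapsed   = (time() - $lastStart) / 86400;   # days elapsed since then
    if ($elapsed >= $period) {
        print "queue a backup for this host\n";
    } else {
        print "skip: last backup started less than a day ago (8:03 PM vs 8:01 PM)\n";
    }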

This is turning out to be a serious issue, as it results in the complete loss
of one day's backup. Please shed some light on whether our observations are
correct and advise on how this issue can be resolved.
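
(For reference, a config.pl sketch of the kind of slightly-under-a-whole-day
periods that would absorb this start-time drift; the exact values are only
examples:)

    $Conf{FullPeriod} = 1.97;    # full backup roughly every other day
    $Conf{IncrPeriod} = 0.97;    # incremental backup roughly every day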

Thanks in advance for your help and inputs on this issue,

Srini



[BackupPC-users] Blackout period impact on currently running backups...

2007-04-10 Thread Vasan
Hi,

We would like to know the impact of the blackout period on currently running
backups. Let us say a backup of four shares (using the tar method) is
configured to start at around 10:00 PM on a particular day. If the backup of
the first mount point / share completes at, say, 01:15 AM the next day, and
the blackout is configured to start at 01:00 AM that day, will the backups of
the remaining three mount points still be started, or will they be silently
skipped because the blackout period has already come into effect?
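
(For context, a config.pl sketch of a blackout window of the kind described;
the keys follow BackupPC's BlackoutPeriods format, but the hours and days are
illustrative:)

    $Conf{BlackoutPeriods} = [
        {
            hourBegin => 1.0,        # 01:00
            hourEnd   => 7.0,        # 07:00
            weekDays  => [0 .. 6],   # every day of the week
        },
    ];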

Please clarify this situation.

FYI - for full backup #7 in the sequence below, even though the backup is
reported as complete, we are not able to see the backup contents in the
BackupPC status tool. On the same server and for this host, a blackout period
had been configured to start at 1:00 AM on that particular day.

Thanks in advance for your help,

Srini.

The relevant portion of the logfile is as follows:

-- Start of extract of log file --

2007-04-03 13:22:44 incr backup started back to 2007-03-31 19:30:01
for directory /home4
2007-04-03 13:34:20 incr backup 2 complete, 1 files,  bytes, 20466
xferErrs ( bad files,  bad shares, 20466 other)
2007-04-03 15:40:51 full backup started for directory /home1
2007-04-03 18:05:56 full backup started for directory /home2
2007-04-03 20:37:06 full backup started for directory /home3
2007-04-03 22:47:39 full backup 3 complete, 6 files,  bytes, 737073
xferErrs ( bad files,  bad shares, 737073 other)
2007-04-04 09:57:22 Running:
/volumes/alcatel/backupsoftware/bin/BackupPC_tarCreate -h ac10 -n 3 -s
/home3 -t -r / -p /home5/restore/ /selva

2007-04-04 10:43:03 Running:
/volumes/alcatel/backupsoftware/bin/BackupPC_tarCreate -h ac10 -n 3 -s
/home3 -t -r / -p /restore/ /selva

2007-04-04 17:56:00 incr backup started back to 2007-04-03 14:40:51
for directory /home1
2007-04-04 18:15:59 incr backup started back to 2007-04-03 14:40:51
for directory /home2
2007-04-04 18:31:11 incr backup started back to 2007-04-03 14:40:51
for directory /home3
2007-04-04 18:41:41 incr backup started back to 2007-04-03 14:40:51
for directory /home4
2007-04-04 18:51:43 incr backup 4 complete, 4 files,  bytes, 115642
xferErrs ( bad files,  bad shares, 115642 other)
2007-04-05 16:41:04 full backup started for directory /home1
2007-04-05 19:11:48 full backup started for directory /home2
2007-04-05 21:37:35 full backup started for directory /home3
2007-04-05 23:46:53 full backup started for directory /home4
2007-04-06 02:04:51 full backup 5 complete, 8 files,  bytes, 988970
xferErrs ( bad files,  bad shares, 988970 other)
2007-04-06 20:31:06 incr backup started back to 2007-04-05 15:41:04
for directory /home1
2007-04-06 20:47:48 incr backup started back to 2007-04-05 15:41:04
for directory /home2
2007-04-06 21:00:16 incr backup started back to 2007-04-05 15:41:04
for directory /home3
2007-04-06 21:09:18 incr backup started back to 2007-04-05 15:41:04
for directory /home4
2007-04-06 21:18:25 incr backup 6 complete, 4 files,  bytes, 114761
xferErrs ( bad files,  bad shares, 114761 other)
2007-04-06 21:18:25 removing incr backup 2
2007-04-07 20:42:22 full backup started for directory /home1
2007-04-07 23:14:16 full backup started for directory /home2
2007-04-08 01:39:27 full backup started for directory /home3
2007-04-08 03:51:00 full backup started for directory /home4
2007-04-08 06:09:07 full backup 7 complete, 8 files,  bytes, 989096
xferErrs ( bad files,  bad shares, 989096 other)
2007-04-08 06:09:07 removing full backup 0

- End of extract of log file --
