Re: [BackupPC-users] host resolution problem

2011-03-20 Thread Travis Fraser
On Sun, 2011-03-20 at 18:06 +0100, Gabriel Rossetti wrote:

 Hi Tamas,
 
 not really, I must not have explained myself correctly. I have a local 
 network with a server that backs up our laptops. The laptops use DHCP to 
 get their IPs, they are Macs and Linux laptops. For my example I will 
 use a notebook called myNotebook. It can't be resolved using DNS 
 resolution, but nmblookup can find it. 

If you control the DHCP server, why not assign a known IP address to
each of the laptops? Then use either DNS, or put the laptop entries into
the hosts file on the backuppc server.
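
For example (untested, and only if the DHCP server happens to be ISC
dhcpd; the MAC and IP are just placeholders), a fixed lease plus a hosts
entry on the backuppc server would look something like:

host myNotebook {
    hardware ethernet 00:11:22:33:44:55;
    fixed-address 192.168.1.50;
}

and in /etc/hosts on the server:

192.168.1.50    myNotebook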

-- 
Travis Fraser tra...@snowpatch.net




Re: [BackupPC-users] Using NFS share as BackupPC pool

2007-12-04 Thread Travis Fraser
On Mon, 03 Dec 2007 14:13:19 +0100
Dan S. Hemsø Rasmussen [EMAIL PROTECTED] wrote:

 Hi...
 
 Anyone tried to use a NFS mount as pool for the backups...?
 I would like to use a Synology NAS box as storage for my BackupPC 
 server. But will it work?
 
I use a NAS box mounted via NFS. It works fine. The underlying
filesystem on the NAS must support hardlinks I would think. Does the
Synology box use Linux for the OS?
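
A quick sanity check (just a sketch; adjust the mount point to wherever
the NAS is mounted) is to try creating a hard link on the NFS mount and
see that the link count goes to 2:

touch /mnt/nas/linktest
ln /mnt/nas/linktest /mnt/nas/linktest2
ls -l /mnt/nas/linktest*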


Travis Fraser



Re: [BackupPC-users] Can't get excludes to work

2007-10-27 Thread Travis Fraser
On Sat, 2007-10-27 at 10:49 -0400, Arch Willingham wrote:
 When all else fails, try 'em all! I don't know which part fixed it but this 
 ended up working:
 
Maybe try eliminating them one at a time to see? Also, what transfer
method are you using?
 
 $Conf{BackupFilesExclude} = {
   '\\backup\\*' => [
 ''
   ],
   '/backup/*' => [
 ''
   ],
   '*' => [
 '\\backup\\*',
 '/backup/*'
   ]
 };

-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] Can't get excludes to work

2007-10-26 Thread Travis Fraser
On Fri, 2007-10-26 at 16:53 +0200, Toni Van Remortel wrote:
 Arch Willingham wrote:
  Even though the slashes go the other way in Windows
 Yes. It's a Unix system that is taking the backups, so you need to use 
 the Unix way to address directories. So / is the separator, \ is just an 
 escape character.
 
In playing around with excludes a while back, I found it depends on what
the transfer method is. For smb, the backslash works for excludes. For
rsync, the forward slash.
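
In other words, something like this (untested) depending on the method:

# smb transfer:
$Conf{BackupFilesExclude} = [ '\backup' ];

# rsync transfer:
$Conf{BackupFilesExclude} = [ '/backup' ];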
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] Logwatch rule

2007-10-02 Thread Travis Fraser
On Tue, 2007-10-02 at 16:22 -0400, Richard Bailey wrote:
 I just wanted to get some daily status on BackupPC, such as the size
 and usage of the pool/filesystem, the host status and so on. I don't
 really want to get a new email after each and every job so integrating
 this into logwatch seemed like the ideal thing. 
 
 I saw BackupPC_serverMesg and ran it with status info, status jobs and
 status hosts but the output is not easily readable, is there something
 to convert that output to something that can easily go into a logwatch
 email? 
 
I had written a script for version 2.1.1 that sends me a daily status
email as well as generates an RSS feed. It is in the list archives, but
if you want I can email it to you.
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] Error starting a backup

2007-07-17 Thread Travis Fraser
On Tue, 2007-07-17 at 22:35 -0400, Yaakov Chaikin wrote:
 Hi,
 
 I am running BackupPC 3.0 and having trouble getting it to back up. So
 far, I am just running a test back up on the machine that hosts the
 BackupPC.
 
 Here is the error log that I am getting when I click Start Full Backup:
 **
 2007-07-17 22:26:26 User yaakov requested backup of tbiqdev (tbiqdev)
 2007-07-17 22:26:26 tbiqdev: mkdir /data/BackupPC: Permission denied
 at //usr/share/BackupPC/bin/BackupPC_dump line 193
 **
 
 The /data/BackupPC is directory where I specified I wanted the backups
 to be stored. It's owned by the apache user since this is the user
 that runs the 'httpd' server.
Is apache the user that runs backuppc?
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] Backing up excluded directories

2007-05-11 Thread Travis Fraser
On Fri, 2007-05-11 at 15:22 -0300, Miles Thompson wrote:
 James,
 
 Thank you for the suggestion, but it did not work. The script never
 enters Program Files or Windows, but has no such hesitation about
 My Music.
 
 I have had another thought - could it be continuing to visit this
 directory because it was included in the original full backup?
 
What does the smb share sarah refer to? My guess is a user's directory
under Documents and Settings. If so, that is why two of the excludes
appear to work. The third exclude will most likely work with a backslash
instead of the forward slash e.g. '\My Music' or '\My\ Music'.
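
Something along these lines in the host config might do it (untested, and
the share name is just taken from your config below):

$Conf{BackupFilesExclude} = {
    'sarah' => [ '\PROGRA~1', '\WINDOWS', '\My Music' ],
};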

 James wrote:
  you may need to add '/My\ Music' or '/My Music' to your list of
  directories to exclude, I can't remember at the moment if you need to
  escape the space (using the backslash) or whether '/My Music' will
  work, one of those two should work though, I would guess that this is
  happening because samba accesses the folders with their full names
  rather than their msdos shortened names
 
  let me know if that works for you
 
  On 5/11/07, *Miles Thompson* [EMAIL PROTECTED]
  mailto:[EMAIL PROTECTED] wrote:
 
  I have figured out most of BackupPC, but on some machine, directories
  excluded in their configuration files are still being backed up.
 
  From target machine's configuration file:
  $Conf{SmbShareName} = 'sarah';
  $Conf{BackupFilesExclude} = ['/MYMUSI~1','/WINDOWS','PROGRA~1'];
 

-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] excludedir problems

2007-04-27 Thread Travis Fraser
On Wed, 2007-04-25 at 19:04 +0200, vladimir domjan wrote:
 I know this was possibly answered before, but the sourceforge mailing list
 search is another thing... :)
 
 I use version 3.0xx on debian.
 No errors. Using the excludefolder option gives me a headache. 
 
 example:
 
 /etc/backuppc/Workstation1.pl
 Code:
 $Conf{SmbShareName} = ['backuptest'];
 $Conf{SmbSharePasswd} = 'username'; 
 $Conf{SmbShareUserName} = 'password';
 $Conf{BackupFilesExclude} = {'/user3'};

Use brackets instead of curly braces in your exclude. Use the curly
braces for defining excludes for multiple shares. 
 
 I used forward and back slashes with no luck. The /user3 folder is always
 backed up. This is used in a controlled test environment; the backup meets
 our needs.

For smb excludes, I have only gotten backslashes to work.
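
So something like this (untested) for a single share:

$Conf{BackupFilesExclude} = [ '\user3' ];

or, if you do want per-share excludes, with the curly braces:

$Conf{BackupFilesExclude} = {
    'backuptest' => [ '\user3' ],
};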

-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] vmware windows problem

2007-04-15 Thread Travis Fraser
On Sun, 2007-04-15 at 10:01 -0400, David Relson wrote:
 On my gentoo linux workstation I have vmware-server running WinXP home
 as a guest operating system.  I'm working to configure BackupPC to save
 WinXP's files.
 
 The WinXP virtual machine is named winxp-vm and C: is shared as WINXP-C.
 
 In /etc/hosts is:
 
 host        dhcp    user        moreUsers
 winxp-vm    0       backuppc
 
 and in /etc/hosts/pc/winxp-vm.pl is:
 
 $Conf{SmbShareName} = 'WINXP-C';
 $Conf{SmbShareUserName} = 'backuppc';
 $Conf{SmbSharePasswd} = '';
 $Conf{XferMethod} = 'smb';
 
 At present backups are incomplete :-
 
 Looking at /archive/pc/winxp-vm/0/fWINXP-C/ the following directories
 are present:
 
 fcygwin
 fDocuments and Settings
 f.emacs.d
 fProgram Files
 fRECYCLER
 fSystem Volume Information
 fWINDOWS
 
 Of these directories fcygwin looks good, but the rest are pretty much
 empty:
 
 fDocuments\ and\ Settings/fAll\ Users seems OK
 fProgram\ Files has _no_ files
 fDocuments\ and\ Settings/fDavid\ Relson has _no_ files
 the other directories are also empty.
 
 Any thoughts on what I've done wrong?
 
Does the backuppc user on the winxp machine have permissions to access
everything you want to backup?
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] vmware windows problem

2007-04-15 Thread Travis Fraser
On Sun, 2007-04-15 at 12:56 -0400, David Relson wrote:
 On Sun, 15 Apr 2007 10:16:55 -0400
 Travis Fraser wrote:
 
 ...[snip]...
 
   Any thoughts on what I've done wrong?
   
  Does the backuppc user on the winxp machine have permissions to access
  everything you want to backup?
  -- 
  Travis Fraser [EMAIL PROTECTED]
 
 Hi Travis,
 
 An interesting question...
 
 As mentioned, I've set C:\ to be shareable and, since BackupPC has
 backed up some of the top-level directories, this is OK.
 
 However, since the files in Program Files aren't being backed up, its sharing is
 set wrong.  The General Properties tab for Program Files has
 read-only checked and the Advanced Properties tab includes the statement
 
All options on this tab are disabled because this folder is used
by the operating system.
 
 So the permissions appear to be unchangeable.  The question becomes,
 How does BackupPC _ever_ deal with Program Files?
 
Why do you want to back up the programs? I just back up files in the
Documents and Settings directory, and if the whole machine croaks, I
either reinstall from original media, or from images made with
partimage.

 With regards to user directories such as My Documents, one _could_
 make the directory sharable.  This doesn't seem to be the right thing
 to do.
 
 Are you using BackupPC with windows?  Are you using smb or another
 access method, i.e. rsync?
 
I have used both smb and rsyncd for windowsXP machines. For smb, I added
the backuppc user as a windows user that is also in the existing Backup
Operators group. For the share, I then add to the Share Permissions (the
Permissions button on the Sharing tab of the share properties) the
Backup Operators group. Also, under the properties of a directory that
is denying access, look at the Security tab for the directory and add
the Backup Operators group.

On another machine, I used the rsyncd with VSS (look at the mailing list
archives), and it was very straightforward.
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] rsync 2.6.9 issues

2007-04-10 Thread Travis Fraser
On Tue, 2007-04-10 at 11:06 -0700, Kris S. Amundson wrote:
 Greetings archivers,
 
 I've been running BackupPC for quite some time and have been very happy.
  Today I ran into an issue.  I added two new hosts, set them up like I
 always have (rsync using SSH keys), and was getting failures.
 
 Server is 2.1.1 on Sarge.
 Client hosts are a mix of sarge/dapper.
 Two new hosts are pre-release etch (March builds).
 
 I could ssh and run rsync just fine: `ssh [EMAIL PROTECTED] sudo rsync`
 
 The command BackupPC runs:
 /usr/bin/ssh -q -l backuppc amg1 sudo /usr/bin/rsync --server --sender
 --numeric-ids --perms --owner --group --devices --links --times
 --block-size=2048 --recursive --exclude=/proc --exclude=/sys
 --exclude=/tmp --ignore-times . /
 
 The error:
 2007-04-10 09:59:06 full backup started for directory /
 2007-04-10 09:59:08 Got fatal error during xfer (fileListReceive failed)
 
 The workaround:
 After pondering what could be different, the only thing that came to
 mind was etch itself.. so possibly a version or bug.  I removed the
 version of rsync from etch and installed the sarge package.
 
 etch: rsync  version 2.6.9  protocol version 29
 sarge: rsync  version 2.6.4  protocol version 29
 
 Everything is working now that I've downgraded rsync.  Anyone else hit
 this issue?  I'll try again when I build a Debian 4.0 box.
 
Almost exactly a year ago this was a popular question. The archives can
explain it well, but in summary, replace --devices in the rsync command
with -D.
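
Roughly like this in config.pl (the rest of the list is the stock set of
arguments, so double-check it against your own config):

$Conf{RsyncArgs} = [
    '--numeric-ids', '--perms', '--owner', '--group',
    '-D',            # was '--devices'
    '--links', '--times', '--block-size=2048', '--recursive',
];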
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] More thoughts on Laptop backups

2007-03-04 Thread Travis Fraser
On Sun, 2007-03-04 at 14:18 -0500, Rick DeNatale wrote:
 Someone recently brought up some questions about securely backing up
 laptops with backuppc, and it prompted me to ask about some questions
 which I've been pondering.
 
 I'm currently backing up several systems including a laptop which
 happens to be running ubuntu linux.
 
 Now one thing I've never properly addressed is that this laptop has
 both an internal ethernet card, and a pcmcia wi-fi card.  Most times I
 use the wireless, but sometimes plug it into the wall instead.
 
 The issue is ensuring that I'm properly identifying this laptop.  I
 really don't want to use dynamic dhcp and ddns, so I've got dhcp to
 assign fixed ip addresses based on mac addresses.  This means that the
 laptop gets one or another ip address depending on whether it's using
 a wireless or wired connection, and it has different dns names
 depending on its ip address.
 
 Right now, I've got backuppc configured to look for it only using the
 wireless ip addr/name.
 
 Is there a way to configure backuppc to find it through either interface?
 
You could make two hosts for it, one corresponding to each ip address,
and that would not take up additional space due to pooling.
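
That is, two entries in the BackupPC hosts file, something like (names
made up):

host            dhcp    user
laptop-wired    0       rick
laptop-wifi     0       rick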

 Related to this is how to actually ensure that I'm backing up this
 particular laptop.  Imagine that someone visited and borrowed my wi-fi
 card, the visitor's laptop would look like mine to backuppc which
 wouldn't be good.  Is there a way to associated a backup with a given
 machine and not just the mac address?
 
Use rsync over ssh with a public key. The visitor's laptop will not have
the key.
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] More thoughts on Laptop backups

2007-03-04 Thread Travis Fraser
On Sun, 2007-03-04 at 14:18 -0500, Rick DeNatale wrote:
 Someone recently brought up some questions about securely backing up
 laptops with backuppc, and it prompted me to ask about some questions
 which I've been pondering.
 
 I'm currently backing up several systems including a laptop which
 happens to be running ubuntu linux.
 
 Now one thing I've never properly addressed is that this laptop has
 both an internal ethernet card, and a pcmcia wi-fi card.  Most times I
 use the wireless, but sometimes plug it into the wall instead.
 
 The issue is ensuring that I'm properly identifying this laptop.  I
 really don't want to use dynamic dhcp and ddns, so I've got dhcp to
 assign fixed ip addresses based on mac addresses.  This means that the
 laptop gets one or another ip address depending on whether it's using
 a wireless or wired connection, and it has different dns names
 depending on its ip address.
 
 Right now, I've got backuppc configured to look for it only using the
 wireless ip addr/name.
 
 Is there a way to configure backuppc to find it through either interface?
 
Why not assign the same hostname and IP address to either MAC address
(assuming your wireless network is bridged to the wired)?
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] syntax for excluded files

2007-02-18 Thread Travis Fraser
On Sun, 2007-02-18 at 09:15 -0600, Seb wrote:
 Hi,
 
 I'm unable to find instructions in /etc/backuppc/config.pl showing whether
 the files to exclude are relative to the included files or not.  I'm not
 sure if I say:
 
 
 $Conf{BackupFilesExclude} = ['/some-path', '/another-path'];
 
 
 having:
 
 
 $Conf{BackupFilesOnly} = '/home/';
 
 
 the excluded files will be /home/some-path and /home/another-path, or the
 absolute /some-path and /another-path.  I would assume the former, but
 can't be sure.  Any advice would be appreciated.
 
I think it depends on the transport method. You can have
$Conf{BackupFilesExclude} _or_ $Conf{BackupFilesOnly} but not both for
smb. Excludes are relative to the share name. Look at the docs.
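
With rsync, for example, I believe something like this would skip
/home/some-path and /home/another-path (untested):

$Conf{RsyncShareName} = '/home';
$Conf{BackupFilesExclude} = [ '/some-path', '/another-path' ];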
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] BackupPC exclusion per-share using arrays in $Conf{BackupFilesExclude}

2007-01-30 Thread Travis Fraser
On Tue, 2007-01-30 at 17:16 -0600, John Buttery wrote:
   I'm wondering if I've got the syntax right for this stuff...it looks 
 right, but I can't find any actual examples anywhere and some of the 
 files that I think should be excluded by this, aren't.
 
   This is BackupPC 2.1.1 as installed by Debian Sarge (the full package 
 version string is '2.1.1-2sarge2').
 
   When excluding directories, do I need to put a trailing '/' character?  
 Are wildcards allowed?  When specifying directory names inside the 
 per-share array, do I put the leading '/' characters?
 
   Also, this backup fails with the following error (the other 7 or 8 
 hosts we back up are fine)...but I think this may be due to its trying 
 to back up files that are getting created/destroyed or modified during 
 the run.  That problem should solve itself once the files start getting 
 excluded properly, I'm guessing.
 
 The following hosts had an error that is probably caused by a
 misconfiguration.  Please fix these hosts:
   - hostname (aborted by signal=ALRM)
 
   Here are the lines from the main config.pl that affect what's being 
 backed up (as far as I know...I can send other stuff as required, 
 didn't want to spam the list with the entire file :p).
 
 - cut here
 $Conf{XferMethod} = 'rsync';
 $Conf{RsyncShareName} = ['/','/usr','/var','/home'];
 - cut here
 
   And, here are the entire contents (excluding comments) of the 
 host-specific config file for the machine in question:
 
 - cut here
 $Conf{BackupFilesExclude} = {
 '/' => [
 '/dev',
 '/tmp',
 '/mnt',
 '/floppy',
 '/cdrom',
 '/proc'
 ],
 
 '/var' => [
 '/log/maillog',
 '/lib/mysql/data',
 '/lib/mysql/mysql',
 '/log/mysql/mysql.log',
 '/log/mysql/mysql-bin*',
 '/spool/exim4'
 ],
 };
 - cut here
Would you be running into problems because the other rsync share names
 are included in /?
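
One way to test that theory would be to exclude the other shares from the
/ share, roughly (untested):

$Conf{BackupFilesExclude} = {
    '/' => [ '/dev', '/tmp', '/mnt', '/floppy', '/cdrom', '/proc',
             '/usr', '/var', '/home' ],
    # ...per-share excludes for /var etc. as before
};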
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] BackupPC Data Directory on a Network Attached Storage

2007-01-25 Thread Travis Fraser
On Thu, 2007-01-25 at 13:09 -0700, Brien Dieterle wrote:
 Most NFS servers are pitifully slow compared to a local filesystem,
 particularly when dealing with many small files.  It pains me to think
 about how slow that might get-- is anyone else using a non-local
 filesystem?
I use a Linux-based NAS device that is mounted via NFS. The backups are
not huge (~25GB), but the performance seems fine. The BackupPC server
does have a dedicated connection with a crossover cable to the NAS.
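
The mount itself is nothing special; an fstab line along these lines
(server name, export path and options are only an example, not a
recommendation):

nas:/backup    /var/lib/backuppc    nfs    rw,hard,intr    0 0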

 Simon Köstlin wrote: 
  I think TCP is a safer connection or plays that none rolls?
  Also when I click on a PC in the web interface it takes around 20-30 seconds
  until the web page appears with the backups which were made. I thought that
  would be better with an other connection. But that time is not dependent on
  the size of the backups. I made backups with just some files and it takes
  that time to load also if I have Backups with 3GB.
  
  -Ursprüngliche Nachricht-
  Von: Les Mikesell [mailto:[EMAIL PROTECTED] 
  Gesendet: Donnerstag, 25. Januar 2007 20:35
  An: Simon Köstlin
  Cc: backuppc-users@lists.sourceforge.net
  Betreff: Re: [BackupPC-users] BackupPC Data Directory on a Network Attached
  Storage
  
  Simon Köstlin wrote:

   
   I want to have the Data Directory on a Network Attached Storage (NAS) 
   and not on the BackupPC Server. The NAS supports NFS, SMB, FTP, CIFS 
   and SSH. I tried to mount an NFS Share on the NAS and that works well. 
   So I can use the Data Directory in this Share. But the NAS supports 
   only a UDP connection with NFS. Are there any other solutions to use a 
   TCP connection? I tried also SMB, but that did not work with BackupPC. 
   I mounted a Share with SMB and that works, but when I wanted to start 
   BackupPC, BackupPC did not start. I only found a log directory on the 
   Share with the LOG file in which was an error like bind() failed. Does 
   anybody know why SMB does not work with BackupPC? Or are there any 
   other solutions like to mount a FTP connection?

-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] Email reports

2007-01-15 Thread Travis Fraser
On Thu, 2007-01-11 at 10:30 +0100, Francisco Daniel Igual Peña wrote:
 
 Hi, 
 
 Is it possible that Backuppc sends weekly (or daily, don't mind) reports to a
 specific email even though there are no errors with the backups? I want
 something like that, but I don't know how to do it.
 
I have a daily status email and RSS feed. You can edit out the RSS stuff
if you want just email. The status email address is configurable. I
wrote up a little howto for myself (I am running backuppc-2.1.1, so
adjustments may be necessary if using a newer version):

BackupPC RSS feed and email status HOWTO
---

1. I created a script[see step 5] called BackupPC_statusUpdate modeled
on BackupPC_sendEmail. The script parses the backup status of each host,
creates an RSS feed and also sends the information by email.
BackupPC_statusUpdate resides in $BinDir (/usr/lib/backuppc/bin/ in my
case) and runs once each night.


2. Added $Conf{EMailStatusUserName} to the main config
file /var/lib/backuppc/conf/config.pl for email address(es) to receive
nightly status emails:

 #
 $Conf{EMailFromUserName} = 'backuppc';

+#
+# Destination address for daily positive status email.
+#
+$Conf{EMailStatusUserName} = '[EMAIL PROTECTED]';

 #
 # Destination address to an administrative user who will receive a
 


3. Added a call to BackupPC_statusUpdate in BackupPC_nightly (note the
addition of the semicolon on the first system command below):

 if ( $opts{m} ) {
 print("log BackupPC_nightly now running BackupPC_sendEmail\n");
!system("$BinDir/BackupPC_sendEmail");

+# RSS and positive status email
+#
+print("log BackupPC_nightly now running BackupPC_statusUpdate\n");
+system("$BinDir/BackupPC_statusUpdate");
 }


4. Added header (to advertise feed to RSS readers e.g. Firefox) on my
backup server documentation webpage (this can be any spot viewable from
your intranet) at  /var/www/localhost/htdocs/index.html. This is an
optional step. The link path is the place in the webroot that the main
script writes the xml file.

  </style>

+<link rel="alternate" type="application/rss+xml"
+   href="backuppc/backuppc_status.xml" title="BackupPC RSS feed">

  </head>


5. BackupPC_statusUpdate

#!/usr/bin/perl
#============================================================= -*-perl-*-
#
# BackupPC_statusUpdate
#
# DESCRIPTION
#
#   This script implements a positive status email and an RSS feed.
#
#   The script is called from BackupPC_nightly.
#
# AUTHOR
#   Travis Fraser [EMAIL PROTECTED]
#
# Credit to Rich Duzenbury for the original idea.
#
#
# Requires XML::RSS
#
# Edit the variable $serverName to suit depending on DNS status on your
# network.
# Edit the use lib ... in the 3rd line of code below.
# Edit the $base_url in the RSS section to reflect the correct path to
# the cgi page.
# Edit the $rss->save ... line near the end of the script to suit.
#
#

use strict;
no  utf8;
#
# The lib path needs to match that in the stock backuppc files.
#
use lib "/usr/lib/backuppc/lib";
use BackupPC::Lib;
use XML::RSS;

use Data::Dumper;
use Getopt::Std;
use DirHandle ();
use vars qw($Lang $TopDir $BinDir %Conf);

#
# Variables
#
my($fullTot, $fullSizeTot, $incrTot, $incrSizeTot, $str, $mesg,
   $strNone, $strGood, $hostCntGood, $hostCntNone);
$hostCntGood = $hostCntNone = 0;

my $serverName = '192.168.1.3';

#
# Initialize
#
die("BackupPC::Lib->new failed\n") if ( !(my $bpc =
BackupPC::Lib->new) );
$TopDir = $bpc->TopDir();
$BinDir = $bpc->BinDir();
%Conf   = $bpc->Conf();
$Lang   = $bpc->Lang();

$bpc->ChildInit();

my $err = $bpc->ServerConnect($Conf{ServerHost}, $Conf{ServerPort});
if ( $err ) {
    print("Can't connect to server ($err)\n");
    exit(1);
}
#
# Retrieve status of hosts
#
my $reply = $bpc->ServerMesg("status hosts");
$reply = $1 if ( $reply =~ /(.*)/s );
my(%Status, %Info, %Jobs, @BgQueue, @UserQueue, @CmdQueue);
eval($reply);
#
# Ignore status related to admin and trash jobs
foreach my $host ( grep(/admin/, keys(%Status)) ) {
    delete($Status{$host}) if ( $bpc->isAdminJob($host) );
}
delete($Status{$bpc->trashJob});

#
# Set up RSS feed
#
my $now = $bpc->timeStamp(time);

#
# The cgi page in this case is over HTTPS
#
my $base_url = 'https://' . $serverName . '/cgi-bin/BackupPC_Admin';

my $rss = new XML::RSS (version => '2.0', encoding => 'ISO-8859-1

Re: [BackupPC-users] Email reports

2007-01-15 Thread Travis Fraser
On Tue, 2007-01-16 at 08:24 +1100, Les Stott wrote:
 Travis Fraser wrote: 
  On Thu, 2007-01-11 at 10:30 +0100, Francisco Daniel Igual Peña wrote:

   Hi, 
   
   Is it possible that Backuppc sends weekly (or daily, don't mind) reports 
   to a
   specific email even though there are no errors with the backups? I want
   something like that, but I don't know how to do it.
   
   
  I have a daily status email and RSS feed. You can edit out the RSS stuff
  if you want just email. The status email address is configurable. I
  wrote up a little howto for myself (I am running backuppc-2.1.1, so
  adjustments may be necessary if using a newer version):

 Travis, 
 
  that looks like some fantastic work!! Certainly something that I think
  is worthwhile. I only wish I knew Perl so I could knock out stuff like
  this ;)
 
 I haven't tested myself, but if you have a sample email/rss output
 could you post so we can see what it looks like?
A typical email looks like so (for an RSS screenshot, I can email that
later):

To: [EMAIL PROTECTED]
Subject: BackupPC status: 4 hosts with good backups
Date: Mon, 15 Jan 2007 01:01:01 -0500

Host: crescent
Full Count: 2        Full age/days: 435.6
Full Size/GB: 0.15   Speed MB/sec: 3.03
Incremental Count: 0 Incremental Age/Days: 
State: idle  Last Attempt: nothing to do

Host: marmolata
Full Count: 9        Full age/days: 3.2
Full Size/GB: 0.59   Speed MB/sec: 3.66
Incremental Count: 6 Incremental Age/Days: 0.2
State: idle  Last Attempt: nothing to do

Host: pigwin
Full Count: 5        Full age/days: 53.4
Full Size/GB: 1.28   Speed MB/sec: 6.30
Incremental Count: 1 Incremental Age/Days: 212.3
State: backup starting   Last Attempt: no ping (host not found)

Host: sweetpea
Full Count: 9        Full age/days: 3.2
Full Size/GB: 0.94   Speed MB/sec: 1.91
Incremental Count: 6 Incremental Age/Days: 0.2
State: backup starting   Last Attempt: nothing to do

-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] smb scheduled backup fails, but manual works

2006-12-06 Thread Travis Fraser
On Wed, 2006-12-06 at 17:11 -0700, Chris Purves wrote:
 Hi,
 
 I have a problem where a scheduled smb backup does not run.  The home 
 page for that machine shows 'Pings to king-graham have failed 109 
 consecutive times.'
 
 However, I can ping the machine from the command line and I can manually 
 start an incremental backup after which the home page shows 'Pings to 
 king-graham have succeeded 2 consecutive times.'
 
 Any help is appreciated.  Thanks.
 
Is this a laptop where it might go into suspend or leave the network?
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] Nagios plugin

2006-08-22 Thread Travis Fraser
On Mon, 2006-08-21 at 20:30 -0400, Bill Hudacek wrote:
 Seneca Cunningham wrote:
  I've written a Nagios plugin to monitor BackupPC (tested against  
  2.1.2) and will be putting it online soon (it takes some time for  
  sourceforge to approve a project).  I'm doing some cleanup and  
  verbosity adjustment right now, and was wondering if there were any  
  specific features for the plugin beyond what I have that anyone would  
  like.
 
  The plugin currently warns if there are any hosts with errors and all  
  those hosts have a backup newer than an age threshold, and goes  
  critical if the errored hosts have a backup older than the threshold.
 That's a great idea! Put me down for one. 
 
 Basically, having to check the web interface every day or every other 
 day (whenever I remember) is not a big deal, but this is definitely a 
 nice-to-have.  I've just never scripted any kind of cron/anacron checks, 
 so...
 
Not as nice as the Nagios plugin idea, but I have a daily status email
and an RSS feed that gets updated daily (the RSS idea is from Rich
Duzenbury). Rich's original RSS patch requires using the cgi interface
to generate the xml, and I wanted something automated, so months back I
wrote a script that is called from backuppc_nightly. If anyone is
interested, let me know.
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] Ubuntu to debian etch

2006-06-22 Thread Travis Fraser
On Thu, 2006-06-22 at 22:07 -0400, don Paolo Benvenuto wrote:
 Hi guys!
 
 I'm trying to use backuppc in order to backup my lan pc's to one of
 them.
 
 In my lan there are 6 ubuntu dapper pc's  (and the backuppc server is a
 dapper too) and a debian etch one.
 
 All is going smoothly backing up the ubuntu pc's.
 
 On the contrary, I can't perform the backup from the debian pc.
 
 I'm using rsync, and after two minutes of backing up, the backup stops. This
 is the corresponding XferLOG:
 
 Running: /usr/bin/ssh -q -x -l root misiongenovesa /usr/bin/rsync
 --server --sender --numeric-ids --perms --owner --group --devices
  ^
  |
change the --devices to -D   --

 --links --times --block-size=2048 --recursive --exclude=/proc
 --exclude=/tmp --ignore-times . /
 Xfer PIDs are now 9803
 Rsync command pid is 9803
 Got remote protocol 29
 Checksum seed is 1151027857
 Got checksumSeed 0x449b4a91
 fileListReceive() failed
 Done: 0 files, 0 bytes
 Got fatal error during xfer (fileListReceive failed)
 Backup aborted (fileListReceive failed)
 
 Any hint?
 
 
-- 
Travis Fraser [EMAIL PROTECTED]




Re: [BackupPC-users] Windows client: Unable to read 4 bytes

2006-06-13 Thread Travis Fraser
On Tue, 2006-06-13 at 21:13 +0200, Peter Pfandt wrote:
 One of my windows clients has the cygwin-rsyncd installed and I want to 
 backup 
 the client via rsync. The error messages I receive look like the known ssh 
 authorisation problem, but ssh is not used:
 
 
 Connected to maitreya:873, remote version 28
 Connected to module D
 Sending 
 args: --server --sender --numeric-ids --perms --owner --group --devices 
 --links --times --block-size=2048 --recursive --specials --exclude=\System 
 Volume Information --ignore-times . .
 Read EOF: 
 Tried again: got 0 bytes
 Done: 0 files, 0 bytes
 Got fatal error during xfer (Unable to read 4 bytes)
 Backup aborted (Unable to read 4 bytes)
 --
 
 Any ideas what this means in this context?
Have you tried to do a manual rsync e.g. just listing the files in the
remote module?
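
For example (module and host taken from your log above):

rsync maitreya::D/

should list the top of the module if the daemon side is answering (it will
prompt for the rsyncd password if one is set).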
-- 
Travis Fraser [EMAIL PROTECTED]





Re: [BackupPC-users] Funny ssh Problems

2006-06-12 Thread Travis Fraser
On Mon, 2006-06-12 at 19:22 +0200, Nils Breunese (Lemonbit Internet)
wrote:
 Randall Barlow wrote:
 
  Sorry for the post - as fate would have it I resolved the issue just
  minutes after posting it.  It turns out that I had only attempted to
  connect to remotehost.remotedomain as I was testing how well the  
  ssh key
  pair worked, but backuppc was just trying to connect to remotehost
  (thus, the name remotehost hadn't been added to the known_hosts and it
  was asking the yes/no question which backuppc wasn't expecting/ 
  shouldn't
  expect :))
 
 I have the same problem (fileListReceive failed):
 
 Running: /usr/bin/ssh -q -x -l root full-hostname /usr/bin/rsync
 --server --sender --numeric-ids --perms --owner --group --devices
 --links --times --block-size=2048 --recursive --exclude=/proc
 --ignore-times . /
 Xfer PIDs are now 12189
 Got remote protocol 29
 fileListReceive() failed
 Done: 0 files, 0 bytes
 Got fatal error during xfer (fileListReceive failed)
 Backup aborted (fileListReceive failed)
 
 I'm using the rsync transfer method. However, when I do 'su -  
 backuppc' I can run 'ssh root@full-hostname' just fine and I end up  
 at the prompt of the client machine. Where else do I check what's wrong?
 
What version of rsync are you using? Later versions need --devices
changed to -D in $Conf{RsyncArgs}.

 Nils Breunese.
 
  Randall Barlow wrote:
  OK,
 
 So I have the system backing up my localhost OK, and two other
  windows machines, but I'm having trouble getting it to work with  
  rsync
  over ssh for a remote host.  I am running backuppc as user apache,  
  and
  so I've created a public/private key pair and stored it in
  ~apache/.ssh/id_rsa on the local server, and have added it to the  
  root
  user's authorized keys for the remote server.  Now if I su apache,  
  I am
  allowed to ssh to the remote host without being asked for a password.
  However, when I try to backup the remote machine via the CGI, I  
  get the
  error: fileListReceive failed.  Inspecting the logs:
 
  Fatal error (bad version): Xlib: connection to :0.0 refused by  
  server
  Xlib: Read EOF: Tried again: got 0 bytes
  fileListReceive() failed
  Done: 0 files, 0 bytes
  Got fatal error during xfer (fileListReceive failed)
  Backup aborted (fileListReceive failed)
 
  Has anybody seen this problem before?  It seems like it's trying  
  to open
  up an X window for some reason, any ideas?
 
 
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 https://lists.sourceforge.net/lists/listinfo/backuppc-users
 http://backuppc.sourceforge.net/
 
-- 
Travis Fraser [EMAIL PROTECTED]





Re: [BackupPC-users] Funny ssh Problems

2006-06-12 Thread Travis Fraser
Nils Breunese (Lemonbit Internet) wrote:
 Travis Fraser wrote:

   
 What version of rsync are you using? Later versions need --devices
 changed to -D in $Conf{RsyncArgs}.
 

 I guess I need to change this for $Conf{RsyncRestoreArgs}  as well?
   
I would think so, although I haven't tested that part yet.

Travis




Re: [BackupPC-users] Re: rsync with VSS-support

2006-05-24 Thread Travis Fraser
On Wed, 2006-05-24 at 16:18 -0500, Carl Wilhelm Soderstrom wrote:
 I just tried Elias's patched rsyncd-for-windows with VSS support
 (http://users.tkk.fi/~epenttil/rsync-vss/); but I
 can't get it to work on either of the machines I installed it on. (One
 W2K3EE, and a WXP Home).
 
 Even if I just tried to run rsyncd.exe from the command line (with the
 '--daemon --no-detach -vvv' options); it would just exit right away.
 
 Has anyone else gotten it to work?
I got it to work. There is a typo in the rsync.conf file relating to the
name of the folder it is in (Program Files/rsync). I tried changing the
folder name to match the file contents, but no luck. Just change the
locations of the pid file etc. in the rsync.conf file.
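
For reference, the lines I mean look something like this (the paths are
from memory, so point them at wherever the package actually installed
itself):

pid file = c:/Program Files/rsync/rsyncd.pid
log file = c:/Program Files/rsync/rsyncd.log
lock file = c:/Program Files/rsync/rsyncd.lock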
 
 I do have the full cygwin kit installed on both of the machines in question,
 which I know can cause problems with operation; but I did try moving the
 c:\cygwin directory out of the way (to the Trash, temporarily) without
 success.
 
 The .msi file is very cool tho, works great, even on a win98 box. Ought to
 use that for the regular backuppc-rsyncd package.
 
-- 
Travis Fraser [EMAIL PROTECTED]





Re: [BackupPC-users] Conf{BackupFilesExclude} not working

2006-05-18 Thread Travis Fraser
On Thu, 2006-05-18 at 15:34 -0500, [EMAIL PROTECTED] wrote:
 I have a debian sarge box running backuppc 2.1.1 to back up a bunch of XP
 Pro Desktops and a Win2k3 fileserver. It was working fine at one point. I
 then had to rebuild the RAID5 that contains the /var/lib/backuppc
 datastore. I'm using the same config files but it stopped honoring the
 $Conf{BackupFilesExclude} directive and I can't figure out what's going
 on.
 
 I do not have a $Conf{BackupFilesOnly} defined in either config.pl or the
 host config files. Here's a copy of one of the host config files:
 BEGIN
 #
 # SMB Backup (for Windows clients)
 #
 
 $Conf{XferMethod} = 'smb';
 
 $Conf{SmbShareName} = ['BACKUP$'];
 
 $Conf{BackupFilesExclude} = '/WINDOWS/';
 
 # if needed set a user name and password to access the remote shares
 $Conf{SmbShareUserName} = 'DOMAIN/AdminUser';
 $Conf{SmbSharePasswd} = 'password';
 END
 
 I have tried many variations of the BackupFilesExclude directive including:
 $Conf{BackupFilesExclude} = ['/WINDOWS' ];
 $Conf{BackupFilesExclude} = ['/WINDOWS/' ];
 $Conf{BackupFilesExclude} = ['/WINDOWS/*' ];
 $Conf{BackupFilesExclude} = '/WINDOWS';
 $Conf{BackupFilesExclude} = '/WINDOWS/*';
 etc..etc...
 
 No matter what I do it still backs up the WINDOWS directory [or any other
 directory I try to exclude].
 
 Any ideas?
 
 Thanks in advance for your help,
 
 DT
 
The answer is in the archives (watch out for line wrap):
http://article.gmane.org/gmane.comp.sysutils.backup.backuppc.general/5423/match=windows+smb+exclude

http://article.gmane.org/gmane.comp.sysutils.backup.backuppc.general/6311/match=windows+smb+exclude

-- 
Travis Fraser [EMAIL PROTECTED]





Re: [BackupPC-users] Problem with BackupFilesExclude

2006-05-16 Thread Travis Fraser
On Tue, 2006-05-16 at 11:11 +0100, Tony Molloy wrote:
 Hi,
 
 I'm backing up a file share from a Windows box using the smb transfer 
 method. The problem is this share contains several very large log files 
 which the administrator of the windows box doesn't rotate or truncate. So 
 they are backed even when I do the nightly incremental backups.
 
 I've tried to use BackupFilesExclude to exclude the files but I can't seem 
 to get the syntax right or it's not working
 
 
 My per host config file is:
 
 $Conf{XferMethod} = 'smb';
 
 $Conf{SmbShareName} = 'Plone 2';
 $Conf{SmbShareUserName} = 'WWW1\admin';
 $Conf{SmbSharePasswd} = '';
 $Conf{BackupFilesExclude} = {
 'Plone 2' => ['/Data/log/*', '/Data/var/*']
 };
 
 So what I'm trying to do is backup the folder Plone 2 but exclude the 
 Data/log and Data/var subfolders.
 
 Thanks in advance.
 
 Tony
Use a backslash instead of the forward slash in the excludes.
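
That is, something like (untested):

$Conf{BackupFilesExclude} = {
    'Plone 2' => [ '\Data\log\*', '\Data\var\*' ],
};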
-- 
Travis Fraser [EMAIL PROTECTED]





Re: [BackupPC-users] BackupPC_nightly admin task failing

2006-05-16 Thread Travis Fraser
On Fri, 2006-05-05 at 14:05 -0400, Travis Fraser wrote:
 Hi,
 
 I have been running BackupPC-2.1.1 on a gentoo system for months and it
 has worked fine. Last week I edited the main config file to change the
 $Conf{RsyncArgs} to account for an upgrade to rsync-2.6.8 (I changed the
 flag --devices to -D). I restarted backuppc and since then the backups
 all go as scheduled, but I noticed the pool size (no compression)
 reported as 0 and these errors in the log file:
 
 2006-05-05 01:00:00 Running 2 BackupPC_nightly jobs from 0..15 (out of
 0..15)
 2006-05-05 01:00:00 Running BackupPC_nightly -m 0 127 (pid=25619)
 2006-05-05 01:00:00 Running BackupPC_nightly 128 255 (pid=25620)
 2006-05-05 01:00:00 Next wakeup is 2006-05-05 02:00:00
 2006-05-05 01:00:00  admin : Use of uninitialized value in chdir at 
 /usr/lib/perl5/5.8.7/File/Find.pm line 742.
 2006-05-05 01:00:00  admin : Use of chdir('') or chdir(undef) as chdir() is 
 deprecated at /usr/lib/perl5/5.8.7/File/Find.pm line 742.
 2006-05-05 01:00:00  admin : Use of uninitialized value in concatenation (.) 
 or string at /usr/lib/perl5/5.8.7/File/Find.pm line 743.
 2006-05-05 01:00:00  admin : Can't cd to : Permission denied
 2006-05-05 01:00:00 Finished  admin  (BackupPC_nightly -m 0 127)
 2006-05-05 01:00:00  admin1 : Use of uninitialized value in chdir at 
 /usr/lib/perl5/5.8.7/File/Find.pm line 742.
 2006-05-05 01:00:00  admin1 : Use of chdir('') or chdir(undef) as chdir() is 
 deprecated at /usr/lib/perl5/5.8.7/File/Find.pm line 742.
 2006-05-05 01:00:00  admin1 : Use of uninitialized value in concatenation (.) 
 or string at /usr/lib/perl5/5.8.7/File/Find.pm line 743.
 2006-05-05 01:00:00  admin1 : Can't cd to : Permission denied
 2006-05-05 01:00:00 Finished  admin1  (BackupPC_nightly 128 255)
 2006-05-05 01:00:00 Pool nightly clean removed 0 files of size 0.00GB
 2006-05-05 01:00:00 Pool is 0.00GB, 0 files (0 repeated, 0 max chain, 0 max 
 links), 0 directories
 2006-05-05 01:00:00 Cpool nightly clean removed 0 files of size 0.00GB
 2006-05-05 01:00:00 Cpool is 0.00GB, 0 files (0 repeated, 0 max chain, 0 max 
 links), 0 directories
 2006-05-05 02:00:00 Next wakeup is 2006-05-05 03:00:00
 2006-05-05 03:00:01 Next wakeup is 2006-05-05 04:00:00
 
 I had upgraded perl perhaps a month ago from 5.8.6 to 5.8.7 and rebuilt
 all my perl modules. I searched the list archive but could not find
 anything close to my problem except this:
 http://article.gmane.org/gmane.comp.sysutils.backup.backuppc.general/6455/match=uninitialized+value+in+chdir

Replying to my own message here. I looked closer at my logs and noticed
that I had rebooted the backuppc server (kernel update) at least a day
before the admin oddities started showing in the logs, and that
BackupPC_nightly had run ok the night after the reboot.

I tried re-installing backuppc (leaving the data pool alone), and even
running BackupPC_nightly from the command line which ran with no errors.
Still the errors in the log and still the pool size reported as zero.

I stopped backuppc, ran a fsck which came up clean, and decided to
reboot as a last resort. Last night everything was normal.
-- 
Travis Fraser [EMAIL PROTECTED]





Re: [BackupPC-users] Rsync 2.6.7-1 (fileListReceive failed)

2006-05-16 Thread Travis Fraser

Peter Palfrader wrote:

On Thu, 30 Mar 2006, Andy wrote:

  
Most of my systems run Debian Sarge with rsync 2.6.4-6, but one runs 
Debian Etch and last night rsync updated to 2.6.7-1.


I haven't been able to do any backups of this box since. I get the 
following error in the xfer.log:




  

Got fatal error during xfer (fileListReceive failed)
Backup aborted (fileListReceive failed)



Same here.  I noticed that backuppc always stopped at special files
(sockets, named pipes etc):

2006-05-08 17:56:11 galaxy: overflow: flags=0x6f l1=111 l2=560294251, 
lastname=lib/gdm/.gdmfifo

The solution appears to be telling the remote rsync to pretty please
also transfer these special files:

| $Conf{RsyncArgs} = [
| # original arguments here
| '--numeric-ids',
| '--perms',
| '--owner',
| '--group',
| '--devices',
| '--links',
| '--times',
| '--block-size=2048',
| '--recursive',
|
| # my args
| '--specials',  # <---
| '--one-file-system'
| ];

HTH.
  

You could also replace --devices with -D which also covers --specials.




[BackupPC-users] BackupPC_nightly admin task failing

2006-05-05 Thread Travis Fraser
Hi,

I have been running BackupPC-2.1.1 on a gentoo system for months and it
has worked fine. Last week I edited the main config file to change the
$Conf{RsyncArgs} to account for an upgrade to rsync-2.6.8 (I changed the
flag --devices to -D). I restarted backuppc and since then the backups
all go as scheduled, but I noticed the pool size (no compression)
reported as 0 and these errors in the log file:

2006-05-05 01:00:00 Running 2 BackupPC_nightly jobs from 0..15 (out of
0..15)
2006-05-05 01:00:00 Running BackupPC_nightly -m 0 127 (pid=25619)
2006-05-05 01:00:00 Running BackupPC_nightly 128 255 (pid=25620)
2006-05-05 01:00:00 Next wakeup is 2006-05-05 02:00:00
2006-05-05 01:00:00  admin : Use of uninitialized value in chdir at 
/usr/lib/perl5/5.8.7/File/Find.pm line 742.
2006-05-05 01:00:00  admin : Use of chdir('') or chdir(undef) as chdir() is 
deprecated at /usr/lib/perl5/5.8.7/File/Find.pm line 742.
2006-05-05 01:00:00  admin : Use of uninitialized value in concatenation (.) or 
string at /usr/lib/perl5/5.8.7/File/Find.pm line 743.
2006-05-05 01:00:00  admin : Can't cd to : Permission denied
2006-05-05 01:00:00 Finished  admin  (BackupPC_nightly -m 0 127)
2006-05-05 01:00:00  admin1 : Use of uninitialized value in chdir at 
/usr/lib/perl5/5.8.7/File/Find.pm line 742.
2006-05-05 01:00:00  admin1 : Use of chdir('') or chdir(undef) as chdir() is 
deprecated at /usr/lib/perl5/5.8.7/File/Find.pm line 742.
2006-05-05 01:00:00  admin1 : Use of uninitialized value in concatenation (.) 
or string at /usr/lib/perl5/5.8.7/File/Find.pm line 743.
2006-05-05 01:00:00  admin1 : Can't cd to : Permission denied
2006-05-05 01:00:00 Finished  admin1  (BackupPC_nightly 128 255)
2006-05-05 01:00:00 Pool nightly clean removed 0 files of size 0.00GB
2006-05-05 01:00:00 Pool is 0.00GB, 0 files (0 repeated, 0 max chain, 0 max 
links), 0 directories
2006-05-05 01:00:00 Cpool nightly clean removed 0 files of size 0.00GB
2006-05-05 01:00:00 Cpool is 0.00GB, 0 files (0 repeated, 0 max chain, 0 max 
links), 0 directories
2006-05-05 02:00:00 Next wakeup is 2006-05-05 03:00:00
2006-05-05 03:00:01 Next wakeup is 2006-05-05 04:00:00

I had upgraded perl perhaps a month ago from 5.8.6 to 5.8.7 and rebuilt
all my perl modules. I searched the list archive but could not find
anything close to my problem except this:
http://article.gmane.org/gmane.comp.sysutils.backup.backuppc.general/6455/match=uninitialized+value+in+chdir
-- 
Travis Fraser [EMAIL PROTECTED]





Re: [BackupPC-users] cygwin-rsyncd vs. backuppc-rsyncd

2005-11-21 Thread Travis Fraser
On Mon, 2005-11-21 at 12:41 -0600, Carl Wilhelm Soderstrom wrote:
 On 11/21 01:39 , Joe Hood wrote:
  I am having problem with Cygwin's rsyncd (tunneled through ssh) and
  did not know of a Backuppc rsyncd.
 
 last I knew, rsync+ssh didn't work properly on Windoze machines. Someone
 reported success  recently, but I don't know what they did differently.
 
I think Joe is using rsyncd, not rsync with ssh transport. Rsync does
not work with Cygwin ssh, last I tried. It will work only for small
transfers and then crap out.
-- 
Travis Fraser [EMAIL PROTECTED]





Re: [BackupPC-users] BackupFilesExclude is not working with SMB Transfer

2005-11-02 Thread Travis Fraser
On Wed, 2005-11-02 at 14:50 +0100, Christoph Pfund wrote:
 Hi,
  
 I tried both recommended methods to exclude some directories from
 backup. First I used the recommended notation $conf(BackupFilesExclude)
 = ['/temp/*']; with no success. The whole directory was copied. The
 second try was the standard notation $conf(BackupFilesExclude) =
 ['/temp']; with the same result. Is there a third way to set this
 conf. variable right?
  
This is from my previous post which I think should be in the archives:

I experimented a bit and found that the exclusions do work, and more
than one exclusion can be used. The trick was to prefix a backslash to
the directory or file that is excluded. Forward slashes did not work.
The exclusions are relative to the root of the share, of course.

I am using samba-3.0.14a on a Gentoo box, backing up Windows XP Pro and
Server 2003.

Some examples:

$Conf{BackupFilesExclude} = [ '\foo' ]

$Conf{BackupFilesExclude} = [ '\foo', '\*\bar' ]

Per-share excludes can be done as well:

$Conf{BackupFilesExclude} = { 
'share1$' => [ '\foo' , '\bar' ],
'share2$' => [ '\foobar' ], 
};

-- 
Travis Fraser [EMAIL PROTECTED]





[BackupPC-users] Is there a way to schedule dump from command line

2005-09-06 Thread Travis Fraser
I have looked through the docs and found the command 

BackupPC_serverMesg backup all

to queue all hosts for backup, but was wondering how to do this for just
one host. I would like to be able to trigger the backup from the
server's console.
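
From reading the BackupPC_Admin CGI it looks like the same interface can
queue a single host; from memory the arguments are host IP, host name,
requesting user and a full/incremental flag, so something like the
following (check the source for the exact order):

BackupPC_serverMesg backup myhost myhost backuppc 1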
-- 
Travis Fraser [EMAIL PROTECTED]


