Les Mikesell wrote at about 08:05:33 -0500 on Thursday, June 7, 2012:
On Thu, Jun 7, 2012 at 3:42 AM, Tyler J. Wagner ty...@tolaris.com wrote:
On 2012-06-06 17:45, Jeffrey J. Kosowsky wrote:
I am able to achieve most of what you wish by doing most of my
customization in the individual
Adam Goryachev wrote at about 21:14:52 +1000 on Wednesday, June 6, 2012:
Even better, would be if backuppc could support reading config file
snippets from a directory, that way all the local changes could be
stored in separate files, and the package could upgrade the config.pl
file without
Timothy J Massey wrote at about 12:05:02 -0400 on Thursday, May 31, 2012:
Pascal Mosimann pascal.mosim...@alternatique.ch wrote on 05/31/2012
11:12:54 AM:
Why does BackupPC run the parity command if I've told it not to by
passing
it a 0? And how do I return to the 3.1
See below for comments...
RYAN M. vAN GINNEKEN wrote at about 15:45:40 -0600 on Wednesday, May 23, 2012:
I'm bumping this thread again as it seems deleting must be
possible, but I keep getting this error for these files
Well, I am the author of that script
$
Gerry George wrote at about 16:38:47 -0400 on Thursday, May 17, 2012:
Actually this coincides with an idea I had for using BackupPC as a
backup service. It would have to operate differently from the standard
configuration, though. The system I envisioned was as follows:
-
Gerry George wrote at about 10:27:04 -0400 on Friday, May 18, 2012:
On Fri, May 18, 2012 at 9:55 AM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:
Gerry George wrote at about 16:38:47 -0400 on Thursday, May 17, 2012:
Actually this coincides with an idea I had for using BackupPC
One simple possibility would be to use backuppc-fuse to mount the pc
tree and then use normal *nix routines like diff or cmp to find
differences.
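That first suggestion can be sketched as follows; here two throwaway temporary directories stand in for a pair of fuse-mounted backup trees, and `diff -rq` lists only which files differ rather than how:

```shell
# stand-ins for two backups mounted read-only (e.g. via backuppc-fuse)
old=$(mktemp -d); new=$(mktemp -d)
echo one > "$old/f"; echo two > "$new/f"
# -r recurses, -q reports only the names of differing files;
# diff exits 1 when the trees differ, so don't let that abort a script
out=$(diff -rq "$old" "$new" || true)
echo "$out"
rm -rf "$old" "$new"
```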
If you only care to know which files differ, rather than how and if
you are using the rsync/rsyncd transfer method you could write a
custom perl
See the archives or the Wiki - I have written routines that check the
embedded md4sum checksums (available with rsync transfer method after
the second time a file is backed up) against the file contents. This
checks integrity.
I have also written a routine that adds the md4 checksum to files
Kyle Anderson wrote at about 11:10:26 -0500 on Thursday, March 1, 2012:
I have BackupPC_digestVerify.pl, but I don't understand if this can do
what I'm asking.
This tool looks like it adds and verifies the sums like you say, but can
it actually tell me what the sum is from a known
Brad Alexander wrote at about 14:57:41 -0500 on Friday, February 24, 2012:
Hey all,
I'm running into a problem migrating my /var/lib/backuppc pc
directory. I got cpool, log, pool, tmp, and trash migrated via rsync,
and I am attempting to migrate the pc directory.
It's really not
Brad Morgan wrote at about 16:31:09 -0700 on Thursday, February 23, 2012:
I've also seen a couple of useful scripts (BackupPC_copyPcPool.pl,
BackupPc_deleteFile.pl) and a jLib.pm but I haven't seen any documentation
about how and where to install these files. Could someone point me in the
PLEASE DON'T TOP-POST - it makes it nearly impossible to follow the
thread, especially with multiple people chiming in
Zach Lanich wrote at about 02:01:53 -0500 on Thursday, February 16, 2012:
This is all the Log has in it when I try rsync:
2012-02-15 21:59:02 full backup started for
Ingo P. Korndoerfer wrote at about 11:04:27 +0100 on Monday, February 13, 2012:
hello,
I have been going around in circles and pretty much grazed all I could
find on Google and then finally found a way to
get this to work, and thought it might be worth communicating this, so it
can
Ingo P. Korndoerfer wrote at about 13:19:59 +0100 on Monday, February 13, 2012:
hello,
here comes the next question I could not find answered anywhere. Please
feel free to just point me to older posts
if this has been discussed a 1000 times already ...
so i have different directory
Fred Warren wrote at about 09:13:43 -0800 on Monday, February 13, 2012:
I would like to run backup-pc on site and keep a duplicate copy offsite.
So I want 2 backup-pc servers. One onsite and one offsite. With the
offsite copy not running, but the data being synced with the onsite copy.
Timothy J Massey wrote at about 16:51:34 -0500 on Thursday, February 9, 2012:
Hello!
I've set up a new backup server, and for the first time I haven't disabled
compression. BackupPC is now creating log files in (what it is claiming
is) .z format. How do I read these? I've tried
Timothy J Massey wrote at about 17:40:43 -0500 on Thursday, February 9, 2012:
Bowie Bailey bowie_bai...@buc.com wrote on 02/09/2012 05:01:32 PM:
On 2/9/2012 4:51 PM, Timothy J Massey wrote:
Hello!
I've set up a new backup server, and for the first time I haven't
disabled
Timothy J Massey wrote at about 09:52:03 -0500 on Friday, February 10, 2012:
Jeffrey J. Kosowsky backu...@kosowsky.org wrote on 02/10/2012 08:55:34
AM:
Timothy J Massey wrote at about 17:40:43 -0500 on Thursday, February 9,
2012:
At the price of making it, at a very minimum, very
Timothy J Massey wrote at about 09:41:53 -0500 on Friday, February 10, 2012:
I usually monitor backups (especially when I've just created a new guest
to back up or when I'm having problems) by tail -f /path/to/XferLog. I
can't do that with these compressed log files (or, I can't figure
Timothy J Massey wrote at about 11:27:36 -0500 on Friday, February 10, 2012:
Jeffrey J. Kosowsky backu...@kosowsky.org wrote on 02/10/2012 10:33:38
AM:
The point is that both 'compress' and BackupPC use 'zlib' compression,
hence rather than creating some non-standard new suffix, Craig
Rob Hasselbaum wrote at about 11:37:49 -0500 on Friday, February 10, 2012:
On Fri, Feb 10, 2012 at 10:46 AM, Les Mikesell lesmikes...@gmail.com wrote:
I don't know anything about tarsnap but it looks like it has its own way
of tracking incremental changes. Is there some reason you can't
Lars Tobias Skjong-Børsting wrote at about 17:41:05 +0100 on Thursday, February
9, 2012:
Hi,
On 3/27/09 7:44 PM, Paul Mantz wrote:
On Thu, Mar 26, 2009 at 8:01 PM, o...@jltechinc.com
wrote:
Is it possible to get a CVS copy?
I tried: cvs -z3.2
Les Mikesell wrote at about 07:40:20 -0600 on Wednesday, February 1, 2012:
On Wed, Feb 1, 2012 at 2:30 AM, Kimball Larsen quang...@gmail.com wrote:
Do any
have local time machine backups that might be included?
No, time machine is on external drives, specifically excluded from
Kimball Larsen wrote at about 09:30:38 -0700 on Wednesday, February 1, 2012:
I just wanted to follow up with a description of what I changed to solve
this:
First off, the users with performance problems on their machines
during backups all had a copy of Parallels (Windows emulation
Flako wrote at about 11:47:40 -0300 on Saturday, January 28, 2012:
Hello
I'm trying to use shadowmountrsync 0.4.5.3 on Windows XP SP3 and cygwin
2.763.
Sshd and rsyncd services are working properly.
Commands: shadowmountrsync -u 2 and shadowmountrsync -d work
properly if run locally
smallpox wrote at about 07:15:07 -0800 on Friday, January 20, 2012:
I was under the impression that rsync does the comparison with little or
no bandwidth.
First, PLEASE DON'T TOP-POST - it makes following and responding
to a thread nearly impossible.
Second, what makes you think the issue is
Till Hofmann wrote at about 16:23:56 +0100 on Friday, January 20, 2012:
Hello everybody,
since my backup partition is on a RAID5 which doesn't do anything but
keeping my backups, I want the hard drives to automatically spin down
(standby) when there is nothing to do.
It's working
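One common way to get idle disks to spin down (not spelled out in the truncated snippet above, so this is a suggestion, not the poster's method) is hdparm's -S flag. Its argument is encoded, which trips people up; a sketch of the encoding, with the actual hdparm call left commented out and /dev/sdb as a placeholder device:

```shell
# hdparm -S takes an encoded timeout: values 1..240 mean n*5 seconds,
# values 241..251 mean (n-240)*30 minutes. So -S 242 requests a
# 60-minute spindown timeout.
n=242
minutes=$(( (n - 240) * 30 ))
echo "$minutes minutes"
# sudo hdparm -S "$n" /dev/sdb   # apply to each RAID member disk
```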
Stefan Peter wrote at about 21:00:22 +0100 on Friday, January 20, 2012:
On 01/20/2012 08:49 PM, Jeffrey J. Kosowsky wrote:
In summary, I think you are trying to solve a problem that may not
need to be solved, using a tool that is not meant to solve it, without
understanding what
The laptops I back up have both a wired and wireless Ethernet
connection with (different) MAC addresses.
I use static DNS so that when the laptops are attached at home they
are given a fixed (known) IP address so that BackupPC can find them
using my /etc/hosts file.
On my old D-Link router, I
Timothy J Massey wrote at about 17:13:44 -0500 on Monday, January 16, 2012:
Peter Thomassen m...@peter-thomassen.de wrote on 01/16/2012 12:31:05 AM:
On 01/11/2012 08:00 PM, Timothy J Massey wrote:
I would add this: 45 GB and 185,000 files is, in my opinion, far from
big.
I have
Timothy J Massey wrote at about 17:26:43 -0500 on Monday, January 16, 2012:
Jeffrey J. Kosowsky backu...@kosowsky.org wrote on 01/16/2012 05:00:45
PM:
The problem is that my new Verizon router does not allow the same IP
address to be correlated with different MAC addresses.
So
Jeffrey J. Kosowsky wrote at about 17:00:45 -0500 on Monday, January 16, 2012:
The laptops I back up have both a wired and wireless Ethernet
connection with (different) MAC addresses.
I use static DNS so that when the laptops are attached at home they
are given a fixed (known) IP
Les Mikesell wrote at about 17:57:27 -0600 on Monday, January 16, 2012:
On Mon, Jan 16, 2012 at 5:32 PM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:
1) Use a better (real) Firewall
That would be, by *FAR*, the best solution. Other solutions:
2) Use a better
Daniel wrote at about 21:25:02 +0100 on Saturday, January 7, 2012:
Only this deletion problem... This should be top priority, and should
already be in the software :-) (I mean I know it is open source etc
and that I did not develop it, but it's a feature I think many of us
would like and
Ralph Weaver wrote at about 17:31:19 -0500 on Wednesday, January 4, 2012:
We ran into an issue with a backuppc restore where we had unknown files
causing issues due to their existence. We tried to restore, thinking it would
remove the extraneous files as a normal rsync with the --delete option
upen wrote at about 13:10:17 -0600 on Tuesday, January 3, 2012:
Hi,
I find some of expired backups in Trash directory. What is the correct
way to restore data from directories under 'trash' I found
For example, I found directory '1325037949_12970_0/f%2fexport%2fhome'
in trash which I
JP Vossen wrote at about 21:50:29 -0500 on Wednesday, December 21, 2011:
I'm running Debian Squeeze stock backuppc-3.1.0-9 on a server and I'm
getting kernel messages [1] and SMART errors [2] about the WD 2TB SATA
disk. Fine, I RMA'd it and have the new one... Now what? I know I can
Mark Maciolek wrote at about 13:37:27 -0500 on Thursday, December 15, 2011:
On 12/15/2011 1:31 PM, Zach La Celle wrote:
We just upgraded our backup machine and are using an external USB3 hard
drive for backups.
Last night, something went wrong, and when I got in this morning I saw
Arnold Krille wrote at about 18:10:03 +0100 on Friday, December 2, 2011:
On Friday 02 December 2011 17:33:41 Igor Sverkos wrote:
Hi,
today I browsed through the backup data folder. Is it normal that
folders look like
/var/lib/BackupPC/pc/foo.example.org/252/f%2f/fetc
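Yes, those names are normal: BackupPC mangles stored file names by prefixing each path component with 'f' and percent-escaping '/' as %2f and '%' as %25, so f%2f is the mangled share name "/". A small sketch of undoing that for a single component (the helper name is mine, not BackupPC's):

```shell
# unmangle one BackupPC file-name component: strip the leading 'f',
# then decode %2f -> / and %25 -> %
unmangle() {
  printf '%s\n' "$1" | sed -e 's/^f//' -e 's|%2f|/|g' -e 's/%25/%/g'
}
unmangle 'f%2f'   # the share name "/"
unmangle 'fetc'   # the directory "etc"
```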
Tim Fletcher wrote at about 21:21:45 + on Thursday, November 10, 2011:
On Thu, 2011-11-10 at 15:56 -0500, SSzretter wrote:
It would be great if a flag could be set to tell backuppc to only
backup a machine if it is in a specific subnet range (192.168.2.x) and
to skip it if
Steve M. Robbins wrote at about 22:19:44 -0500 on Monday, October 24, 2011:
Hi,
One thing that all these methods have in common is that they scan the
entire pool filesystem. I accept that I will have to do that at
least initially. However, to send daily updates, it seems unnecessary
Gail Hardmann wrote at about 14:11:40 +0200 on Thursday, October 20, 2011:
Hi experts
I am not a Linux expert, but I've succeeded in installing BackupPC 3.2.1 on
a DNS-323 NAS (running an ARM GNU Linux distribution).
BackupPC seems to be working - I can also access it from the CGI
Holger Parplies wrote at about 17:54:05 +0200 on Thursday, October 6, 2011:
Hi,
Tim Fletcher wrote on 2011-10-06 10:17:03 +0100 [Re: [BackupPC-users] Bad
md5sums due to zero size (uncompressed) cpool files - WEIRD BUG]:
On Wed, 2011-10-05 at 21:35 -0400, Jeffrey J. Kosowsky wrote
Holger Parplies wrote at about 02:45:56 +0200 on Friday, October 7, 2011:
Hi,
Jeffrey J. Kosowsky wrote on 2011-10-06 19:28:38 -0400 [Re: [BackupPC-users]
Bad md5sums due to zero size (uncompressed) cpool files - WEIRD BUG]:
Holger Parplies wrote at about 17:54:05 +0200 on Thursday
Jeffrey J. Kosowsky wrote at about 18:58:51 -0400 on Tuesday, October 4, 2011:
After the recent thread on bad md5sum file names, I ran a check on all
my 1.1 million cpool files to check whether the md5sum file names are
correct.
I got a total of 71 errors out of 1.1 million files:
- 3
Holger Parplies wrote at about 05:46:36 +0200 on Friday, October 7, 2011:
Hi,
Jeffrey J. Kosowsky wrote on 2011-10-06 22:54:44 -0400 [Re: [BackupPC-users]
Bad md5sums due to zero size (uncompressed) cpool files - WEIRD BUG]:
OK... this is a little weird maybe...
[...]
On all
Holger Parplies wrote at about 17:41:48 +0200 on Wednesday, October 5, 2011:
Hi,
Jeffrey J. Kosowsky wrote on 2011-10-04 18:58:51 -0400 [[BackupPC-users] Bad
md5sums due to zero size (uncompressed) cpool files - WEIRD BUG]:
After the recent thread on bad md5sum file names, I ran
After the recent thread on bad md5sum file names, I ran a check on all
my 1.1 million cpool files to check whether the md5sum file names are
correct.
I got a total of 71 errors out of 1.1 million files:
- 3 had data in them (though each file was only a few hundred bytes
long)
- 68 of the 71 were
Gail Hardmann wrote at about 12:24:00 +0300 on Saturday, October 1, 2011:
[BackupPC-users] BUG SOLUTION: Can't call method "getStats" on an
undefined value
Jeffrey: I have encountered exactly the same problem. Thank you for your
bug solution.
I am trying to run BackupPC on a
Holger Parplies wrote at about 05:39:15 +0200 on Sunday, October 2, 2011:
Mike Dresser wrote on 2011-09-29 14:11:20 -0400 [[BackupPC-users] Fairly
large backuppc pool (4TB) moved with backuppc_tarpccopy]:
[...] Did see a few errors, all of them were related to the attrib files,
similar
Mike Dresser wrote at about 10:51:14 -0400 on Sunday, October 2, 2011:
Can you point me to those? My method was to rsync the everything but the
pc dir, and then used backuppc_tarpccopy to create a tar file of the
hardlinks.. I probably could have saved a few days by directly extracting
Mike Dresser wrote at about 15:44:32 -0400 on Sunday, October 2, 2011:
On Sun, 2 Oct 2011, Jeffrey J. Kosowsky wrote:
If you want to troubleshoot, I would do the following:
I'm currently running 3.1.0, so that probably answers why I'm seeing
these. Thought I was on 3.2 for some
Holger Parplies wrote at about 04:48:10 +0200 on Monday, October 3, 2011:
You might want to use parts of either Jeffrey's or my script. Jeffrey builds
a
pool of information which files use which inode. I build a file with pretty
much the same information. Neither needs to be stored on
Tim Connors wrote at about 11:15:31 +1000 on Thursday, September 29, 2011:
On Wed, 28 Sep 2011, Timothy J Massey wrote:
Arnold Krille arn...@arnoldarts.de wrote on 09/28/2011 11:20:57 AM:
I'm sure someone with more shell-fu will give you a much better
command
line (and I
Timothy J Massey wrote at about 10:30:18 -0400 on Wednesday, September 28, 2011:
Gerald Brandt g...@majentis.com wrote on 09/28/2011 10:15:12 AM:
I need to search for a specific file on a host, via backuppc. Is
there a way to search a host backup, so I don't have to manually go
Matthias Meyer wrote at about 00:04:50 +0200 on Saturday, October 1, 2011:
Hi (Jeff ;-)
I would like to try your BackupPC_copyPCPool.pl to backup my BackupPC
storage to another server.
Unfortunately this other server has no BackupPC installed.
I've copied FileZIO.pm, Lib.pm,
maar...@tepaske.net wrote at about 10:29:16 +0200 on Monday, September 26, 2011:
So I am currently writing some scripts for my backup needs. Which made
me wonder, BackupPC essentially starts a backup like this:
/usr/bin/ssh -4 -q -l backuppc host sudo /usr/bin/rsync --server --sender
Tim Fletcher wrote at about 19:53:25 +0100 on Monday, September 26, 2011:
On Mon, 2011-09-26 at 15:37 +0200, Maarten te Paske wrote:
OK, I will read a bit more into the rsync documentation. I thought this
way I wouldn't be able to limit the privileges through sudo, but maybe
I'm
Markus Fröhlich wrote at about 18:43:01 +0200 on Thursday, September 22, 2011:
BackupPC processes run as user wwwrun - this is the Apache user -
because of the permissions needed to make configuration changes over the web interface.
The archive request gets started by a cron job and a small script once
James L. Evans wrote at about 13:50:38 -0400 on Monday, September 12, 2011:
After further experimenting, it appears that with squeeze (unlike lenny)
that running BackupPC_nightly and BackupPC_dump at the same time is the
problem. After changing the schedule so that BackupPC_nightly runs
Matthias Meyer wrote at about 00:21:25 +0200 on Wednesday, September 14, 2011:
Jeffrey J. Kosowsky wrote:
Matthias Meyer wrote at about 15:29:09 +0200 on Sunday, September 11,
2011:
Dear all,
I have a problem backing up a large file (9GB) because the internet
Matthias Meyer wrote at about 15:29:09 +0200 on Sunday, September 11, 2011:
Dear all,
I have a problem backing up a large file (9GB) because the internet
connection of the client interrupts every 24 hours.
BackupPC (V3.1.0) can rsync this file once with status:
md4 doesn't match:
hans...@gmail.com wrote at about 20:09:02 +0700 on Friday, September 9, 2011:
I realize that, and thought my posting details on my precautionary
procedures would sufficiently demonstrate my awareness of the fact
that I'm living on the edge.
Please take this as constructive criticism, but
Tyler J. Wagner wrote at about 11:58:00 +0100 on Friday, September 9, 2011:
On 2011-09-05 17:38, Jeffrey J. Kosowsky wrote:
You probably want to read the documentation under --help (and also
perhaps at the head of the executable). But you probably want to use
the --fixlinks|-f option
hans...@gmail.com wrote at about 21:28:24 +0700 on Friday, September 9, 2011:
Anyway, I'll try to keep quiet for a while, in particular not
discussing my ideas to have bootable HDDs implementing BackupPC-based
per-client personal Time Machines. 8-)
Can anyone suggest a more suitable
Kenneth Porter wrote at about 14:35:08 -0700 on Monday, September 5, 2011:
My client Windows XP boxes are failing to register with my WINS server
(running nmbd from Samba). I'm puzzled how to figure out what I'm doing
wrong.
I'm setting up BackupPC to back up my Windows clients using
Tyler J. Wagner wrote at about 09:46:39 +0100 on Monday, September 5, 2011:
On 2011-09-02 18:06, Jeffrey J. Kosowsky wrote:
Why do you assume something is wrong with how you are using the
program?
The error message is saying that you have a bunch of files in the pc
tree
hans...@gmail.com wrote at about 14:18:41 +0700 on Saturday, September 3, 2011:
On Sat, Sep 3, 2011 at 11:09 AM, Timothy J Massey tmas...@obscorp.com wrote:
But would probably be a very good idea. What would be an even better idea
would be to grab a spare PC (or a virtual guest) and
Tyler J. Wagner wrote at about 17:40:16 +0100 on Friday, September 2, 2011:
Hi all (well, Jeff, really),
I'm trying to copy a very large pool (1 TB, 70 hosts, about 20 backups
each) to another server with a bigger disk array. I'm using
BackupPC_copyPcPool. However, on running it I get a
Timothy J Massey wrote at about 10:43:37 -0400 on Friday, September 2, 2011:
Your old backups should be 100% fine. They will remain in the pool just
fine, etc. I do not believe that files transferred by rsync will pool
with files transferred by tar (due to the attribute issue you
Carl Wilhelm Soderstrom wrote at about 07:51:10 -0500 on Thursday, August 25,
2011:
On 08/25 07:50 , Brad Alexander wrote:
Really a small thing, but when doing a restore, and you save as a .zip or
.tar, instead of defaulting to a generic and non-descriptive filename of
Carl Wilhelm Soderstrom wrote at about 08:31:01 -0500 on Thursday, August 25,
2011:
On 08/25 09:23 , Jeffrey J. Kosowsky wrote:
I would make it consistent with the hierarchy:
hostname-backup#-share
I'm not sure what date adds since the date is irrelevant unless you
are referring
jiahwei wrote at about 20:27:09 -0700 on Sunday, August 21, 2011:
But somehow it doesn't backup my Home directory, even though I tried
indicate /home to the backup/
WHAT??? This sentence fragment means nothing... What is 'it'? What are
you talking about?
Are you replying to some other
Jonathan Schaeffer wrote at about 15:17:15 +0200 on Friday, August 12, 2011:
Hello,
this is not directly connected to backuppc itself, but I thought I would
find people here with a good experience of my problem.
I am backing up windows clients to a central server using the shadow
David Uhlmann wrote at about 11:55:57 +0200 on Thursday, August 11, 2011:
Dear all,
I want to run a Backup to 2 NAS-Systems, attached via NFS. Because BackupPC
can do backups to only one folder, my idea is this:
Create Directory /backup
Run a Cron Job to mount the NAS via NFS,
ft oppi wrote at about 13:05:07 +0200 on Thursday, August 11, 2011:
Hello list,
I've read the wiki and part of the list, but the solutions described there
don't satisfy me completely, so I'm looking for something else.
I have two old Linux servers running BackupPC 3.1.0 and I need to
friedmann wrote at about 06:19:35 -0700 on Tuesday, August 9, 2011:
I also noticed a good clue, pointing out the number of subdirectories.
Hence, I have counted the number of subdirectories on the backed-up
drive, using: find . -type d | wc -l, which returned:
82779
Well
Les Mikesell wrote at about 12:21:54 -0500 on Monday, August 8, 2011:
On 8/8/2011 9:30 AM, Phil K. wrote:
Look in to using the nolock option when mounting the device. Rrdtool
can't get a file lock, which is likely causing your performance issues.
I have the same device, and had similar
Holger Parplies wrote at about 19:34:27 +0200 on Monday, August 8, 2011:
Hi,
as Jeffrey said, we'll need meaningful information to give meaningful
answers.
Oh my goodness, did Holger top-quote? Say it isn't so :P
One thing I can answer, though:
friedmann wrote on 2011-08-08
Les Mikesell wrote at about 16:59:45 -0500 on Monday, August 8, 2011:
On 8/8/2011 3:28 PM, Jeffrey J. Kosowsky wrote:
FYI, On 32-bit Fedora 12/Linux 2.6.32:
Ext2/3: MAX=32000
Ext4: MAX=65000
This presumably should be true more generally for any relatively
non
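Since a directory's hard-link count is 2 plus its number of subdirectories, the count reported by stat shows how close any given directory is to that ext2/3 cap of 32000. A minimal sketch, using a throwaway directory in place of a real pc/ or pool path:

```shell
# a directory's link count is 2 ('.' plus its own name) + one '..'
# entry per subdirectory; compare it against the filesystem limit
demo=$(mktemp -d)
mkdir "$demo/a" "$demo/b" "$demo/c"
links=$(stat -c '%h' "$demo")   # GNU stat; 2 + 3 subdirectories = 5
echo "$links"
rm -rf "$demo"
```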
Holger Parplies wrote at about 23:59:49 +0200 on Monday, August 8, 2011:
Hi,
Jeffrey J. Kosowsky wrote on 2011-08-08 16:28:28 -0400 [Re: [BackupPC-users]
Too many links : where is the problem ?]:
Holger Parplies wrote at about 19:34:27 +0200 on Monday, August 8, 2011:
Hi
Les Mikesell wrote at about 17:43:41 -0500 on Monday, August 8, 2011:
On 8/8/2011 5:33 PM, Jeffrey J. Kosowsky wrote:
Les Mikesell wrote at about 16:59:45 -0500 on Monday, August 8, 2011:
On 8/8/2011 3:28 PM, Jeffrey J. Kosowsky wrote:
FYI, On 32-bit Fedora 12
Jeffrey J. Kosowsky wrote at about 16:28:28 -0400 on Monday, August 8, 2011:
Holger Parplies wrote at about 19:34:27 +0200 on Monday, August 8, 2011:
Hi,
as Jeffrey said, we'll need meaningful information to give meaningful
answers.
Oh my goodness, did Holger Top Quote
Holger Parplies wrote at about 02:03:42 +0200 on Tuesday, August 9, 2011:
Hi,
Jeffrey J. Kosowsky wrote on 2011-08-08 19:41:18 -0400 [Re: [BackupPC-users]
Too many links : where is the problem ?]:
Les Mikesell wrote at about 17:43:41 -0500 on Monday, August 8, 2011:
On 8/8/2011 5
Holger Parplies wrote at about 02:58:57 +0200 on Tuesday, August 9, 2011:
Hi,
Jeffrey J. Kosowsky wrote on 2011-08-05 12:08:47 -0400 [Re: [BackupPC-users]
How do I configure DumpPreUserCmd to have multiple commands]:
You are exactly right that 'bash -c blah blah blah' fails since
Holger Parplies wrote at about 03:06:23 +0200 on Tuesday, August 9, 2011:
Hi,
Jeffrey J. Kosowsky wrote on 2011-08-08 16:15:38 -0400 [Re: [BackupPC-users]
NFS woes]:
Les Mikesell wrote at about 12:21:54 -0500 on Monday, August 8, 2011:
On 8/8/2011 9:30 AM, Phil K. wrote
Jeffrey J. Kosowsky wrote at about 19:42:41 -0400 on Monday, August 8, 2011:
Jeffrey J. Kosowsky wrote at about 16:28:28 -0400 on Monday, August 8, 2011:
Holger Parplies wrote at about 19:34:27 +0200 on Monday, August 8, 2011:
Hi,
as Jeffrey said, we'll need meaningful
Steve wrote at about 19:15:20 -0400 on Monday, August 8, 2011:
On Mon, Aug 8, 2011 at 6:39 PM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:
Finally, out of curiosity, I grepped the BackupPC code base for the
error language "too many links" cited verbatim by the OP and found
Brad Alexander wrote at about 16:41:44 -0400 on Friday, August 5, 2011:
As the subject posits, is it possible to issue a dump pre- or post-command
only on certain types of backups? For instance, we run bacula at work, and
apparently the director states what kind of backup is running, either
Holger Parplies wrote at about 15:01:45 +0200 on Friday, August 5, 2011:
Hi,
Jeffrey J. Kosowsky wrote on 2011-08-04 01:37:23 -0400 [Re: [BackupPC-users]
How do I configure DumpPreUserCmd to have multiple commands]:
Rory Toma wrote at about 18:15:32 -0700 on Wednesday, August 3, 2011
Jeffrey J. Kosowsky wrote at about 11:36:13 -0400 on Friday, August 5, 2011:
Holger Parplies wrote at about 15:01:45 +0200 on Friday, August 5, 2011:
All I can say is that the form bash -c "code..." does work.
I use the following 'monstrosity' to query if rsyncd is running and
start cygwin
Rory Toma wrote at about 18:15:32 -0700 on Wednesday, August 3, 2011:
It appears that if I feed it a list of ';'-separated commands, it
executes the first command and assumes everything else is an argument.
In my case, I want to do something like:
rsh -n machine command; rsh -n
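The behavior Rory describes is what you get when a command is executed without an intervening shell, so ';' is passed along as a literal argument. Besides the bash -c form discussed later in the thread, a common workaround is to put the whole sequence in one wrapper script and point DumpPreUserCmd at that single executable. A sketch (the path is hypothetical and the echo lines are placeholders for the real 'rsh -n machine command' steps):

```shell
# create a hypothetical wrapper script; DumpPreUserCmd would then be
# set to just /tmp/predump.sh, a single command with no ';' involved
cat > /tmp/predump.sh <<'EOF'
#!/bin/sh
set -e                      # stop at the first failing step
echo "pre-dump step 1"      # placeholder for 'rsh -n machine command'
echo "pre-dump step 2"      # placeholder for the second command
EOF
chmod +x /tmp/predump.sh
/tmp/predump.sh
```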
Holger Parplies wrote at about 03:29:00 +0200 on Wednesday, July 20, 2011:
Hi,
sorry for not replying earlier. In case you're still wondering (otherwise for
the archives) ...
Jeffrey J. Kosowsky wrote on 2011-02-07 14:15:05 -0500 [[BackupPC-users]
*BUMP* *BUMP* Re: BackupPC perl
Lib.pm has a bug that appears twice:
The 2 occurrences of:
$cmd = join(" ", $cmd) if ( ref($cmd) eq "ARRAY" );
should be replaced by:
$cmd = join(" ", @$cmd) if ( ref($cmd) eq "ARRAY" );
Otherwise you are joining array refs rather than arrays, which is
meaningless/an error.
Also, I would suggest
Alain Péan wrote at about 17:41:40 +0200 on Wednesday, July 27, 2011:
Cygwin is another method to access the data, using rsync or rsyncd. In
fact, BackupPC can use four methods to backup the PCs : smb (for
windows), rsync, rsyncd, or FTP. You can configure the method you want
for each
Whenever I do a full backup and DumpPreShareCmd fails, I get the
following error in my log:
Can't call method "getStats" on an undefined value at
/usr/share/BackupPC/bin/BackupPC_dump line 1160.
I posted a similar bug report back in December, but now I believe I
have figured out
Richard Shaw wrote at about 16:40:10 -0500 on Wednesday, July 27, 2011:
On Wed, Jul 27, 2011 at 2:42 PM, C. Ronoz chro...@eproxy.nl wrote:
Depending on how comfortable you are building your own packages,
Fedora has 3.2.1 almost ready to go. We had to package two perl
modules for the
Arch 32
Thanks!
Richard Shaw wrote at about 19:24:03 -0500 on Thursday, July 28, 2011:
On Thu, Jul 28, 2011 at 4:44 PM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:
Any chances of backporting this to older Fedora versions? (I still run
Fedora 12)
It wouldn't take me very long
-- so I
think in my 3.2.0 version, I might have used the FC12 or FC13 version
rather than FC14+ though my memory is a bit foggy there...
Richard Shaw wrote at about 19:33:25 -0500 on Thursday, July 28, 2011:
On Thu, Jul 28, 2011 at 4:44 PM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:
Any
Holger Parplies wrote at about 00:34:56 +0200 on Tuesday, July 12, 2011:
Well I hope you don't have many files ... how about either
'chown -R backuppc:backuppc /archive' (assuming that's TopDir) - there are no
files under TopDir *not* belonging to backuppc, or at least there shouldn't
be,