Raman Gupta wrote:
Another thing that would be useful (with or without the change above)
is if the RsyncArgs had the runtime variable substitution turned on
for at least the ConfDir and the host name. For example, I tried this:
$Conf{RsyncArgs} = [
[...],
'--exclude-from=$confDir/pc/$host.exclude',
];
but it did not work.
Cheers,
Raman
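A workaround worth noting: per-host config files are plain Perl evaluated at load time, so the path can be built there directly instead of relying on runtime substitution. A minimal sketch, assuming the common /etc/BackupPC layout and a host named myhost (both names are assumptions, not from the thread):

```perl
# In /etc/BackupPC/pc/myhost.pl (path and host name are assumptions):
# append to the globally configured rsync arguments rather than
# replacing them wholesale.
push @{$Conf{RsyncArgs}},
    '--exclude-from=/etc/BackupPC/pc/myhost.exclude';
```

This sidesteps the substitution question entirely, at the cost of repeating the hostname inside its own config file.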
The backup for one of my hosts did not run two nights ago. I can't
find any indication of the reason why -- no errors at all. The logs
indicate the backup didn't even start. My other two hosts completed
successfully.
This is the first time this has happened in about three weeks, and
without me
I have three hosts configured to backup to my PC. Here are the speeds
from the host summary:
host 1: 24.77 GB, 14,000 files, 18.78 MB/s (slower WAN link)
host 2: 1.27 GB, 4,000 files, 1.89 MB/s (faster WAN link)
host 3: 4.82 GB, 190,000 files, 0.66 MB/s (fast LAN link)
They all use
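A back-of-the-envelope check (not from the original thread) suggests per-file overhead, not link speed, explains the spread: dividing size by file count shows host 3 averages tiny files.

```shell
# Average file size per host, from the summary numbers above.
awk 'BEGIN {
  printf "host 1: %.2f MB/file\n", 24.77 * 1024 / 14000;
  printf "host 2: %.2f MB/file\n", 1.27 * 1024 / 4000;
  printf "host 3: %.2f MB/file\n", 4.82 * 1024 / 190000;
}'
```

Host 1 averages about 1.81 MB per file while host 3 averages roughly 0.03 MB, so per-file costs (stat, attribute comparison, checksum negotiation) dominate on host 3 and a fast LAN link can still post the lowest MB/s.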
Walter Francis wrote:
One thing which I really need to be able to do is utilize the rsync exclude
files which I maintain for each of my backup sets. Is it possible to do this,
either via config file, or (ideally), via something like
/etc/backuppc/exclude/hostname.exclude where hostname is
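BackupPC can also express per-host excludes natively through $Conf{BackupFilesExclude} in the per-host config file, which may be simpler than wiring external exclude files into the rsync arguments. A minimal sketch, assuming a single share named '/' (the share name and paths are assumptions):

```perl
# /etc/BackupPC/pc/hostname.pl (path is an assumption)
$Conf{BackupFilesExclude} = {
    '/' => [ '/proc', '/sys', '/tmp', '/var/cache' ],
};
```

Existing rsync exclude files can still be reused by passing --exclude-from through the rsync arguments, but the paths then have to be resolvable on the server side.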
On 03/05/2010 09:01 AM, Josh Malone wrote:
Also - you need a good filesystem to handle lots (or even not so many) of
backups. I recently switched from EXT3 to EXT4 and saw an order of magnitude
(I kid you not, 10+ hours to 1) reduction in the backup time and system
load. Unfortunately, I think
On 04/27/2010 02:39 PM, Les Mikesell wrote:
On 4/27/2010 8:09 AM, Tyler J. Wagner wrote:
On Tuesday 27 April 2010 05:24:04 chitowner watertower wrote:
I would like to suggest- if any dev team members happen to see this- that
it would be a nifty option if some kind of progress indicator
On 05/11/2010 02:52 PM, Nick Bright wrote:
Precisely correct. I removed those port arguments and set the SSH server
on the target machine back to port 22 and it's working. Now I need to
figure out how to properly tell BackupPC that SSH is on a non-standard
port. I thought that I was doing it
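One common way to point BackupPC at a non-standard SSH port is to put -p into the client command rather than into the rsync transfer arguments. A sketch for BackupPC 3.x, assuming port 2222; the rest of the string mirrors the shipped default but may differ by version:

```perl
# -p 2222 is the only change from the stock command (port is an assumption)
$Conf{RsyncClientCmd} = '$sshPath -p 2222 -q -x -l root $host $rsyncPath $argList+';
```

An alternative that leaves the BackupPC config untouched is a per-host `Port` entry in the backuppc user's ~/.ssh/config.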
On 01/10/2011 04:41 AM, mohammad tayebi wrote:
*Hi BackupPC Users*
I have a problem.
My BackupPC server uses RAID 5, mounted at /var/lib/backuppc.
My question is about this log entry:
* 2011-01-09 20:00:03 Got fatal error during xfer (aborted by signal=PIPE)*
There are many reasons this
On 09/18/2011 10:29 PM, Adam Monsen wrote:
AHA, adding the following entry to BackupFilesOnly '*' worked:
/opt/backup/stage/other/*
Now that directory is backed up. But I don't understand why. :)
Because
/opt/backup/stage/other
backs up the directory entry itself, but not its contents.
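This mirrors rsync's own include/exclude semantics, which the pattern matching follows: a rule matching a directory admits the directory entry itself, but its children must still pass the filter on their own. A standalone sketch with plain rsync (the temporary paths are arbitrary):

```shell
rm -rf /tmp/demo
mkdir -p /tmp/demo/src/other /tmp/demo/a /tmp/demo/b
echo data > /tmp/demo/src/other/file.txt

# Directory matched, contents not: 'other' arrives empty.
rsync -a --include='/other' --exclude='*' /tmp/demo/src/ /tmp/demo/a/

# Adding '/other/*' lets the contents through as well.
rsync -a --include='/other' --include='/other/*' --exclude='*' \
      /tmp/demo/src/ /tmp/demo/b/

ls /tmp/demo/a/other   # empty
ls /tmp/demo/b/other   # file.txt
```

The same logic is why '/opt/backup/stage/other/*' in BackupFilesOnly pulls in the contents while the bare directory path does not.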
For disaster recovery purposes, I have been periodically backing up my
BackupPC pool to external storage. I have a small pool of
approximately 300 GB on a Linux server, and currently use rsync to
copy the pool to storage and keep it updated.
I am considering moving my DR backup to an online
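One detail that matters for a v3-style pool copy: deduplication lives in hard links, so the rsync invocation needs -H (and ideally --delete), or the destination balloons to the fully expanded size. A small demo that -H preserves links (temporary paths are illustrative; the real invocation would be along the lines of `rsync -aH --delete /var/lib/backuppc/ /mnt/external/backuppc/`, with both paths assumptions):

```shell
rm -rf /tmp/hsrc /tmp/hdst
mkdir -p /tmp/hsrc
echo x > /tmp/hsrc/a
ln /tmp/hsrc/a /tmp/hsrc/b          # two names, one inode, as in the pool
rsync -aH --delete /tmp/hsrc/ /tmp/hdst/
stat -c %i /tmp/hdst/a /tmp/hdst/b  # identical inode numbers => link kept
```

Without -H, the two destination files would get distinct inodes and double the space.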
I have a client on which about 100 GB of data has been moved from one
directory to another -- otherwise it's exactly the same.
As I understand it, since the data has been moved, BackupPC 3 will
transfer all the data again (and discard it once it realizes the data
is already in the pool) i.e. it
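A toy sketch of why the pool absorbs the re-transfer without growing: storage is keyed by a content digest, so the moved files hash to entries that already exist. Here md5sum stands in for BackupPC's digest and all paths are illustrative:

```shell
rm -rf /tmp/pooldemo
mkdir -p /tmp/pooldemo/pool
echo payload > /tmp/pooldemo/old_path
cp /tmp/pooldemo/old_path /tmp/pooldemo/new_path   # same data, "moved"

for f in old_path new_path; do
  key=$(md5sum "/tmp/pooldemo/$f" | cut -d' ' -f1)
  if [ -e "/tmp/pooldemo/pool/$key" ]; then
    echo "$f: already pooled, nothing stored"    # dedup hit
  else
    cp "/tmp/pooldemo/$f" "/tmp/pooldemo/pool/$key"
    echo "$f: new pool entry"
  fi
done
```

The bandwidth cost of the transfer is unavoidable in v3, but the second path dedups against the first, so the pool itself does not grow.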
yourself a favor and switch your router to use the OpenDNS or
Google DNS servers. You'll probably get better performance, as well as
real NXDOMAIN responses.
Regards,
Raman Gupta
Here is version 0.1.4 from 2009:
https://gist.github.com/rocketraman/ebce662290da354222c2
I don't know if it is the latest that was available before the wiki
disappeared.
Regards,
Raman
On 02/07/2015 11:47 AM, Carl T. Miller wrote:
When I searched for BackupPC_DeleteFile.pl it appears
that
re-create the problem? It would be great to track down what went
> wrong. (I still need to look at your log file snippets in more detail,
> hopefully this weekend.)
>
> Craig
>
> On Wed, May 8, 2019 at 10:24 PM Raman Gupta wrote:
>>
>> So the issue was definitely with t
backup. Once this was done, a
manually triggered full backup then finally completed with no issues.
Regards,
Raman
On Wed, May 8, 2019 at 11:32 AM Raman Gupta wrote:
>
> The backup is large, but not huge. I created a new test host pointing
> only to /home/raman/x and it worked just fine.
Hi guys... I just ran into an interesting situation. I was looking for
some files from my backup, and noticed that nothing had been backed up
in my home directory since late 2017.
Upon investigation, it looks like the Fedora packages started adding
"--one-file-system" by default in the
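If the packaged default is the culprit, one sketch of a fix is to strip the flag back out in the local config rather than editing the packaged file. This assumes the distribution injects it into the standard BackupPC rsync argument list; whether it lands in $Conf{RsyncArgs} or $Conf{RsyncArgsExtra} depends on the package:

```perl
# In config.pl (or a per-host override), after the packaged defaults:
$Conf{RsyncArgs} = [ grep { $_ ne '--one-file-system' } @{$Conf{RsyncArgs}} ];
```

The alternative is to keep the flag and list each mounted filesystem as its own share, which makes cross-mount coverage explicit.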
Certain directories (and their contents) on one of my hosts are not
getting backed up at all, even with a "Full" backup.
I use rsync as my Xfer method, with BackupPC 4.3.0 on Fedora (rpms
BackupPC-4.3.0-1.fc29.x86_64, BackupPC-XS-0.58-1.fc29.x86_64).
Looking at the backup logs, I see messages
tion) and looking in the
> XferLOG file. When the initial file list is sent, are those directories and
> their contents present in the file list?
>
> Craig
>
> On Tue, May 7, 2019 at 4:26 PM Michael Stowe
> wrote:
>>
>> On 2019-05-07 13:39, Raman Gupta wrote:
ing on.
>> >
>> > I'd recommend turning on additional debug in rsync (eg, add -vvv to
>> > $Conf{RsyncArgs}, and also look at the --debug option) and looking in the
>> > XferLOG file. When the initial file list is sent, are those directories
>> > and t
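The debugging step suggested above amounts to a small config addition; a sketch, assuming BackupPC 4's $Conf{RsyncArgsExtra} (rsync's --debug categories vary by version; FILTER is one relevant example):

```perl
# Temporarily increase rsync chatter, ideally in the per-host config only:
push @{$Conf{RsyncArgsExtra}}, '-vvv', '--debug=FILTER';
```

With that in place, the XferLOG shows the initial file list and which filter rule dropped each missing directory.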
I believe you can just rename the host in the config file, and rename
the host directory under `pc` in the backuppc data directory.
After you rename the hosts to whatever makes sense, use
ClientNameAlias to ensure the renamed host points to the *real*
hostname. See
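The alias itself is a one-line per-host setting; a sketch, assuming the display name was changed to laptop while the machine still answers as old-host.example.com (both names are assumptions):

```perl
# /etc/BackupPC/pc/laptop.pl (path and names are assumptions)
$Conf{ClientNameAlias} = 'old-host.example.com';
```

BackupPC then uses the alias for the actual connection while the web UI and the pc directory keep the friendly name.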