[BackupPC-users] Disk space/Target Issue

2020-11-02 Thread Steve Zemlicka
I have just spun up a new instance of BackupPC and want to back up my
new FreeNAS to my old QNAP.  I have NFS set up, with the QNAP
location mounted at /mnt/backups.  How do I create a symlink to
direct the backups to the NFS mount?

I thought I had done this, but it ended up being backwards and filled
up my disk.  I've also tried moving the pc folder and creating the
symlink from /var/lib/backuppc/pc to /mnt/backups/backuppc/pc, but this
broke the service with the following error.

(Unfortunately, journalctl -xe and systemctl don't seem to be
word wrapping, but the gist of the error is as follows:)

Nov 02 15:28:46 bdr01 backuppc[929]: Starting backuppc...2020-11-02
15:28:46 Can't create a test hardlink between a file in
/var/lib/backuppc/pc and /var/lib/backuppc/cpool.  Either these are
different file systems, or this file system
Nov 02 15:28:46 bdr01 systemd[1]: backuppc.service: Control process
exited, code=exited, status=1/FAILURE

I also tried moving the entire backuppc directory and creating a
symlink from /var/lib/backuppc to /mnt/backups/backuppc, but this
gave the same error.

How do I properly direct the backups to a different location?
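For context: the startup error above is BackupPC's check that the pool, cpool and pc trees all live on one filesystem (it creates a test hardlink between them), so symlinking just one subdirectory onto NFS cannot work. The usual fix is to mount the export over the whole data directory. A sketch, where the NAS hostname and export path are placeholders:

```
# /etc/fstab sketch: mount the QNAP export over BackupPC's entire data
# directory, so pool, cpool and pc are all on ONE filesystem (which is
# exactly what the startup hardlink test verifies).
# qnap.local and /share/backups are placeholders for your NAS and export.
qnap.local:/share/backups  /var/lib/backuppc  nfs  rw,hard,noatime  0  0
```

Stop BackupPC first, move the existing contents onto the mount, make sure ownership is still backuppc:backuppc, then restart. The NFS export must support hardlinks for this to work.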


___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:https://github.com/backuppc/backuppc/wiki
Project: https://backuppc.github.io/backuppc/


Re: [BackupPC-users] BSD (FreeNAS) rsync init

2020-11-02 Thread Steve Zemlicka
Thank you for your response, Craig.  I took a step back and
reevaluated how I was doing this, and I think my issues were all a
result of user accounts and various system-level restrictions.  I'm
sure we could've picked our way through that, but since this is just a
home lab, I'm being a bit lazy.  After a great deal of user setup, key
copying, visudo modifications, user SID changes, etc., I seemed no closer
to a solution.  I did notice that by default BackupPC targets the root
user and, while I understand it's not always best practice to use the
root user, since this is a home lab setup I just went with it.  By
adding the backuppc user's key to the root user on the target machine
and adding a visudo line so rsync can be run as root, the
backups worked as expected.
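For anyone following along, the two changes described above might look like this on the FreeNAS target (paths are FreeBSD-style and illustrative):

```
# On the FreeNAS/FreeBSD target (sketch; adjust paths to your system).

# The approach used above: allow the BackupPC server's key to log in as root.
cat backuppc_id_rsa.pub >> /root/.ssh/authorized_keys

# The non-root alternative: a sudoers line, added via visudo (never by
# editing sudoers directly), letting a normal login run rsync passwordless:
#   backuppc ALL=NOPASSWD: /usr/local/bin/rsync
```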

On Sun, Nov 1, 2020 at 10:04 PM Craig Barratt via BackupPC-users
 wrote:
>
> Does the error happen at the start of the transfer, or partway through?
>
> It would be helpful to know which version you are running, and the relevant 
> lines from the XferLOG file.
>
>
> Craig
>
> On Sun, Nov 1, 2020 at 1:56 PM Steve Zemlicka  wrote:
>>
>> Update:  I was able to use visudo to add the following line:
>> backuppc ALL=NOPASSWD: /usr/local/bin/rsync
>>
>> I am now able to run "sudo /usr/local/bin/rsync" when SSHed in without
>> being prompted for a password.  I've also manually specified the rsync
>> path in RsyncClientCmd, but I still get a "connection
>> reset by peer" error when running.
>>
>> On Sun, Nov 1, 2020 at 3:39 PM Steve Zemlicka  
>> wrote:
>> >
>> > I am having trouble getting rsync to initialize without requiring a
>> > password.  I have set up passwordless SSH login and verified it is
>> > working from the backuppc server.  However, if I then sudo rsync (or
>> > sudo /usr/bin/rsync), I am prompted for a password.  Does anyone have
>> > a link to this specific issue, or know offhand how to enable this on
>> > BSD (FreeNAS)?
>>
>>




Re: [BackupPC-users] BSD (FreeNAS) rsync init

2020-11-01 Thread Steve Zemlicka
Update:  I was able to use visudo to add the following line:
backuppc ALL=NOPASSWD: /usr/local/bin/rsync

I am now able to run "sudo /usr/local/bin/rsync" when SSHed in without
being prompted for a password.  I've also manually specified the rsync
path in RsyncClientCmd, but I still get a "connection
reset by peer" error when running.
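One way to check the whole transport end-to-end, outside BackupPC (run as the backuppc user on the server; the hostname is a placeholder):

```
# This approximates what BackupPC's rsync method runs over SSH.  If this
# prompts for a password or the connection is reset, BackupPC will fail
# the same way.  freenas.local stands in for the target host.
ssh -x -l backuppc freenas.local sudo /usr/local/bin/rsync --version
```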

On Sun, Nov 1, 2020 at 3:39 PM Steve Zemlicka  wrote:
>
> I am having trouble getting rsync to initialize without requiring a
> password.  I have set up passwordless SSH login and verified it is
> working from the backuppc server.  However, if I then sudo rsync (or
> sudo /usr/bin/rsync), I am prompted for a password.  Does anyone have
> a link to this specific issue, or know offhand how to enable this on
> BSD (FreeNAS)?




[BackupPC-users] BSD (FreeNAS) rsync init

2020-11-01 Thread Steve Zemlicka
I am having trouble getting rsync to initialize without requiring a
password.  I have set up passwordless SSH login and verified it is
working from the backuppc server.  However, if I then sudo rsync (or
sudo /usr/bin/rsync), I am prompted for a password.  Does anyone have
a link to this specific issue, or know offhand how to enable this on
BSD (FreeNAS)?




Re: [BackupPC-users] Reset web interface password

2019-04-08 Thread Steve
Thanks Craig

Steve

On Sat, 6 Apr 2019 21:47:35 -0700
Craig Barratt via BackupPC-users 
wrote:

> This is enforced by Apache, not BackupPC.  So please look in your
> apache config and see what type of authentication is specified for
> BackupPC's CGI script.  If it's basic authentication, then the
> passwords are set with htpasswd.
> 
> Craig
> 
> On Sat, Apr 6, 2019 at 11:12 AM Steve  wrote:
> 
> > Hi,
> >
> > When I go to http://localhost/Backuppc, it asks me for a username
> > and password. It's been a long time since I set this up and I don't
> > remember which user is the admin. Is it $Conf{CgiAdminUsers}
> > from /etc/backuppc/config.pl?
> >
> > This was set to just backuppc and so I
> > added my user id to the list and restarted backuppc and apache but
> > it still won't take my normal login password. Do I need a separate
> > password for this?
> >
> > I'm on a Debian based system running v3.3.1-4.
> >
> >  $ dpkg -l | grep -i backuppc
> > ii  backuppc 3.3.1-4  amd64 high-performance, enterprise-grade
> > system for backing up PCs
> >
> > I only found documentation for the v4 codebase.
> >
> > Thanks,
> > Steve
> >
> >
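To expand on Craig's answer: with basic authentication, the password lives in an htpasswd file referenced by the Apache config, not in BackupPC itself. A sketch, where the file path is the Debian default and may differ on your system:

```
# Find which password file the BackupPC CGI's auth config points at
grep -ri authuserfile /etc/apache2/ 2>/dev/null

# Reset the password for an existing web user (Debian's default file shown)
htpasswd /etc/backuppc/htpasswd backuppc

# Add a new web user; -c creates the file, so never use -c on an existing one
htpasswd /etc/backuppc/htpasswd steve
```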





[BackupPC-users] Reset web interface password

2019-04-06 Thread Steve
Hi,

When I go to http://localhost/Backuppc, it asks me for a username and
password. It's been a long time since I set this up and I don't
remember which user is the admin. Is it $Conf{CgiAdminUsers}
from /etc/backuppc/config.pl?

This was set to just backuppc and so I
added my user id to the list and restarted backuppc and apache but it
still won't take my normal login password. Do I need a separate
password for this?

I'm on a Debian based system running v3.3.1-4.

 $ dpkg -l | grep -i backuppc
ii  backuppc 3.3.1-4  amd64 high-performance, enterprise-grade system
for backing up PCs

I only found documentation for the v4 codebase.

Thanks,
Steve




Re: [BackupPC-users] Large files with small changes

2018-11-20 Thread Steve Richards
Thanks for confirming, Craig. There are lots of approaches that I could 
use to reduce the duplication, but they would all add complexity 
needlessly if BackupPC was already storing just deltas at a high 
granularity (and I completely understand the decision not to do so).


I'm currently looking at MySQL Incremental Backup (which in turn uses 
AutoMysqlBackup) - I haven't quite got it working yet but it should be 
better than rolling my own.


SteveR.

On 20/11/2018 17:39, Craig Barratt via BackupPC-users wrote:

Steve,

You are exactly right - BackupPC's storage granularity is whole 
files.  So, in the worst case, a single-byte change to a unique file 
will result in a new file in the pool.  Rsync will only transfer the 
deltas, but the full file gets rebuilt on the server.


Before I did the 4.x rewrite, I did some benchmarking on block-level 
or more granular deltas, but the typical performance improvement was 
modest and the effort to implement it was large.  However, there are 
two cases where block-level or byte-level deltas would be very helpful 
- database files (as you mentioned) and VM images.


Perhaps you could use $Conf{DumpPreUserCmd} to run a script that 
generates byte-level deltas, and exclude the original database files?  
You could have a weekly schedule where you copy the full database file 
on, eg, Sunday, and generate deltas every other day of the week.  Then 
BackupPC will backup the full file once, and also grab each of the 
deltas.  That way you'll have a complete database file once per week, 
and all the daily (cumulative) deltas.
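Craig's weekly-full-plus-daily-deltas idea could be sketched like this (a hypothetical $Conf{DumpPreUserCmd} helper; xdelta3, the paths, and the schedule are illustrative choices, not BackupPC defaults):

```
#!/bin/sh
# Hypothetical DumpPreUserCmd helper: keep one weekly full database dump
# plus cumulative daily xdelta3 deltas against it.  BackupPC then backs
# up the small delta files daily and the full file once a week.
DUMPDIR=/var/backups/db          # directory BackupPC backs up
DOW=$(date +%u)                  # day of week, 7 = Sunday

mysqldump --all-databases > "$DUMPDIR/current.sql"

if [ "$DOW" -eq 7 ] || [ ! -f "$DUMPDIR/base.sql" ]; then
    # Sunday (or first run): refresh the weekly full
    mv "$DUMPDIR/current.sql" "$DUMPDIR/base.sql"
else
    # Other days: store only the cumulative delta against Sunday's full
    xdelta3 -e -f -s "$DUMPDIR/base.sql" "$DUMPDIR/current.sql" \
        "$DUMPDIR/delta-$(date +%F).xd3"
    rm "$DUMPDIR/current.sql"    # keep the full file out of daily backups
fi
```

Restoring a mid-week state would then mean applying the matching delta with `xdelta3 -d -s base.sql`.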


Craig

On Tue, Nov 20, 2018 at 9:28 AM Steve Richards <b...@boxersoft.com> wrote:


Thanks. Yes, I had seen that in the docs but I got the impression
that the deltas referred to there were at the granularity of whole
files. For example, let's say backup 1 contains files A, B and C.
If B is then modified, during the next backup rsync might
only /transfer/ the deltas needed to change B to B1 e.g. "replace
line 5 with [new content]". I got the impression that those deltas
would be used to create B1 though, and that both complete files (B
and B1) would be stored in the pool as whole files. The deltas
referred to in the docs would then be how to get from one /backup/
to another e.g. "Delete file B, insert file B1" (or vice versa,
depending on whether it's BackupPC V3 or V4).

So that's the way I interpreted it, but I'm very new to this so I
may have got the wrong end of the stick completely. If anyone
could confirm or correct my understanding, I'd appreciate it
either way.

Thanks for the comments on mysqldump, I'll take a look at those
options.

SteveR.

On 20/11/2018 14:05, Mike Hughes wrote:


Hi Steve,

It looks like they are stored using reverse deltas. Maybe you’ve
already seen this from the V4.0 documentation:

  * Backups are stored as "reverse deltas" - the most recent
backup is always filled and older backups are reconstituted
by merging all the deltas starting with the nearest future
filled backup and working backwards.

This is the opposite of V3 where incrementals are stored as
"forward deltas" to a prior backup (typically the last full
backup or prior lower-level incremental backup, or the last full
in the case of rsync).

  * Since the most recent backup is filled, viewing/restoring
that backup (which is the most common backup used) doesn't
require merging any deltas from other backups.
  * The concepts of incr/full backups and unfilled/filled storage
are decoupled. The most recent backup is always filled. By
default, for the remaining backups, full backups are filled
and incremental backups are unfilled, but that is configurable.

Additionally these tips might help apply deltas to the files and
reduce transfer bandwidth:

MySQL dump has an option  ‘--order-by-primary’ which sorts
before/while dumping the database. Useful if you’re trying to
limit the amount to be rsync’ed. You’ll need to evaluate the
usefulness of this based on db design.

If you’re compressing your database look into the “--rsyncable”
option available in the package pigz.

*From:* Steve Richards <b...@boxersoft.com>
*Sent:* Tuesday, November 20, 2018 04:34
*To:* backuppc-users@lists.sourceforge.net
<mailto:backuppc-users@lists.sourceforge.net>
*Subject:* [BackupPC-users] Large files with small changes

I think some backup programs are able to store just the changes
("deltas") in a file when making incrementals. Am I right in
thinking that BackupPC doesn't do this, and would instead store
the whole of each changed file as separate entries in the pool?

Reason for asking is that I want to implement a backup strategy
   

Re: [BackupPC-users] Large files with small changes

2018-11-20 Thread Steve Richards
Thanks. Yes, I had seen that in the docs but I got the impression that 
the deltas referred to there were at the granularity of whole files. For 
example, let's say backup 1 contains files A, B and C. If B is then 
modified, during the next backup rsync might only /transfer/ the 
deltas needed to change B to B1 e.g. "replace line 5 with [new 
content]". I got the impression that those deltas would be used to 
create B1 though, and that both complete files (B and B1) would be 
stored in the pool as whole files. The deltas referred to in the docs 
would then be how to get from one /backup/ to another e.g. "Delete file 
B, insert file B1" (or vice versa, depending on whether it's BackupPC V3 
or V4).


So that's the way I interpreted it, but I'm very new to this so I may 
have got the wrong end of the stick completely. If anyone could confirm 
or correct my understanding, I'd appreciate it either way.


Thanks for the comments on mysqldump, I'll take a look at those options.

SteveR.

On 20/11/2018 14:05, Mike Hughes wrote:


Hi Steve,

It looks like they are stored using reverse deltas. Maybe you’ve 
already seen this from the V4.0 documentation:


  * Backups are stored as "reverse deltas" - the most recent backup is
always filled and older backups are reconstituted by merging all
the deltas starting with the nearest future filled backup and
working backwards.

This is the opposite of V3 where incrementals are stored as "forward 
deltas" to a prior backup (typically the last full backup or prior 
lower-level incremental backup, or the last full in the case of rsync).


  * Since the most recent backup is filled, viewing/restoring that
backup (which is the most common backup used) doesn't require
merging any deltas from other backups.
  * The concepts of incr/full backups and unfilled/filled storage are
decoupled. The most recent backup is always filled. By default,
for the remaining backups, full backups are filled and incremental
backups are unfilled, but that is configurable.

Additionally these tips might help apply deltas to the files and 
reduce transfer bandwidth:


MySQL dump has an option  ‘--order-by-primary’ which sorts 
before/while dumping the database. Useful if you’re trying to limit 
the amount to be rsync’ed. You’ll need to evaluate the usefulness of 
this based on db design.


If you’re compressing your database look into the “--rsyncable” option 
available in the package pigz.
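Combining the two tips above might look like this (the database name and output path are placeholders):

```
# Dump rows in primary-key order so day-to-day diffs stay small, then
# compress with rsync-friendly block resets so rsync can still match chunks.
mysqldump --order-by-primary mydb | pigz --rsyncable > /var/backups/mydb.sql.gz
```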


*From:* Steve Richards 
*Sent:* Tuesday, November 20, 2018 04:34
*To:* backuppc-users@lists.sourceforge.net
*Subject:* [BackupPC-users] Large files with small changes

I think some backup programs are able to store just the changes 
("deltas") in a file when making incrementals. Am I right in thinking 
that BackupPC doesn't do this, and would instead store the whole of 
each changed file as separate entries in the pool?


Reason for asking is that I want to implement a backup strategy for 
databases, which is likely to involve multi-megabyte SQL files that 
differ only slightly from day to day. I'm trying to decide how best to 
handle them.






[BackupPC-users] Large files with small changes

2018-11-20 Thread Steve Richards
I think some backup programs are able to store just the changes 
("deltas") in a file when making incrementals. Am I right in thinking 
that BackupPC doesn't do this, and would instead store the whole of each 
changed file as separate entries in the pool?


Reason for asking is that I want to implement a backup strategy for 
databases, which is likely to involve multi-megabyte SQL files that 
differ only slightly from day to day. I'm trying to decide how best to 
handle them.




Re: [BackupPC-users] Browsing backups: view files directly instead of downloading

2018-11-16 Thread Steve Richards

On 16/11/2018 19:39, B wrote:

you just have to remember 2 things:
* BackupPC is a _backup_ software, not a viewer of any kind,
* Better is the best enemy of good.

When you have good software that does what it is meant to do and has
no bugs, the beginning of lots of troubles is when you decide to
"improve" it with useless "functionalities".


I don't disagree with any of that, but neither do I think it really 
addresses the point.


As I understand it web servers are typically configured to serve 
responses with headers that describe the content's MIME type, allowing 
the browser to decide how to present it (render in a tab, launch an 
application, offer a File Save dialog). That's not useless functionality, 
it's a standard feature of web servers - without it you would never have 
web pages rendered in your browser, you'd have to download the pages and 
then open them.


The line I quoted from the documentation implies (to me, at least) that 
this is the expected behaviour when clicking on a link while browsing 
backups. I'm not asking for it to be added as a feature, I'm asking:


1) Am I correct in thinking that this is what the documentation means?
2) If yes, can anyone suggest where I should look to see what I have 
configured wrongly?

3) If no, what does the documentation mean?


You have just been excommunicated by the Apache foundation.


The machine running BackupPC has other stuff that is built on Nginx, and 
I really don't want a second web server on it. I run Apache everywhere 
else, so the Foundation will probably cut me a little slack.


SteveR.


[BackupPC-users] Browsing backups: view files directly instead of downloading

2018-11-16 Thread Steve Richards
I'm a BackupPC newbie but I seem to have things up and running nicely. 
Seems a very impressive tool. One thing that doesn't seem to work for me 
as described in the documentation is the handling of files when browsing 
backups. The docs say:


Your browser should prompt you with the filename and ask you whether to 
open the file or save it to disk.


I took that to mean that I would have the option to view the contents of 
the file, either directly in the browser for content it can render 
(text, pdf etc.) or by opening the application associated with the 
relevant MIME type. I don't get that option though, I just get a File 
Save dialog box. That allows me to save the file locally, after which I 
can successfully open it. For those times when you just want to take a 
peep at a previous version though, it's not quite as convenient as 
opening it directly.


My setup uses Nginx rather than Apache (because the machine already runs 
Nginx). Could that be the source of the glitch, or have I perhaps 
misunderstood how it's supposed to work?


SteveR.




Re: [BackupPC-users] Problem trying to restore to remote host

2018-09-21 Thread Steve Palm
On Sep 20, 2018, at 3:13 AM, Craig Barratt via BackupPC-users 
<backuppc-users@lists.sourceforge.net> wrote:
> What version of BackupPC are you using?

I am on 4.2.1 with the latest versions of the other modules.

> It looks like you are trying to restore /Users/dickk/Desktop and 
> /Users/dickk/Documents to another share (/Shared Items/Net Programs) and path 
> (/Ministry/DickK_Restore/).

 That is correct, it is a good summary of what I'm trying to do.

> The "Trimming /dickk from filesList" looks correct; it's the common path in 
> the files you want to restore, so it gets stripped from the file list, and 
> added to the rsync_bpc arguments.  Yes, rsync_bpc is expecting to chdir to 
> /dickk (below host dickk, share /Users), and we need to figure out why it 
> can't do that.
> 
> Does this directory exist: /mnt/stage/BackupPC/pc/dickk/1069/fUsers/fdickk?

This is what the path looks like:

/mnt/stage/BackupPC/pc/dickk/1069/f%2fUsers%2f/fdickk

I'm not sure where the %2f comes from, is that a / character?

Looking at the config for that host, the RsyncShareName is: /Users/
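(Answering the %2f question: yes, it is a percent-encoded '/'. BackupPC stores each path component "mangled": prefixed with 'f', with special characters such as '/' and '%' percent-encoded, so the share /Users/ becomes f%2fUsers%2f. Decoding one by hand:)

```shell
# Undo BackupPC-style name mangling for one component: strip the leading
# 'f', then turn each %2f back into a '/'.
echo 'f%2fUsers%2f' | sed -e 's/^f//' -e 's/%2f/\//g'
# prints: /Users/
```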

For what it's worth, I was able to do a tar restore of the same folders, it 
just won't restore to the other host. The user's computer no longer exists, so 
I can't restore it back to that computer.

Thanks,
 Steve


> On Wednesday, September 19, 2018, Steve Palm <n9...@n9yty.com> wrote:
> Can anyone help me determine what the problem is here?  Trying to restore 
> from a backup onto a different host. The original backup was for computer 
> 'dickk' and trying to restore it to 'xserve1':
> 
> I inserted blank lines for readability, but everything looks OK except for 
> the 
> 
> Trimming /dickk from filesList
> 
> Wrote source file list to 
> /mnt/stage/BackupPC/pc/xserve1/.rsyncFilesFrom23795: /Desktop /Documents
> 
> Running: /usr/local/bin/rsync_bpc --bpc-top-dir /mnt/stage/BackupPC 
> --bpc-host-name dickk --bpc-share-name /Users/ --bpc-bkup-num 1069 
> --bpc-bkup-comp 1 --bpc-bkup-merge 1069/1/3 --bpc-attrib-new --bpc-log-level 
> 1 -e /usr/bin/ssh\ -T\ -q\ -x\ -l\ backuppc --rsync-path=nice\ -n\ 15\ sudo\ 
> /usr/local/bin/rsync --recursive --super --numeric-ids --perms --owner 
> --group -D --times --links --hard-links --delete --partial 
> --log-file-format=log:\ %o\ %i\ %B\ %8U,%8G\ %9l\ %f%L --stats --protect-args 
> --files-from=/mnt/stage/BackupPC/pc/xserve1/.rsyncFilesFrom23795 /dickk 
> xserve1.sga.org:/Shared\ Items/Net\ Programs/Ministry/DickK_Restore/
> 
> This is the rsync child about to exec /usr/local/bin/rsync_bpc
> 
> rsync_bpc: change_dir "/dickk" failed: No such file or directory (2)
> 
> Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 filesTotal, 0 
> sizeTotal, 0 filesNew, 0 sizeNew, 0 sizeNewComp, 0 inode
> 
> rsync error: errors selecting input/output files, dirs (code 3) at 
> flist.c(2022) [sender=3.0.9.12]
> 
> rsync_bpc exited with fatal status 3 (768) (rsync error: errors selecting 
> input/output files, dirs (code 3) at flist.c(2022) [sender=3.0.9.12])
> 
> restore failed: rsync error: errors selecting input/output files, dirs (code 
> 3) at flist.c(2022) [sender=3.0.9.12]
> 
> I don't understand the failure to change_dir to "/dickk" ?  The pc dir does 
> exist on the stage volume '/mnt/stage/BackupPC/pc/dickk/'.
> 
> The RestoreInfo file in the destination pc directory has:
> 
> %RestoreReq = (
>   'fileList' => [
> '/dickk/Desktop',
> '/dickk/Documents'
>   ],
>   'shareDest' => '/Shared Items/Net Programs',
>   'pathHdrDest' => '/Ministry/DickK_Restore/',
>   'num' => '1069',
>   'reqTime' => 1537371681,
>   'shareSrc' => '/Users/',
>   'pathHdrSrc' => '/dickk',
>   'hostDest' => 'xserve1',
>   'hostSrc' => 'dickk',
>   'user' => 'admin'
> );
> 
> Can anyone help me find out what is going on?  Thanks!
> 
> Steve
> 
> 
> 



[BackupPC-users] Problem trying to restore to remote host

2018-09-19 Thread Steve Palm
Can anyone help me determine what the problem is here?  Trying to restore from 
a backup onto a different host. The original backup was for computer 'dickk' 
and trying to restore it to 'xserve1':

I inserted blank lines for readability, but everything looks OK except for the 

Trimming /dickk from filesList

Wrote source file list to /mnt/stage/BackupPC/pc/xserve1/.rsyncFilesFrom23795: 
/Desktop /Documents

Running: /usr/local/bin/rsync_bpc --bpc-top-dir /mnt/stage/BackupPC 
--bpc-host-name dickk --bpc-share-name /Users/ --bpc-bkup-num 1069 
--bpc-bkup-comp 1 --bpc-bkup-merge 1069/1/3 --bpc-attrib-new --bpc-log-level 1 
-e /usr/bin/ssh\ -T\ -q\ -x\ -l\ backuppc --rsync-path=nice\ -n\ 15\ sudo\ 
/usr/local/bin/rsync --recursive --super --numeric-ids --perms --owner --group 
-D --times --links --hard-links --delete --partial --log-file-format=log:\ %o\ 
%i\ %B\ %8U,%8G\ %9l\ %f%L --stats --protect-args 
--files-from=/mnt/stage/BackupPC/pc/xserve1/.rsyncFilesFrom23795 /dickk 
xserve1.sga.org:/Shared\ Items/Net\ Programs/Ministry/DickK_Restore/

This is the rsync child about to exec /usr/local/bin/rsync_bpc

rsync_bpc: change_dir "/dickk" failed: No such file or directory (2)

Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 filesTotal, 0 
sizeTotal, 0 filesNew, 0 sizeNew, 0 sizeNewComp, 0 inode

rsync error: errors selecting input/output files, dirs (code 3) at 
flist.c(2022) [sender=3.0.9.12]

rsync_bpc exited with fatal status 3 (768) (rsync error: errors selecting 
input/output files, dirs (code 3) at flist.c(2022) [sender=3.0.9.12])

restore failed: rsync error: errors selecting input/output files, dirs (code 3) 
at flist.c(2022) [sender=3.0.9.12]

I don't understand the failure to change_dir to "/dickk" ?  The pc dir does 
exist on the stage volume '/mnt/stage/BackupPC/pc/dickk/'.

The RestoreInfo file in the destination pc directory has:

%RestoreReq = (
  'fileList' => [
'/dickk/Desktop',
'/dickk/Documents'
  ],
  'shareDest' => '/Shared Items/Net Programs',
  'pathHdrDest' => '/Ministry/DickK_Restore/',
  'num' => '1069',
  'reqTime' => 1537371681,
  'shareSrc' => '/Users/',
  'pathHdrSrc' => '/dickk',
  'hostDest' => 'xserve1',
  'hostSrc' => 'dickk',
  'user' => 'admin'
);

Can anyone help me find out what is going on?  Thanks!

Steve





Re: [BackupPC-users] BackupPC 4.2.0 released

2018-05-03 Thread Steve Palm
I think that would cover it here; as you said, if you give someone Admin 
rights, they can alter any other settings. Only a 
compile/build/install-time option to remove it entirely would eliminate this 
possibility.

It is a great feature to have, especially with some restrictions on 
availability. Thanks!

> On Apr 21, 2018, at 7:43 PM, Craig Barratt via BackupPC-users 
> <backuppc-users@lists.sourceforge.net> wrote:
> 
> I just pushed some changes 
> <https://github.com/backuppc/backuppc/commit/5ed68f32f7df4869eb21ba015dce3ed34b08d8d8>
>  that add a new config variable CgiUserDeleteBackupEnable (default off) which 
> sets whether users can delete backups via the CGI interface.  Admins always 
> have the delete feature enabled.
> 
> Craig
> 
> On Fri, Apr 20, 2018 at 11:05 AM, Craig Barratt 
> <cbarr...@users.sourceforge.net> wrote:
> This is a very good point.
> 
> How about I add a configuration setting that has three values - completely 
> off, admin only, or any user?  The default setting could be admin only.
> 
> However, if it's turned off, any admin could change that setting back to 
> admin only.
> 
> Craig
> 
> On Monday, April 16, 2018, Steve Palm <n9...@n9yty.com> wrote:
> 
> On Apr 16, 2018, at 7:47 AM, Ghislain Adnet <gad...@aqueos.com> wrote:
> > On 15/04/2018 at 01:10, Craig Barratt via BackupPC-users wrote:
> >> BackupPC 4.2.0 <https://github.com/backuppc/backuppc/releases/tag/4.2.0> 
> >> has been released on Github.
> >> The changes since 4.1.5 
> >> <https://github.com/backuppc/backuppc/releases/tag/4.1.5> are listed 
> >> below.  The biggest change is a new feature in the web interface written 
> >> by @moisseev that allows prior backups to be deleted.
> > 
> > ohhh this is a very bad idea... Having a way to remove backups in the web 
> > interface sounds cool, but when a bad-apple employee comes along and 
> > destroys all the backups because he is angry, this is a real issue. Same 
> > if an account is compromised
>  .
>  .
>  .
> >  is there a way to remove the feature so it's not even loaded in the code 
> > (not just limited by the login/pass used)?
> 
>  I didn't see where it was even configurable by user/login/etc...  If it is, 
> please post, and also a global "shutoff" would be great. Maybe a 
> compile/install option to not even include it as requested above, although 
> for our use case I don't think we need to go that far, hope I'm not ever 
> proven wrong on that. :)
> 
>  Thanks!
>  Steve
> 
> 

--
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC 4.2.0 released

2018-04-16 Thread Steve Palm

On Apr 16, 2018, at 7:47 AM, Ghislain Adnet <gad...@aqueos.com> wrote:
> Le 15/04/2018 à 01:10, Craig Barratt via BackupPC-users a écrit :
>> BackupPC 4.2.0 <https://github.com/backuppc/backuppc/releases/tag/4.2.0> has 
>> been released on Github.
>> The changes since 4.1.5 
>> <https://github.com/backuppc/backuppc/releases/tag/4.1.5> are listed below.  
>> The biggest change is a new feature in the web interface written by 
>> @moisseev that allows prior backups to be deleted.
> 
> ohhh, this is a very bad idea... Having a way to remove backups in the web 
> interface sounds cool, but when a bad-apple employee comes along and destroys 
> all the backups because he is angry, that is a real issue. The same applies 
> if an account is compromised
 .
 .
 .
>  is there a way to remove the feature so it's not even loaded in the code (not 
> just limited by the login/pass used)?

 I didn't see where it was even configurable by user/login/etc.  If it is, 
please post; a global "shutoff" would also be great. Maybe add a 
compile/install option to not even include it, as requested above, although for 
our use case I don't think we need to go that far. Hope I'm never proven 
wrong on that. :)

 Thanks!
 Steve


--
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] offsite server

2018-04-16 Thread Steve Palm

On Apr 10, 2018, at 2:02 PM, Johan Ehnberg <jo...@molnix.com> wrote:
> Yes, this is quite feasible and straightforward to set up as long as you have 
> control over the selection of filesystem.
> 
> Using a Copy-on-Write filesystem with snapshots allows you to very 
> efficiently replicate backups from the main server offsite. No rsync needed, 
> just 'zfs send' or equivalent tool for the selected filesystem. It will not 
> have to search for and detect the differences between the repositories like 
> rsync, instead it transfers the incremental changes from the filesystem since 
> last snapshot. Since the filesystem already keeps track of these, the effort 
> is minimal. You also have more control over the transfer data stream than 
> with rsync (read: multithreaded high compression ratio algos, use accelerated 
> VPN instead of SSH etc.).

 This seems like a possible solution for making a mirror disk set to take 
offsite with rotation, wouldn't it?
 
 I've been thinking about this for a while, but have never had the time to dig 
deeply into it or test anything... Our system was set up with an XFS filesystem 
for the pool storage, based on recommendations at the time, and I have been 
thinking about how to dump that to a set of disks that could be taken offsite 
and used to rebuild a recovery server if needed.  Our original intention was to 
back up the pool with Bacula or something to tape, but with v3 that wasn't 
workable. I don't know which is the better route now, but it seems that a set 
of disks is not a bad idea, with tape possibly another level, although as the 
pool grows, so does the tape requirement... :)

 Thanks for the ideas.

 Steve
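The 'zfs send' replication Johan describes above can be sketched roughly as
follows. This is an illustration, not a tested procedure: the dataset names
(tank/backuppc, backup/backuppc) and the host "offsite" are hypothetical, and
it assumes the BackupPC pool lives on its own ZFS dataset.

```shell
# Snapshot the dataset that holds the BackupPC pool (names are examples).
zfs snapshot tank/backuppc@monday

# First replication: send the full initial snapshot to the offsite box.
zfs send tank/backuppc@monday | ssh offsite zfs receive -F backup/backuppc

# Next day: take a new snapshot, then send only the delta between the two.
# ZFS already knows what changed, so there is no rsync-style tree scan.
zfs snapshot tank/backuppc@tuesday
zfs send -i tank/backuppc@monday tank/backuppc@tuesday | \
    ssh offsite zfs receive backup/backuppc
```

As Johan notes, the stream can also be piped through a multithreaded
compressor, or carried over a VPN instead of ssh, if that moves data faster.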


Re: [BackupPC-users] No restore checkboxes

2018-04-14 Thread Steve
On Sat, 14 Apr 2018 18:30:37 +0200
B <lazyvi...@gmx.com> wrote:

> On Sat, 14 Apr 2018 11:24:14 -0400
> Steve <zep...@cfl.rr.com> wrote:
> 
> …
> > The problem I have is that I cannot restore. When I look at old
> > backups there are no checkboxes to check to select something to
> > restore.   
> 
> Debian and its derivatives center _all_ web operations on the www-data
> group, so, to avoid useless contortions, I changed the group of the whole
> BPC repo:
>   chgrp -R www-data /BPC/BACKUPS
> and the rights of its directories to 6750 (setuid & setgid for owner and group):
>   find /BPC -type d -print0 | xargs -0 chmod 6750
> 
> on an existing repo, as you may already know, you can prepare a pack
> of coffee, another of cigars, some pizzas, many beers, a bunch of
> films, a pillow and a jar of rollmops the time it's achieved…
> 
> I use fcgiwrap with nginx and it's working ferpectly :)
> 
> Jean-Yves

Turns out it was the browser. I was using a browser called Web that
comes with Devuan, and the checkboxes don't show up there; when I switched
to Firefox they were there.


Steve



[BackupPC-users] No restore checkboxes

2018-04-14 Thread Steve
I recently had a hard drive die, so I took the opportunity to change my
machine from CentOS 6 to Devuan. It is using version 3.3.1 of BackupPC.
I set up Apache and BackupPC on the new machine and all appears OK. I
can browse the old backups and I can create new ones.

The problem I have is that I cannot restore. When I look at old backups
there are no checkboxes to check to select something to restore. 

I'm guessing it is some kind of permission issue.
Any ideas?

Thanks,
Steve



[BackupPC-users] CGI error after upgrade to 4.1.3

2017-10-18 Thread Steve Walker
I have two backuppc servers, newly built on Ubuntu 16.04 from repository 
(BackupPC version 3.3.1-2ubuntu3.3). As I ran into the samba problem where I 
would get a false error upon successful completion of a backup of a Windows 
client, I decided to upgrade to the latest BackupPC version assuming that 
problem would be fixed. I only upgraded the second server, backuppc2, so far. 
There were no errors during the upgrade.

Initially the service wouldn't start but I realized I needed to copy the 
appropriate startup script, which I have done. The backuppc.service now starts 
fine. The machine name in question is backuppc2 and this is the error I receive 
when connecting via a web browser:

This CGI script (/backuppc/index.cgi) is unable to connect to the BackupPC 
server on backuppc2 port -1.
The error was: unix connect: No such file or directory.
Perhaps the BackupPC server is not running or there is a configuration error. 
Please report this to your Sys Admin.

I believe everything in config.pl is correct, and the data location hasn't 
changed from when I was running the 3.3.1 version. It is a 6TB drive mounted on 
/var/lib/backuppc.

Steve



Re: [BackupPC-users] rsync error: unexplained error (code 255) at io.c(226) [Receiver=3.1.2.0]

2017-05-09 Thread Steve Palm
You are right, of course... Seems the -T parameter to ssh went away at some 
point from the master rsync transfer configuration, and the ssh authorized_keys 
entries for the backup login were created with no-pty.

Thankfully, at this point, it seems to be running very smoothly. :)

Steve

> On May 7, 2017, at 3:17 AM, Craig Barratt <cbarr...@users.sourceforge.net> wrote:
> 
> Steve,
> 
> Most likely it's a problem with ssh (perhaps it's prompting for a password) 
> or the client shell is producing output before rsync is run.
> 
> Craig
> 
> On Wednesday, May 3, 2017, Steve Palm <n9...@n9yty.com> wrote:
> Any clue what this is saying?
> 
> 
> This is the rsync child about to exec /usr/local/bin/rsync_bpc
> rsync_bpc: connection unexpectedly closed (0 bytes received so far) [Receiver]
> Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 filesTotal, 0 
> sizeTotal, 0 filesNew, 0 sizeNew, 0 sizeNewComp, 1676476 inode
> rsync error: unexplained error (code 255) at io.c(226) [Receiver=3.1.2.0]
> rsync_bpc exited with fatal status 255 (65280) (rsync error: unexplained 
> error (code 255) at io.c(226) [Receiver=3.1.2.0])
> Xfer PIDs are now
> Got fatal error during xfer (No files dumped for share /Users/)
> Backup aborted (No files dumped for share /Users/)
> 
> 
> 


[BackupPC-users] rsync error: unexplained error (code 255) at io.c(226) [Receiver=3.1.2.0]

2017-05-03 Thread Steve Palm
Any clue what this is saying?


This is the rsync child about to exec /usr/local/bin/rsync_bpc
rsync_bpc: connection unexpectedly closed (0 bytes received so far) [Receiver]
Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 filesTotal, 0 
sizeTotal, 0 filesNew, 0 sizeNew, 0 sizeNewComp, 1676476 inode
rsync error: unexplained error (code 255) at io.c(226) [Receiver=3.1.2.0]
rsync_bpc exited with fatal status 255 (65280) (rsync error: unexplained error 
(code 255) at io.c(226) [Receiver=3.1.2.0])
Xfer PIDs are now 
Got fatal error during xfer (No files dumped for share /Users/)
Backup aborted (No files dumped for share /Users/)





[BackupPC-users] Help to understand some XFerLOG messages

2017-05-03 Thread Steve Palm
79 sec)
Xfer PIDs are now 


So, on subsequent runs the first set of share points all back up properly now; 
just the IMAP share is having problems, and that error message about the 
destination needing to be a directory is consistent.

Should I just manually delete all the backups for this host and start over? 
Would that help?  IF SO: What is the best way to do that?

Thanks, if you got this far!  :)

 Steve




Re: [BackupPC-users] rsync: --protect-args on Mac OS X ?

2017-04-28 Thread Steve Palm
For others: it seemed to work using '/Path With/Spaces in it/' in the 
share name.  However, realizing how out of date the rsync binary is on Mac OS 
X, I decided to install a newer version that is, from what I read, far more 
memory-efficient and uses the newer rsync protocol for more efficient 
transfers. I installed that in /usr/local/bin/rsync and modified the xfer jobs 
of those machines to use that path instead.  Now I can use the --protect-args 
(-s) parameter.

 Steve
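For anyone hitting the same thing, the difference can be illustrated like this
(hypothetical host and destination paths; with rsync 3.x, -s/--protect-args
stops the remote shell from word-splitting the path):

```shell
# With a newer rsync that supports --protect-args (-s), spaces in the
# remote path survive as-is; only local-shell quoting is needed:
rsync -av -s 'server:/Shared Items/Shared Volume/Name 1/' /tmp/dest/

# Without -s, each space must additionally be escaped for the remote
# shell, on top of the quoting for the local shell:
rsync -av 'server:/Shared\ Items/Shared\ Volume/Name\ 1/' /tmp/dest/
```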

> On Apr 23, 2017, at 10:06 AM, Steve Palm <n9...@n9yty.com> wrote:
> 
> Hi,
> 
> Under v3 I didn't have any problems with shares that had spaces in the names. 
> However, I see that after all was converted to v4 rsync was using a  
> > --protect-args parameter.  However, rsync on Mac OS X does not support that 
> parameter, so how to restore the ability to get paths with spaces in them?
> 
> I am going to try using ' and \, like '/Shared\ Items/Shared\ Volume/Name\ 1' 
> and see if it works, but if there is a better way, or if that doesn't work, 
> what is the "right" way?  And how did it work in v3? :)
> 
> Steve


[BackupPC-users] rsync: --protect-args on Mac OS X ?

2017-04-23 Thread Steve Palm
Hi,

Under v3 I didn't have any problems with shares that had spaces in the names. 
However, I see that after all was converted to v4 rsync was using a  
--protect-args parameter.  However, rsync on Mac OS X does not support that 
parameter, so how to restore the ability to get paths with spaces in them?

I am going to try using ' and \, like '/Shared\ Items/Shared\ Volume/Name\ 1' 
and see if it works, but if there is a better way, or if that doesn't work, 
what is the "right" way?  And how did it work in v3? :)

 Steve


Re: [BackupPC-users] Cannot create /var/run/BackupPC ??

2017-04-12 Thread Steve Palm
Ah, I didn't update to 4.1.1, especially since, once I found the problem, I 
didn't want to change anything until I had it resolved. LOL

I also saw a reference to creating a file at /etc/tmpfiles.d/BackupPC.conf 
containing one line:
D /var/run/BackupPC 0775 root backuppc

This supposedly creates the directory at a more "system" level before the 
startup script runs. They suggested doing this instead of having your startup 
script create it, but I don't know which is better.

Thanks for the reply!
 Steve
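For reference, the two approaches discussed in this thread would look roughly
like this; both snippets are sketches based on the messages above rather than
tested configs. The tmpfiles.d entry (type "d" creates the directory at boot
if missing):

```
# /etc/tmpfiles.d/BackupPC.conf -- systemd-tmpfiles (re)creates the
# runtime directory at every boot, before services start:
d /var/run/BackupPC 0775 root backuppc -
```

versus the directive Craig mentions adding to the backuppc.service unit in
4.1.1:

```
[Service]
RuntimeDirectory=BackupPC
```

With RuntimeDirectory, systemd creates /run/BackupPC just before the daemon
starts and ties its lifetime to the service; the tmpfiles.d entry is created
at boot regardless of whether the service is managed by systemd.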


> On Apr 12, 2017, at 3:22 PM, Craig Barratt <cbarr...@users.sourceforge.net> wrote:
> 
> Steve,
> 
> In 4.1.1 I added this directive to the systemd backuppc.service file:
> 
> RuntimeDirectory=BackupPC
> 
> which creates the /var/run/BackupPC directory just before it starts BackupPC.
> 
> Craig
> 
> On Wed, Apr 12, 2017 at 12:45 PM, Steve Palm <n9...@n9yty.com> wrote:
> Looked at the code: ${RunDir} is the location, and it's a directory, not a 
> file.
> 
> Not sure why it couldn't be created, but I did so manually, changed ownership 
> to backuppc and it works.
> 
> Still don't know why it went wrong...
> 
> Steve
> 
> > On Apr 12, 2017, at 2:35 PM, Steve Palm <n9...@n9yty.com> wrote:
> >
> > Hello,
> >
> > I had a hard server crash, seems a RAM module went bad. After diagnosing 
> > and getting it back up and running, BackupPC will no longer start. I simply 
> > get:
> >
> > systemd[1]: Starting SYSV: Starts and stops the BackupPC server...
> > runuser[6605]: pam_unix(runuser:session): session opened for user backuppc 
> > by (uid=0)
> > backuppc[6604]: Starting BackupPC: 2017-04-12 14:32:20 Can't create 
> > /var/run/BackupPC... quitting
> > runuser[6605]: pam_unix(runuser:session): session closed for user backuppc
> > backuppc[6604]: [FAILED]
> > systemd[1]: backuppc.service: control process exited, code=exited status=1
> > systemd[1]: Failed to start SYSV: Starts and stops the BackupPC server.
> > systemd[1]: Unit backuppc.service entered failed state.
> > systemd[1]: backuppc.service failed.
> >
> > I am puzzled because I hadn't changed anything configuration-wise, system 
> > or BackupPC, so I don't know why it won't start all of a sudden.
> >
> > I can manually (as root) touch /var/run/BackupPC without issue, so I am 
> > perplexed as to where to look from here.
> >
> > Any ideas?

Re: [BackupPC-users] Cannot create /var/run/BackupPC ??

2017-04-12 Thread Steve Palm
Looked at the code: ${RunDir} is the location, and it's a directory, not a file.

Not sure why it couldn't be created, but I did so manually, changed ownership 
to backuppc and it works.

Still don't know why it went wrong...

Steve

> On Apr 12, 2017, at 2:35 PM, Steve Palm <n9...@n9yty.com> wrote:
> 
> Hello,
> 
> I had a hard server crash, seems a RAM module went bad. After diagnosing and 
> getting it back up and running, BackupPC will no longer start. I simply get:
> 
> systemd[1]: Starting SYSV: Starts and stops the BackupPC server...
> runuser[6605]: pam_unix(runuser:session): session opened for user backuppc by 
> (uid=0)
> backuppc[6604]: Starting BackupPC: 2017-04-12 14:32:20 Can't create 
> /var/run/BackupPC... quitting
> runuser[6605]: pam_unix(runuser:session): session closed for user backuppc
> backuppc[6604]: [FAILED]
> systemd[1]: backuppc.service: control process exited, code=exited status=1
> systemd[1]: Failed to start SYSV: Starts and stops the BackupPC server.
> systemd[1]: Unit backuppc.service entered failed state.
> systemd[1]: backuppc.service failed.
> 
> I am puzzled because I hadn't changed anything configuration-wise, system or 
> BackupPC, so I don't know why it won't start all of a sudden.
> 
> I can manually (as root) touch /var/run/BackupPC without issue, so I am 
> perplexed as to where to look from here.
> 
> Any ideas?


[BackupPC-users] Cannot create /var/run/BackupPC ??

2017-04-12 Thread Steve Palm
Hello,

I had a hard server crash, seems a RAM module went bad. After diagnosing and 
getting it back up and running, BackupPC will no longer start. I simply get:

systemd[1]: Starting SYSV: Starts and stops the BackupPC server...
runuser[6605]: pam_unix(runuser:session): session opened for user backuppc by 
(uid=0)
backuppc[6604]: Starting BackupPC: 2017-04-12 14:32:20 Can't create 
/var/run/BackupPC... quitting
runuser[6605]: pam_unix(runuser:session): session closed for user backuppc
backuppc[6604]: [FAILED]
systemd[1]: backuppc.service: control process exited, code=exited status=1
systemd[1]: Failed to start SYSV: Starts and stops the BackupPC server.
systemd[1]: Unit backuppc.service entered failed state.
systemd[1]: backuppc.service failed.

I am puzzled because I hadn't changed anything configuration-wise, system or 
BackupPC, so I don't know why it won't start all of a sudden.

I can manually (as root) touch /var/run/BackupPC without issue, so I am 
perplexed as to where to look from here.

Any ideas?


Re: [BackupPC-users] rsync_bpc benign status 24

2017-03-28 Thread Steve Palm
I saw another backup with the status 24, and yes, they did get marked OK. I 
didn't cancel this one, I saw it in the list of backups needing attention. 
Anyway, I had restarted it and it completed. If I see it again, I will capture 
and send along more complete logs. Sorry for not being more comprehensive in 
the first place.

Steve

> On Mar 27, 2017, at 8:12 PM, Craig Barratt <cbarr...@users.sourceforge.net> wrote:
> 
> Steve,
> 
> Yes, if that's the only error, then the backup should be considered ok.  The 
> log you pasted does say "Backup aborted by user signal", which is likely why 
> it exited with a fatal error.  Did you cancel or abort the backup?
> 
> It would be helpful to see more of the log file, specifically all of the last 
> hundred or so lines (just email straight to me is fine).
> 
> Craig
> 
> On Sat, Mar 25, 2017 at 5:59 PM, Steve Palm <n9...@n9yty.com> wrote:
> This could potentially be an issue...
> 
> Backing up a server, I get:
> 
> rsync warning: some files vanished before they could be transferred (code 24) 
> at main.c(1543) [generator=3.0.9.5]
> rsync_bpc exited with benign status 24 (6144)
> Got fatal error during xfer (rsync_bpc exited with benign status 24 (6144))
> Backup aborted by user signal
> 
> On some of these, there will ALWAYS be files that will vanish before they 
> could be transferred. However, because of this "benign status 24", the backup 
> is left marked as partial.
> 
> -Steve
> 
> 
> 


[BackupPC-users] rsync_bpc benign status 24

2017-03-25 Thread Steve Palm
This could potentially be an issue...

Backing up a server, I get:

rsync warning: some files vanished before they could be transferred (code 24) 
at main.c(1543) [generator=3.0.9.5]
rsync_bpc exited with benign status 24 (6144)
Got fatal error during xfer (rsync_bpc exited with benign status 24 (6144))
Backup aborted by user signal

On some of these, there will ALWAYS be files that will vanish before they could 
be transferred. However, because of this "benign status 24", the backup is left 
marked as partial.

-Steve





Re: [BackupPC-users] BackupPC v4 -- pool backups easier?

2017-03-23 Thread Steve Palm

On Mar 23, 2017, at 11:22 AM, Craig Barratt <cbarr...@users.sourceforge.net> 
wrote:
> Yes, that's one of the major advantages of 4.x - the pool is now much easier 
> to copy or backup.

 HUGE HURRAH!  :)

> In 4.1.0 there is a new utility BackupPC_migrateV3toV4 that you can 
> optionally use to bulk upgrade old 3.x backups to v4 format.  Although that 
> might take a while, it allows you to eliminate all the old hardlinked backups.

 This is also great for a few systems whose archival backups are still there 
and have not been moved to other media yet, so I can get everything into v4 
format.
 
 AWESOME WORK!  Can't thank you enough for your dedication to the project.

 Steve




[BackupPC-users] BackupPC v4 -- pool backups easier?

2017-03-23 Thread Steve Palm
Hi,

Simple question, I hope, but given the substantial differences "under the 
hood" in how BackupPC now handles things, no more hard links in particular, 
will this make it easier to back up the pool to other systems or tape?  It has 
seemed to me that one of the sticking points has always been the huge number of 
hard links.

I have tried a number of things; I think my last serious effort was using 
xfsdump, but the restore time was on the order of many days... :(

This would be a nice improvement. :)

Steve


Re: [BackupPC-users] No files dumped?

2017-03-23 Thread Steve Palm
Well, I have one host backing up, finally... But I have to revisit the 
settings and see how to propagate them to the others in the right way.

The essence, though, per my previous config, was that the RsyncSshPath was:
$sshPath -q -x -l backuppc -c arcfour

The RsyncClientPath was: nice -n 19 sudo /usr/bin/rsync

Yes, I should specify the full path to nice, but also, yes, it is different on 
some hosts. :(

Just saying, I'd upvote an upgrade feature that didn't break the client 
configs. :) :)  But since they are numerous and multivariate, that probably 
isn't possible. Maybe at least mention in the upgrade notes that you probably 
are going to have to revisit all your client configurations because they are 
all changed. If it was there, I didn't see it.

Don't want to sound like I'm complaining, just mentioning that it would be 
nice. I am very thankful for the upgrade, and now that I think I have a handle 
on how to reconfigure the hosts it will be interesting to see how performance 
has improved.

Thanks,
 Steve


> On Mar 23, 2017, at 8:45 AM, Steve Palm <n9...@n9yty.com> wrote:
> 
> I'm seeing this on all the hosts so far, the first batch have all been Linux 
> servers using an rsync backup method over ssh...
> 
> My understanding of what the logs are telling me is that the connection 
> closed. Maybe never really opened. I do see that my original ssh command 
> configuration was lost during the upgrade, and looking at the config now I am 
> not entirely sure how to put it back...
> 
> My previous config used these:
> 
> $Conf{RsyncClientCmd} = '$sshPath -q -x -l backuppc -c arcfour $host nice -n 
> 19 sudo $rsyncPath $argList+';
> 
> $Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l backuppc -c arcfour $host 
> nice -n 19 sudo $rsyncPath $argList+';
> 
> Not sure how to get those back into the new config.  Do they all go on one 
> line on the RsyncSshArgs?  But clearly not all of them anymore, so what is 
> still relevant?  Or do they go as separate items in that list?  I tried both, 
> but they don't seem to work either.
> 
> The problem is, I think, I haven't touched the old configs for so long I 
> can't remember all of how that was even set up, and I'm not making the jump 
> to align with the new changes.
> 
> Suggestions?  Thanks!
> 
> Steve
> 
> 
> More information below, from the initial post-upgrade configs before I 
> started trying to change RsyncSshArgs:
> 
> LOG:
> 2017-03-21 23:41:10 removing incr backup 2165
> 2017-03-23 01:52:12 Created directory /mnt/stage/BackupPC/pc/miscnet/refCnt
> 2017-03-23 01:52:13 Copying v3 backup #2172 to v4 #2173
> 2017-03-23 02:58:25 BackupPC_refCountUpdate: doing fsck on miscnet #2173 
> since there are no poolCnt files
> 2017-03-23 03:00:43 BackupPC_refCountUpdate: host miscnet got 0 errors (took 
> 138 secs)
> 2017-03-23 03:00:43 BackupPC_backupDuplicate: got 0 errors and 0 file open 
> errors
> 2017-03-23 03:00:43 full backup started for directory /
> 2017-03-23 03:00:44 Got fatal error during xfer (No files dumped for share /)
> 2017-03-23 03:00:49 Backup aborted (No files dumped for share /)
> 
> XferLog:
> XferLOG file /mnt/stage/BackupPC/pc/miscnet/XferLOG.2173.z created 2017-03-23 
> 01:52:13 
> Backup prep: type = full, case = 2, inPlace = 1, doDuplicate = 1, newBkupNum 
> = 2173, newBkupIdx = 8, lastBkupNum = , lastBkupIdx =  (FillCycle = 0, 
> noFillCnt = 4)
> Executing /usr/local/BackupPC/bin/BackupPC_backupDuplicate -h miscnet
> Xfer PIDs are now 7131
> Copying v3 backup #2172 to v4 #2173
> Xfer PIDs are now 7131,7795
> BackupPC_refCountUpdate: doing fsck on miscnet #2173 since there are no 
> poolCnt files
> BackupPC_refCountUpdate: host miscnet got 0 errors (took 138 secs)
> Xfer PIDs are now 7131
> BackupPC_backupDuplicate: got 0 errors and 0 file open errors
> Finished BackupPC_backupDuplicate (running time: 4110 sec)
> Running: /usr/local/bin/rsync_bpc --bpc-top-dir /mnt/stage/BackupPC 
> --bpc-host-name miscnet --bpc-share-name / --bpc-bkup-num 2173 
> --bpc-bkup-comp 1 --bpc-bkup-prevnum -1 --bpc-bkup-prevcomp -1 
> --bpc-bkup-inode0 51267 --bpc-attrib-new --bpc-log-level 1 -e /usr/bin/ssh\ 
> -q\ -x\ -l\ backuppc --rsync-path=/usr/bin/rsync --super --recursive 
> --protect-args --numeric-ids --perms --owner --group -D --times --links 
> --hard-links --delete --partial --log-format=log:\ %o\ %i\ %B\ %8U,%8G\ %9l\ 
> %f%L --stats --checksum --one-file-system --iconv=utf8,utf8 --exclude=/proc 
> --exclude=/nfs miscnet.sga.org:/ /
> full backup started for directory /
> Xfer PIDs are now 7828
> This is the rsync child about to exec /usr/local/bin/rsync_bpc
> rsync_bpc: connection unexpectedly closed (0 bytes received so far) [Receiver]
> Done: 0 errors, 0 filesExist, 0 sizeExist, 0 si

[BackupPC-users] No files dumped?

2017-03-23 Thread Steve Palm
I'm seeing this on all the hosts so far, the first batch have all been Linux 
servers using an rsync backup method over ssh...

My understanding of what the logs are telling me is that the connection closed. 
Maybe never really opened. I do see that my original ssh command configuration 
was lost during the upgrade, and looking at the config now I am not entirely 
sure how to put it back...

My previous config used these:

$Conf{RsyncClientCmd} = '$sshPath -q -x -l backuppc -c arcfour $host nice -n 19 
sudo $rsyncPath $argList+';

$Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l backuppc -c arcfour $host 
nice -n 19 sudo $rsyncPath $argList+';

Not sure how to get those back into the new config.  Do they all go on one line 
on the RsyncSshArgs?  But clearly not all of them anymore, so what is still 
relevant?  Or do they go as separate items in that list?  I tried both, but 
they don't seem to work either.

The problem is, I think, I haven't touched the old configs for so long I can't 
remember all of how that was even set up, and I'm not making the jump to align 
with the new changes.

Suggestions?  Thanks!
 
Steve


More information below, from the initial post-upgrade configs before I started 
trying to change RsyncSshArgs:

LOG:
2017-03-21 23:41:10 removing incr backup 2165
2017-03-23 01:52:12 Created directory /mnt/stage/BackupPC/pc/miscnet/refCnt
2017-03-23 01:52:13 Copying v3 backup #2172 to v4 #2173
2017-03-23 02:58:25 BackupPC_refCountUpdate: doing fsck on miscnet #2173 since 
there are no poolCnt files
2017-03-23 03:00:43 BackupPC_refCountUpdate: host miscnet got 0 errors (took 
138 secs)
2017-03-23 03:00:43 BackupPC_backupDuplicate: got 0 errors and 0 file open 
errors
2017-03-23 03:00:43 full backup started for directory /
2017-03-23 03:00:44 Got fatal error during xfer (No files dumped for share /)
2017-03-23 03:00:49 Backup aborted (No files dumped for share /)

XferLog:
XferLOG file /mnt/stage/BackupPC/pc/miscnet/XferLOG.2173.z created 2017-03-23 
01:52:13 
Backup prep: type = full, case = 2, inPlace = 1, doDuplicate = 1, newBkupNum = 
2173, newBkupIdx = 8, lastBkupNum = , lastBkupIdx =  (FillCycle = 0, noFillCnt 
= 4)
Executing /usr/local/BackupPC/bin/BackupPC_backupDuplicate -h miscnet
Xfer PIDs are now 7131
Copying v3 backup #2172 to v4 #2173
Xfer PIDs are now 7131,7795
BackupPC_refCountUpdate: doing fsck on miscnet #2173 since there are no poolCnt 
files
BackupPC_refCountUpdate: host miscnet got 0 errors (took 138 secs)
Xfer PIDs are now 7131
BackupPC_backupDuplicate: got 0 errors and 0 file open errors
Finished BackupPC_backupDuplicate (running time: 4110 sec)
Running: /usr/local/bin/rsync_bpc --bpc-top-dir /mnt/stage/BackupPC 
--bpc-host-name miscnet --bpc-share-name / --bpc-bkup-num 2173 --bpc-bkup-comp 
1 --bpc-bkup-prevnum -1 --bpc-bkup-prevcomp -1 --bpc-bkup-inode0 51267 
--bpc-attrib-new --bpc-log-level 1 -e /usr/bin/ssh\ -q\ -x\ -l\ backuppc 
--rsync-path=/usr/bin/rsync --super --recursive --protect-args --numeric-ids 
--perms --owner --group -D --times --links --hard-links --delete --partial 
--log-format=log:\ %o\ %i\ %B\ %8U,%8G\ %9l\ %f%L --stats --checksum 
--one-file-system --iconv=utf8,utf8 --exclude=/proc --exclude=/nfs 
miscnet.sga.org:/ /
full backup started for directory /
Xfer PIDs are now 7828
This is the rsync child about to exec /usr/local/bin/rsync_bpc
rsync_bpc: connection unexpectedly closed (0 bytes received so far) [Receiver]
Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 filesTotal, 0 
sizeTotal, 0 filesNew, 0 sizeNew, 0 sizeNewComp, 51267 inode
rsync error: unexplained error (code 255) at io.c(629) [Receiver=3.0.9.5]
rsync_bpc exited with fatal status 255 (65280) (rsync error: unexplained error 
(code 255) at io.c(629) [Receiver=3.0.9.5])
Xfer PIDs are now 
Got fatal error during xfer (No files dumped for share /)
Backup aborted (No files dumped for share /)
BackupFailCleanup: nFilesTotal = 0, type = full, BackupCase = 2, inPlace = 1, 
lastBkupNum = 
BackupFailCleanup: inPlace with no new files... no cleanup
Running BackupPC_refCountUpdate -h miscnet -f on miscnet
Xfer PIDs are now 7833
BackupPC_refCountUpdate: host miscnet got 0 errors (took 2 secs)
Xfer PIDs are now 
Finished BackupPC_refCountUpdate (running time: 2 sec)
Xfer PIDs are now 




--
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC 4.0.0 released

2017-03-22 Thread Steve Palm
I had installed BackupPC-XS and rsync-bpc, but I re-installed BackupPC-XS after 
the other module changes I made below, and now it is working.

Very strange.  Don't know what I could have done wrong, but oh well. 
Wouldn't be the first time. LOL

BTW: I really like the more modern look from GitHub.


> On Mar 22, 2017, at 4:43 PM, Steve Palm <n9...@n9yty.com> wrote:
> 
> Okay, I installed from Github to take advantage of recent fixes, and the 
> upgrade worked fine. The web interface came up straight away, no problems.
> 
> If these problems are BECAUSE I used the GitHub version, please advise and I 
> will back it out or wait for resolution.  Actually, I did install the 
> released 4.0.0 version after this and it seems to make no difference.
> 
> I am documenting some things I already fixed here in case others find it 
> useful, but in the end I still do not have it working.
> 
> However, trying to backup the localhost (previous v3 backup), I received this 
> error:
> 
> backuppc: getaddrinfo is not a valid Socket macro at 
> /usr/local/BackupPC/lib/BackupPC/Lib.pm line 1413
> 
> It did a delete pass, but when I tell it to do a full backup I get the above 
> error in the log.
> 
> The strangest thing is, looking in the config, the host is defined as 
> 127.0.0.1, so how can that not be a valid address for 
> Socket::getaddrinfo($host); ?
> 
> I thought perhaps it was trying to use Socket::GetAddrInfo, which I was able 
> to install after having to install these:
> 
> File::Path File::Spec Test::More ExtUtils::CBuilder ExtUtils::CChecker
> 
> But still the same error.
> 
> For context, it is here in BackupPC::Lib, failing on the Socket::getaddrinfo 
> line.
> 
> sub getHostAddrInfo
> {
>     my($bpc, $host) = @_;
>     my($err, @addrs) = Socket::getaddrinfo($host);
>     return undef if ( $err );
>     return (($addrs[0])->{'family'} == Socket::AF_INET6) ? 6 : 4;
> }
> 
> Installed `cpan install Socket` and the line went away from the logs, but it 
> still is not making any backups. I don't know how to be sure all the modules 
> are up-to-date without causing problems. :(
> 
> Where should I look? Every log I check shows nothing; it just records that 
> the backup was requested but it never does anything. :(
> 
> Steve
> 
> 




Re: [BackupPC-users] BackupPC 4.0.0 released

2017-03-22 Thread Steve Palm
Okay, I installed from Github to take advantage of recent fixes, and the 
upgrade worked fine. The web interface came up straight away, no problems.

If these problems are BECAUSE I used the GitHub version, please advise and I 
will back it out or wait for resolution.  Actually, I did install the released 
4.0.0 version after this and it seems to make no difference.

I am documenting some things I already fixed here in case others find it 
useful, but in the end I still do not have it working.

However, trying to backup the localhost (previous v3 backup), I received this 
error:

backuppc: getaddrinfo is not a valid Socket macro at 
/usr/local/BackupPC/lib/BackupPC/Lib.pm line 1413

It did a delete pass, but when I tell it to do a full backup I get the above 
error in the log.

The strangest thing is, looking in the config, the host is defined as 
127.0.0.1, so how can that not be a valid address for 
Socket::getaddrinfo($host); ?

I thought perhaps it was trying to use Socket::GetAddrInfo, which I was able to 
install after having to install these:

File::Path File::Spec Test::More ExtUtils::CBuilder ExtUtils::CChecker

But still the same error.

For context, it is here in BackupPC::Lib, failing on the Socket::getaddrinfo 
line.

sub getHostAddrInfo
{
    my($bpc, $host) = @_;
    my($err, @addrs) = Socket::getaddrinfo($host);
    return undef if ( $err );
    return (($addrs[0])->{'family'} == Socket::AF_INET6) ? 6 : 4;
}
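The "getaddrinfo is not a valid Socket macro" symptom usually means the installed Socket.pm predates the function, so the call falls through to the module's constant AUTOLOAD. A small sketch to verify what the local Perl provides (the 1.94/Perl 5.16 version boundary is taken from Socket's changelog, so treat it as an assumption):

```perl
use strict;
use warnings;
use Socket;

# getaddrinfo() was added to Socket in version 1.94 (bundled with
# Perl >= 5.16); on older installs the call hits the constant
# AUTOLOAD, producing the "not a valid Socket macro" error above.
print "Socket.pm version: $Socket::VERSION\n";

if ( Socket->can('getaddrinfo') ) {
    my ($err, @addrs) = Socket::getaddrinfo('127.0.0.1');
    die "getaddrinfo failed: $err\n" if $err;
    print "getaddrinfo OK, family ",
          ($addrs[0]{family} == Socket::AF_INET6 ? 6 : 4), "\n";
} else {
    print "Socket::getaddrinfo missing -- upgrade Socket from CPAN\n";
}
```

If it reports the function missing, `cpan install Socket` (as in the message above) is the fix.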

Installed `cpan install Socket` and the line went away from the logs, but it 
still is not making any backups. I don't know how to be sure all the modules 
are up-to-date without causing problems. :(

Where should I look? Every log I check shows nothing; it just records that the 
backup was requested but it never does anything. :(

Steve




Re: [BackupPC-users] BackupPC 4.0.0 released

2017-03-22 Thread Steve Palm
Thanks so much, Craig!  I really appreciate it. Looking forward to trying it 
out, although reading the list it might be better to wait for your 4.1.0 
release or grab the current from GitHub, as it seemed there are a few issues 
that are resolved which would affect us upgrading from v3.


> On Mar 22, 2017, at 11:17 AM, Craig Barratt <cbarr...@users.sourceforge.net> wrote:
> 
> Steve,
> 
> The error is that you don't have the required module "version" installed (ie, 
> running "perl -e 'use version;'" would fail).  Yes, the error is confusing 
> since "version" is such a generic name.
> 
> You can use CPAN (or your linux installer if that's what you normally use) to 
> install it:
> 
> sudo cpan
> install version
> 
> Craig
> 
> On Wed, Mar 22, 2017 at 7:40 AM, Steve Palm <n9...@n9yty.com> wrote:
> Sorry, I sent this to the developer list, should probably have gone to the 
> users list... :(
> 
> On Mar 4, 2017, at 1:12 PM, Craig Barratt <cbarr...@users.sourceforge.net> wrote:
>> I'm happy to announce that BackupPC 4.0.0 has been released on Github 
>> <https://github.com/backuppc/backuppc/releases> and SourceForge 
>> <https://sourceforge.net/projects/backuppc/files/backuppc/4.0.0/>.  
> 
>  I was excited to read this, congratulations on a huge milestone!
> 
>> BackupPC 4.0.0 requires the perl module BackupPC::XS 
>> <https://github.com/backuppc/backuppc-xs/releases> (>= 0.50) and rsync-bpc 
>> <https://github.com/backuppc/rsync-bpc/releases> (>= 3.0.9.5). 
> 
>  I installed these.
> 
>> After installing those two packages, BackupPC 4.0.0 can be installed from 
>> the tar ball with:
>> 
>> tar zxf BackupPC-4.0.0.tar.gz
>> cd BackupPC-4.0.0
>> perl configure.pl
>  And I was greeted with a very unhelpful
> 
> BackupPC needs the package version.  Please install version
> before installing BackupPC.
> 
> Even running with a trace is not helping me much, not sure what I'm looking 
> at here:
> 
> perl -d:Trace configure.pl
> >> configure.pl:54: my @ConfigureBinList = qw(
> >> configure.pl:79: my @ConfigureLibList = qw(
> >> configure.pl:137: if ( $ConfigureBinList[0] eq "__" . "CONFIGURE_BIN_LIST__" ) {
> >> configure.pl:145: my @Packages = qw(version Encode File::Path File::Spec File::Copy DirHandle
> >> configure.pl:149: my $PackageVersion = {
> >> configure.pl:154: foreach my $pkg ( @Packages ) {
> >> configure.pl:155: eval "use $pkg";
> >> (eval 1)[configure.pl:155]:2: ;
> >> configure.pl:156: if ( !$@ ) {
> >> configure.pl:169: if ( $pkg =~ /BackupPC::Lib/ ) {
> >> configure.pl:184: die <<EOF;
> 
> BackupPC needs the package version.  Please install version
> before installing BackupPC.
> 
> Any ideas?
> 
>  Steve
> 
> 

Re: [BackupPC-users] Status on new BackupPC v4

2016-05-15 Thread Steve Willoughby
It’s going to be hard to find all 5 in one person.  I have 1, 2, and 4, which I 
suspect is the combination most of us possess.  5 is in terribly short supply 
for a lot of us too, so someone with 3 in particular, who can organize the 
efforts of many people who could contribute small bits of work would be quite 
valuable.

> On May 15, 2016, at 4:43 PM, Adam Goryachev 
>  wrote:
> 
> I had intentions of trying to pick up the project and start fixing 
> bugs/etc, but swiftly reached the end of my C programming experience, 
> and then time.
> 
> I honestly think it might be near impossible to find anyone to lead the 
> project that has all the needed assets:
> 1) Perl programming
> 2) C programming
> 3) Organisation skills
> 4) Git/CVS/Development skills
> 5) Time
> 
> However, someone with just 3 and 5 could easily be the right person. 
> There are a number of people supplying "patches" to the project, and the 
> right person only needs to "accept" the patches, push them into Git, and 
> release a new "version" from time to time.
> 
> I think the best approach is:
> 1) Send the email to Craig, and expect that no reply will be received. 
> If we get a response, great, lets move forward with his suggestions
> 2) Contact the owner of the backuppc/backuppc project on github and see 
> if anyone can get "admin" access to it, if not, create a new one called 
> backuppc-new or something similar (but try to show it is still the same 
> backuppc, in all honesty, the project is unlikely to make significant 
> changes to backuppc (and huge changes are probably not needed anyway)).
> 3) Try to give admin level access to a number of people that:
> a) Have been involved in the project for a reasonable length of time
> b) Contribute to support on the mailing list
> c) Have expressed an interest in supporting the project in any way
> This will allow for people to "drop out" over time, and still keep admin 
> access in the future. We don't want to get stuck in this place again, 
> with one "admin" and everybody else wondering what to do.
> 
> This also allows for "redundancy" of people. If we have 10 people, then 
> one of those 10 is probably available to contribute an hour or two out 
> of a week, even if it is a different "one" every week (ie, can you spare 
> 2 hours once every 10 weeks? Most people would probably say yes, but 
> probably can't actually commit to that, which is OK).
> 
> So, can you (the OP) do 1 and 2 above, give it 2 weeks, and then do 3?
> 
> PS, yes, I'd be happy to be one of the many people to contribute, but 
> clearly can't manage much, otherwise I would have already done it.
> 
> Regards,
> Adam
> 
> -- 
> Adam Goryachev Website Managers www.websitemanagers.com.au
> 




Re: [BackupPC-users] Status on new BackupPC v4

2016-05-15 Thread Steve Willoughby
I’ve been using BackupPC for many years and would really hate to see it die. 
It’s the best solution I’ve found in a very long time.  If we can encourage 
Craig to pick it up, or pick it up and run with it ourselves, or both (help 
Craig with it), whatever keeps this excellent tool alive and evolving is worth 
doing.

—steve

> On May 15, 2016, at 9:35 AM, Mauro Condarelli <mc5...@mclink.it> wrote:
> 
> 
> 
> Il 15/05/2016 16:14, Les Mikesell ha scritto:
>> On Sun, May 15, 2016 at 5:23 AM, Raoul Bhatia <ra...@bhatia.at> wrote:
>>> Second, maybe we have some Googlers lurking on this list who might know
>>> a better way to reach out to Craig?
>> Yes, it looks like Google might be keeping him busy these days:
>> 
>> http://www.wsj.com/articles/googles-wireless-effort-is-led-by-a-geeks-geek-1421973826
>> http://www.recode.net/2016/4/14/11586114/access-google-fiber-ceo-interview
>> 
>> He does have a linkedin page but I don't know if that would work any
>> better to reach him.
> If You have a LinkedIn account please retrieve Craig's contacts.
> I have no account and I would rather not subscribe (I'm not a fan of "social 
> nets" of any kind).
> 
> I will rewrite the mail following Raoul's advice, but that is useless if we
> can't find a usable e-mail address :/
> 
> Thanks to all who expressed appreciation for the initiative.
> 
> Regards
> Mauro
> 
> 




Re: [BackupPC-users] Lots of nt_status_access_denied errors 2012R2

2014-02-12 Thread Steve Crow
Adding the backuppc user account to the Backup Operators group made no
difference.
Any other ideas or experiences to share with Server 2012R2 ?

Thanks!
Steve





[BackupPC-users] Lots of nt_status_access_denied errors 2012R2

2014-02-11 Thread Steve Crow
We're trying to backup Server 2012R2 from a linux box.

We are receiving a lot of these types of errors:

NT_STATUS_ACCESS_DENIED listing \Users\Administrator\AppData\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Application Data\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Contacts\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Local Settings\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Music\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\My Documents\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\NetHood\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Pictures\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Templates\* 
NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Videos\*

What causes these and how can we get all files backed up properly?

Thank you!

Steve





Re: [BackupPC-users] Lots of nt_status_access_denied errors 2012R2

2014-02-11 Thread Steve Crow
Thank you for your input. A domain is not involved. 2012R2 is so different I
could not find a way to add the user to the Backup Operators group. I will
keep looking. The user account is an Administrator though.

Steve

-Original Message-
From: Les Mikesell [mailto:lesmikes...@gmail.com] 
Sent: Tuesday, February 11, 2014 11:16 AM
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] Lots of nt_status_access_denied errors 2012R2

On Tue, Feb 11, 2014 at 10:46 AM, Steve Crow <sc...@amarilloheartgroup.com>
wrote:
> We're trying to backup Server 2012R2 from a linux box.
>
> We are receiving a lot of these types of errors:
>
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\AppData\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Application Data\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Contacts\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Local Settings\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Music\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\My Documents\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\NetHood\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Pictures\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Templates\*
> NT_STATUS_ACCESS_DENIED listing \Users\Administrator\Videos\*
>
> What causes these and how can we get all files backed up properly?

That basically means that the user backuppc connects as does not have access
to those locations.  Windows permissions can be somewhat weird.
Is a domain involved, and is the backuppc user a member of the Backup
Operators group?

Backuppc uses smbtar for the smb xfer operation, but it might be easier to
debug things if you use smbclient interactively to try to access these
locations.

-- 
Les Mikesell
 lesmikes...@gmail.com






Re: [BackupPC-users] Lots of nt_status_access_denied errors 2012R2

2014-02-11 Thread Steve Crow
I found the Computer Management GUI (I'm too green on this version!). I've
added the backuppc user to the Backup Operators group. I'll see what happens
tonight.
Thanks again for your help and input!

Steve





[BackupPC-users] Can't create test hardlink error

2013-08-01 Thread Steve Blackwell



[BackupPC-users] Can't create test hardlink error

2013-07-28 Thread Steve Blackwell
I just rebooted to a new kernel and now I'm getting this error:

Can't create a test hardlink between a file in /var/lib/BackupPC//pc
and /var/lib/BackupPC//cpool.  Either these are different file systems,
or this file system doesn't support hardlinks, or these directories
don't exist, or there is a permissions problem, or the file system is
out of inodes or full.  Use df, df -i, and ls -ld to check each of
these possibilities. Quitting...

I don't think it has anything to do with the new kernel; rebooting just
made me notice the problem. I'm thinking it's a permissions problem,
since only 6% of the space and 1% of the inodes are used. Here's some
relevant info:

# cd /var/lib
# ls -l | grep BackupPC
lrwxrwxrwx. 1 backuppc backuppc   44 Apr 29 20:32 BackupPC
-> /media/5d194bff-a222-49a8-bd90-4898fff9bdd8/
# cd BackupPC
# ls -l
total 180
drwxr-x---. 18 backuppc backuppc  4096 Jun 11 01:00 cpool
-rw-r--r--.  1 root root   145 Mar 19 22:00 extlinux.conf
-r--r--r--.  1 root root 32768 Mar 19 22:00 ldlinux.sys
drwx--.  2 root root 16384 Jul 16  2008 lost+found
-rw-r--r--.  1 root root 60928 Mar 19 22:00 menu.c32
drwxr-x---.  6 backuppc backuppc  4096 Jun  1 13:02 pc
drwxr-x---.  2 backuppc backuppc  4096 Jul 26  2008 pool
-rw-r--r--.  1 root root   145 Mar 19 22:00 syslinux.cfg
drwxr-x---.  2 backuppc backuppc  4096 Jun 11 01:12 trash
-rw-r--r--.  1 root root 0 Mar 19 22:00 ubnfilel.txt
-rw-r--r--.  1 root root 0 Mar 19 22:00 ubnpathl.txt

All the contents of cpool and pc all have the same permissions:
drwxr-x---. 18 backuppc backuppc
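For reference, the check BackupPC performs at startup can be simulated in isolation; a minimal Perl sketch using a throwaway directory rather than the real tree:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use File::Path qw(make_path);

# Simulate BackupPC's startup test: a hardlink between pc/ and cpool/
# only succeeds when both directories sit on the same filesystem.
my $top = tempdir(CLEANUP => 1);
make_path("$top/cpool", "$top/pc");

open my $fh, '>', "$top/cpool/testfile" or die "create: $!\n";
print $fh "data\n";
close $fh;

if ( link("$top/cpool/testfile", "$top/pc/testfile") ) {
    print "hardlink OK\n";
} else {
    print "hardlink FAILED: $! (different filesystems, or permissions?)\n";
}
```

Running the same link() by hand between the real .../cpool and .../pc (through the symlink, as the backuppc user) distinguishes a permissions problem from a filesystem boundary, since link() across filesystems fails with EXDEV.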

Any ideas?
Thanks,
Steve



Re: [BackupPC-users] Email configuration

2013-06-01 Thread Steve Blackwell
On Sat, 1 Jun 2013 16:32:15 -0500
Michael Stowe mst...@chicago.us.mensa.org wrote:

 
  It also seems to imply that the username you're using is backuppc
  rather than zephod.  This will be set under hosts.
 
 Errr, you can safely ignore that, I was thinking of something else
 entirely, but it does appear your configuration isn't being read or
 used.
 
 I recommend trying out the -t switch to see if anything would be sent,
 
 And the -u switch to confirm your email is set up properly.  If those
 work, then it's simply your config.
Hmmm...
I assumed that the -u switch meant run the command as user whoever...
but I was wrong.
It seems that it means send the e-mail to whoever.

So...
# sudo -u backuppc /usr/share/BackupPC/bin/BackupPC_sendEmail -u 
zep...@cfl.rr.com

results in this in /var/log/maillog:
Jun  1 18:18:18 localhost postfix/pickup[9126]: 907191840E9: uid=496 
from=backuppc
Jun  1 18:18:18 localhost postfix/cleanup[9278]: 907191840E9: 
message-id=20130601221818.907191840E9@stevesDesktop.localdomain
Jun  1 18:18:18 localhost postfix/qmgr[4236]: 907191840E9: 
from=backuppc@stevesDesktop.localdomain, size=497, nrcpt=1 (queue active)
Jun  1 18:18:18 localhost postfix/smtp[9269]: 907191840E9: host 
cdptpa-smtpin01.mail.rr.com[75.180.132.243] refused to talk to me: 554 5.7.1 - 
ERROR: Mail refused - 184.90.56.138 - See 
http://www.spamhaus.org/query/bl?ip=184.90.56.138
Jun  1 18:18:19 localhost postfix/smtp[9269]: 907191840E9: 
to=zep...@cfl.rr.com, relay=cdptpa-smtpin02.mail.rr.com[75.180.132.244]:25, 
delay=0.94, delays=0.61/0/0.33/0, dsn=4.7.1, status=deferred (host 
cdptpa-smtpin02.mail.rr.com[75.180.132.244] refused to talk to me: 554 5.7.1 - 
ERROR: Mail refused - 184.90.56.138 - See 
http://www.spamhaus.org/query/bl?ip=184.90.56.138)

which makes more sense.

Now I need to find out how to get the server to listen.
OK, it appears from the URL given that my IP address is blocked because it is 
on some Policy Block List.
But my IP address comes from my ISP via DHCP. Are they blocking their own IP 
address?
It looks like I can remove it from the list but I'd like to understand this a 
little better first.

Steve

--
Get 100% visibility into Java/.NET code with AppDynamics Lite
It's a free troubleshooting tool designed for production
Get down to code-level detail for bottlenecks, with 2% overhead.
Download for free and get started troubleshooting in minutes.
http://p.sf.net/sfu/appdyn_d2d_ap2
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Restore after upgrade advice?

2013-05-01 Thread Steve

 backu...@kosowsky.org wrote: 
 Steve wrote at about 23:51:11 -0400 on Monday, April 29, 2013:
   I'd run out of time and decided it must be a filesystem problem and I 
 would look at it more tomorrow.
   
   However...
   
   # ls -l /var/lib/BackupPC
   drwxr-x---. 18  495 root  4096 Mar 18 06:00 cpool
   -rw-r--r--.  1 root root   145 Mar 19 22:00 extlinux.conf
   -r--r--r--.  1 root root 32768 Mar 19 22:00 ldlinux.sys
   drwx--.  2 root root 16384 Jul 16  2008 lost+found
   -rw-r--r--.  1 root root 60928 Mar 19 22:00 menu.c32
   drwxr-x---.  5  495 root  4096 Mar 17 19:31 pc
   drwxr-x---.  2  495 root  4096 Jul 26  2008 pool
   -rw-r--r--.  1 root root   145 Mar 19 22:00 syslinux.cfg
   drwxr-x---.  2  495 root  4096 Mar 18 06:00 trash
   -rw-r--r--.  1 root root 0 Mar 19 22:00 ubnfilel.txt
   -rw-r--r--.  1 root root 0 Mar 19 22:00 ubnpathl.txt
   
   Hmmm... there is no user 495. Perhaps that was the backuppc user on my old 
 system.
   Do cpool, pc etc have to be owned by backuppc?
 
 BINGO.
 The program is run as user 'backuppc' so the data must be
 readable/writeable by user backuppc.
 
 you need to do something like:
 
 chown -R backuppc pc cpool pool trash

Yup, that gets backuppc to start and I can get to the admin web page.
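As a general aid: files owned by a UID that no longer has an account (like the stray 495 above) can be listed with find's -nouser test before running the chown. A sketch with a temp directory standing in for the real tree (the chown user:group form is an assumption; plain `chown -R backuppc` as quoted also works):

```shell
top=$(mktemp -d)                      # stand-in for /var/lib/BackupPC
mkdir "$top/pc" "$top/cpool"
# Would list any entry whose owner UID has no passwd entry; on this
# fresh temp tree it prints nothing:
find "$top" -nouser
# The real fix, as root on the real tree:
#   chown -R backuppc:backuppc /var/lib/BackupPC/{pc,cpool,pool,trash}
status=done
echo "$status"
rm -rf "$top"
```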

On to the next problem. I expected to see all my backups so I could start 
restoring but there are none.
Then I remembered that I named the computer differently when I upgraded.
I see in /var/lib/BackupPC/pc there is a directory called steve which was the 
name of my old computer so I tried renaming that to stevesDesktop which is 
the new name. (I also have a stevesLaptop in case you were wondering) I 
restarted backuppc but I still don't see my old backups.

I noticed on the host status page that it knows about a host called 
stevesdesktop (all lower case) so I'll try renaming to that tonight.

Steve.


--
Introducing AppDynamics Lite, a free troubleshooting tool for Java/.NET
Get 100% visibility into your production application - at no cost.
Code-level diagnostics for performance bottlenecks with 2% overhead
Download for free and get started troubleshooting in minutes.
http://p.sf.net/sfu/appdyn_d2d_ap1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Restore after upgrade advice?

2013-04-29 Thread Steve

 backu...@kosowsky.org wrote: 
 Steve wrote at about 13:06:18 -0400 on Saturday, April 27, 2013:
   It's time for an upgrade. I've been putting this off for a long time but 
 my motherboard died and has been replaced so now is as good a time as any.
   I'm currently running Fedora 12 for which I have a full backup.
   I'm going to go to CentOS 6.4.
   
  Any gotchas to be aware of? 
   Any similar experiences anyone would care to share?
  I'm hoping it will be as simple as: install CentOS, set up BackupPC, and 
 restore my home directories.
   
   Thanks,
   Steve
 
 One of these days I will be doing the same thing... I am running
 Fedora 12 on an ancient P4 system (almost 11 years old)... I got sick
 of the every 6 month Fedora update where it would take the first month
 or two just to get it all stable... so I plan to switch to the latest
 Centos.
 
 I would love to hear about your experiences -- good, bad, and
 gotchas...
 
I had the same as you - F12 on a P4 system. F13 increased the required boot 
partition size and I didn't feel like reinstalling at the time. Then Fedora 
went off the rails with their GNOME3 experiment, which put me off completely. 
Finally, though, the caps gave out on my motherboard, so I bought a new 
motherboard, processor and memory. I also had to get a new optical drive since 
the new motherboard had no IDE interface.

Anyhow the installation of CentOS 6.4 went very smoothly. The only issue is 
that the networking doesn't start automatically. I have to press a button in 
the top menu bar. I'll look into that later.

I am having some issues setting up backuppc. First of all, backuppc is not in 
the standard CentOS repositories. I found a web page (don't have the URL at the 
moment) that described installing backuppc on CentOS 6.3 and got it installed. I 
set up apache from my old notes, but right now I can't get my browser to the 
backuppc admin page. It's not authenticating even though I know I have the user 
and password correct.

I didn't get to spend too much time on it so I'm not panicking yet.

Steve.


--
Try New Relic Now  We'll Send You this Cool Shirt
New Relic is the only SaaS-based application performance monitoring service 
that delivers powerful full stack analytics. Optimize and monitor your
browser, app,  servers with just a few lines of code. Try New Relic
and get this awesome Nerd Life shirt! http://p.sf.net/sfu/newrelic_d2d_apr
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Restore after upgrade advice?

2013-04-29 Thread Steve

 Les Mikesell lesmikes...@gmail.com wrote: 
 On Mon, Apr 29, 2013 at 9:15 AM, Steve zep...@cfl.rr.com wrote:
 
  Anyhow the installation of CentOS 6.4 went very smoothly. The only issue is 
  that the networking doesn't start automatically. I have to press a button 
  in the top menu bar. I'll look into that later.
 
 During install you are supposed to check the 'set up networking' box
 where you can set the options.  For a wired server you might want to
  just turn off NetworkManager and be sure you have the right things in
 the /etc/sysconfig/network-scripts/ifcfg-eth? file.
 
   I am having some issues setting up backuppc. First of all, backuppc is not 
   in the standard CentOS repositories. I found a web page (don't have the URL 
   at the moment) that described installing backuppc on CentOS 6.3 and got it 
   installed. I set up apache from my old notes but right now I can't get my 
   browser to the backuppc admin page. It's not authenticating even though I 
   know I have the user and password correct.
  
   I didn't get to spend too much time on it so I'm not panicking yet.
 
 Backuppc is in the EPEL repository.  You are likely to want other
 things from there too.

OK, so my authentication problem is solved. /etc/httpd/conf.d/BackupPC.conf had 
AuthUserFile set to /etc/BackupPC/apache.users but I had created it in 
/etc/httpd/passwd. I guess the preferred location has changed over the years. I 
modified the BackupPC.conf file to point to /etc/httpd/passwd and now I can get 
to the BackupPC server.
Any specific reason to rebuild the password file in the default location with 
the default name?
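For reference, the two pieces that have to agree are the Apache directive and the password file it names; this excerpt uses the paths from this thread (the AuthName text is illustrative):

```
# /etc/httpd/conf.d/BackupPC.conf (excerpt)
AuthType Basic
AuthName "BackupPC"
AuthUserFile /etc/httpd/passwd    # must be the file htpasswd actually wrote
Require valid-user
```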

Or I should say I could get to the server before I tried to configure the 
BackupPC server to point at my old backup.
The problem is that my backups are on an ext3 filesystem whereas my new CentOS 
6.4 is partitioned with ext4.

I can use tune2fs to convert ext3 to ext4, but I'm a little nervous about doing 
it. If it goes wrong, there go my backups.
On the other hand the backups are no use if I can't get to them.

Anyone done this before? The h/w the backups are on is an external 1T USB 
MyBook.

Thanks,
Steve


--
Introducing AppDynamics Lite, a free troubleshooting tool for Java/.NET
Get 100% visibility into your production application - at no cost.
Code-level diagnostics for performance bottlenecks with 2% overhead
Download for free and get started troubleshooting in minutes.
http://p.sf.net/sfu/appdyn_d2d_ap1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Restore after upgrade advice?

2013-04-29 Thread Steve

 backu...@kosowsky.org wrote: 
 Steve wrote at about 01:13:15 + on Tuesday, April 30, 2013:
   Or I should say I could get to the server before I tried to configure 
 the BackupPC server to point at my old backup.
   The problem is that my backups are on an ext3 filesystem whereas my new 
 CentOS 6.4 is partitioned with ext4.
   
   I can use tune2fs to convert ext3 to ext4 but I'm a little nervous about 
 doing it. If it goes wrong, there go my backups.
   On the other hand the backups are no use if I can't get to them.
   
   Anyone done this before? The h/w the backups are on is an external 1T USB 
 MyBook.
 
 I'm confused... if your backups are on an ext3 filesystem then just
 mount them ext3
 CentOS (like any *nix) can work with multiple filesystem types
 simultaneously - ext2/3/4, ntfs, reiserfs, vfat, solaris, hfs,
 etc. 
 
 In fact, 'mount' is usually able to figure out the filesystem type
 automatically... worst case, if it can't, specify -t ext3 on the
 command line or give the equivalent parameter in your fstab.
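Spelled out, the quoted advice is either a one-off mount or a persistent fstab entry (the device path here is hypothetical; use the real device or its UUID):

```
# one-off, as root:
#   mount -t ext3 /dev/sdb1 /var/lib/BackupPC
# or persistently, in /etc/fstab:
/dev/sdb1   /var/lib/BackupPC   ext3   defaults   0   2
```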
 
 Saying "my new CentOS 6.4 is partitioned with ext4" really makes
 no sense -- disks are partitioned, filesystems sit on individual
 partitions and are formatted, the actual distro (CentOS 6.4) is not
 partitioned nor does it have a fixed associated filesystem...
 
 Perhaps, though I am not understanding your issue...

Or perhaps I'm misinterpreting the error message. 

The external drive is mounted ext3. 
I created a symbolic link from /var/lib/BackupPC to /media/my external HD uuid
When I try to restart the backuppc service I get this error:

Starting BackupPC: 2013-04-29 20:37:21 Can't create a test hardlink between a 
file in /var/lib/BackupPC//pc and /var/lib/BackupPC//cpool.  Either these are 
different file systems, or this file system doesn't support hardlinks, or these 
directories don't exist, or there is a permissions problem, or the file system 
is out of inodes or full.  Use df, df -i, and ls -ld to check each of these 
possibilities. Quitting..

I'd run out of time and decided it must be a filesystem problem and I would 
look at it more tomorrow.

However...

# ls -l /var/lib/BackupPC
drwxr-x---. 18  495 root  4096 Mar 18 06:00 cpool
-rw-r--r--.  1 root root   145 Mar 19 22:00 extlinux.conf
-r--r--r--.  1 root root 32768 Mar 19 22:00 ldlinux.sys
drwx--.  2 root root 16384 Jul 16  2008 lost+found
-rw-r--r--.  1 root root 60928 Mar 19 22:00 menu.c32
drwxr-x---.  5  495 root  4096 Mar 17 19:31 pc
drwxr-x---.  2  495 root  4096 Jul 26  2008 pool
-rw-r--r--.  1 root root   145 Mar 19 22:00 syslinux.cfg
drwxr-x---.  2  495 root  4096 Mar 18 06:00 trash
-rw-r--r--.  1 root root 0 Mar 19 22:00 ubnfilel.txt
-rw-r--r--.  1 root root 0 Mar 19 22:00 ubnpathl.txt

Hmmm... there is no user 495. Perhaps that was the backuppc user on my old 
system.
Do cpool, pc etc have to be owned by backuppc?

Steve.

--
Introducing AppDynamics Lite, a free troubleshooting tool for Java/.NET
Get 100% visibility into your production application - at no cost.
Code-level diagnostics for performance bottlenecks with 2% overhead
Download for free and get started troubleshooting in minutes.
http://p.sf.net/sfu/appdyn_d2d_ap1
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] Restore after upgrade advice?

2013-04-27 Thread Steve
It's time for an upgrade. I've been putting this off for a long time but my 
motherboard died and has been replaced so now is as good a time as any.
I'm currently running Fedora 12 for which I have a full backup.
I'm going to go to CentOS 6.4.

Any gotchas to be aware of? 
Any similar experiences anyone would care to share?
I'm hoping it will be as simple as: install CentOS, set up BackupPC, and restore my 
home directories.

Thanks,
Steve

--
Try New Relic Now  We'll Send You this Cool Shirt
New Relic is the only SaaS-based application performance monitoring service 
that delivers powerful full stack analytics. Optimize and monitor your
browser, app,  servers with just a few lines of code. Try New Relic
and get this awesome Nerd Life shirt! http://p.sf.net/sfu/newrelic_d2d_apr
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Per-PC pools

2013-03-15 Thread Steve
On Fri, Mar 15, 2013 at 11:44 AM, Tyler J. Wagner ty...@tolaris.com wrote:
 On 2013-03-15 15:27, Tyler J. Wagner wrote:
 I am also not happy with the snarky responses lately.

I understand both points of view - while this is supposed to be the
users list, it has become the list for just about everything
BackupPC.  So coding and development questions (and people knowledgeable
on those) regularly appear alongside the guru users.  It's easy to get
annoyed with "stupid" development/coding questions when you just want the
person asking the question to use backuppc as it was designed (and
vice versa).

I forgive everyone, and appreciate all the time you put into backuppc :)

-- 
The fun parts of life are mostly optional.

--
Everyone hates slow websites. So do we.
Make your web apps faster with AppDynamics
Download AppDynamics Lite for free today:
http://p.sf.net/sfu/appdyn_d2d_mar
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] rsync never starts transferring files (but does something)

2012-11-15 Thread Steve
On Thu, Nov 15, 2012 at 12:22 PM, Markus unive...@truemetal.org wrote:
 Any suggestions on what I could do or what could go wrong here?

Since things are working well on all your other machines, try a backup
on the trouble machine of just one directory (or a few) and see if
that works normally.  If it does, your guess about the large size of
the backup makes sense; just break that client up into multiple
separate backups and you should be fine.  Backuppc can still consider
it all under one client if you setup aliasing correctly.

A.


-- 
The fun parts of life are mostly optional.

--
Monitor your physical, virtual and cloud infrastructure from a single
web console. Get in-depth insight into apps, servers, databases, vmware,
SAP, cloud infrastructure, etc. Download 30-day Free Trial.
Pricing starts from $795 for 25 servers or applications!
http://p.sf.net/sfu/zoho_dev2dev_nov
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] rsync never starts transferring files (but does something)

2012-11-15 Thread Steve
On Thu, Nov 15, 2012 at 1:14 PM, Markus unive...@truemetal.org wrote:
 Your suggestion sounds great. I just found this small how-to on a forum.
 Is this how it works or is there another/better way?

 Create as many client names as you like, eg: client-share1,
 client-share2, client-share3, client-share4 (replace client with the
 real host name and share with the share names). In each
 pc/client-xxx/config.pl file, use;

 $Conf{ClientNameAlias} = 'client';

 (where client is the real host name). Add any other client-specific
 settings (eg: share name). This way all 4 virtual clients will refer to
 the same real client. All backups happen independently.
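Written out, each virtual client's config file is tiny. Host and share names below are hypothetical; $Conf{RsyncShareName} is the standard setting for picking which rsyncd module that virtual client backs up:

```
# pc/client-share1/config.pl
$Conf{ClientNameAlias} = 'client';     # the real host to contact
$Conf{RsyncShareName}  = ['share1'];   # only this rsyncd module
```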

That's what I meant.  Good luck!

A.

-- 
The fun parts of life are mostly optional.

--
Monitor your physical, virtual and cloud infrastructure from a single
web console. Get in-depth insight into apps, servers, databases, vmware,
SAP, cloud infrastructure, etc. Download 30-day Free Trial.
Pricing starts from $795 for 25 servers or applications!
http://p.sf.net/sfu/zoho_dev2dev_nov
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Copy host backups from one BackupPC server to another

2012-07-30 Thread Steve
If you had space somewhere and it is only a few old machines...it
wouldn't be too much trouble to just restore them someplace and then
let the new server back up the restored directories...that way you get
to keep the pooling advantages.
A.

On Mon, Jul 30, 2012 at 7:04 AM, Jonas Meurer jo...@freesources.org wrote:
 Hello,

 On 30.07.2012 10:55, Tim Fletcher wrote:
 On 30/07/12 09:43, Jonas Meurer wrote:
 what's the prefered way to copy/move backups of one or more hosts
 from
 one BackupPC instance to another?

 Normal solutions are:

 1. Copy the raw disks from the old server to the new server (eg via
 dd and ssh or netcat)

 2. Leave the old server running and setup a new one, allow the new
 one to populate and retire the old server when the archives are no
 longer needed

 Unfortunately, neither of those solutions works for me. I set up a new
 backup-server which replaces the old one. For all active hosts, I did it
 exactly as your second solution suggests. By now, the new backup-server
 has completed more than one full backup cycle of all active hosts.

 But my problem is with inactive hosts. I have some old hosts which
 aren't available anymore, but for which I keep the last backups in the
 system. I'd like to migrate these backups to the new backup server as
 well, as the old one will be powered off in near future.

 I cannot believe that it's impossible to move backups of hosts from one
 BackupPC instance to another. And actually I believe that
 BackupPC_tarPCCopy is the right tool to do so. But I don't get how to
 use it.
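As I understand the BackupPC docs (unverified here), the BackupPC_tarPCCopy route is: copy the cpool to the new server first (e.g. with rsync -aH), then run BackupPC_tarPCCopy over the old pc tree and extract its output with tar -xPf - in the new pc directory. It works because tar records hardlinks, as this small demo shows:

```shell
# A hardlinked pair survives a trip through tar: after extraction,
# both names still share one inode (link count 2).
src=$(mktemp -d); dst=$(mktemp -d)
echo data > "$src/a"
ln "$src/a" "$src/b"                  # a and b now share an inode
tar -C "$src" -cf - . | tar -C "$dst" -xf -
links=$(stat -c %h "$dst/a")          # hardlink count after extraction
echo "$links"
rm -rf "$src" "$dst"
```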

 Regards,
   jonas


 --
 Live Security Virtual Conference
 Exclusive live event will cover all the ways today's security and
 threat landscape has changed and how IT managers can respond. Discussions
 will include endpoint security, mobile security and the latest in malware
 threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
 Wiki:http://backuppc.wiki.sourceforge.net
 Project: http://backuppc.sourceforge.net/



-- 
The fun parts of life are mostly optional.

--
Live Security Virtual Conference
Exclusive live event will cover all the ways today's security and 
threat landscape has changed and how IT managers can respond. Discussions 
will include endpoint security, mobile security and the latest in malware 
threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] archive host tar and hardlink problems

2012-06-18 Thread Steve Kieu
 under the rootfs folder.


 When the target is a linux system, why not just use rsync over ssh?



Because I do not want to spend CPU time on ssh. And why not rsyncd for a
local LAN? That is the way to do it when encryption is not needed.


Backuppc hardlinks all identical files for its own storage - which is
 most of the point of using it.   Normal tar will only include
 hardlinks as links in its archive if it has the content earlier in the
 stream, so I'm not sure what is going on here.   I haven't used the
 rsyncd setup in combination with archivehost so I'm not sure what to
 expect there.  If the module name is included as a top level directory
 in the tar output, I would have expected extraction to have created it
 and pushed everything else down a level.  Is there something in the


Yes, it created the rootfs dir and put everything down a level into that
folder. I extract using a normal tar xzf.



 way you extract that would prevent that?Does this scenario happen
 in any other case?   Maybe it is somehow confused by the module name
 and a directory being named the same, or by what happens to '/' as the
 real top level when it is converted to a relative path in extraction.


The directory name (rootfs) is generated by backuppc - I think it is based on
the rsyncd module name (if I name it differently, the folder name
changes as well).

When doing a restore and selecting download as a tar archive, all is fine: the
tar archive has no hardlinks in it, and I can extract it correctly. I noticed
that there is no top-level folder rootfs anymore.

So somewhere, when doing an archive, it must add the top folder (judging from
the generated folder name rootfs), put everything under it, and that
confused tar.

Thanks





 --
   Les Mikesell
  lesmikes...@gmail.com


 --
 Live Security Virtual Conference
 Exclusive live event will cover all the ways today's security and
 threat landscape has changed and how IT managers can respond. Discussions
 will include endpoint security, mobile security and the latest in malware
 threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
 Wiki:http://backuppc.wiki.sourceforge.net
 Project: http://backuppc.sourceforge.net/




-- 
Steve Kieu
--
Live Security Virtual Conference
Exclusive live event will cover all the ways today's security and 
threat landscape has changed and how IT managers can respond. Discussions 
will include endpoint security, mobile security and the latest in malware 
threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] archive host tar and hardlink problems

2012-06-18 Thread Steve Kieu
 Does anyone choose to deal with this by simply specifying no encryption as
 an ssh option?


I have heard that ssh no longer supports the option cipher=none, but I will
recheck - last time it did not work for me.





 Shawn


 --
 Live Security Virtual Conference
 Exclusive live event will cover all the ways today's security and
 threat landscape has changed and how IT managers can respond. Discussions
 will include endpoint security, mobile security and the latest in malware
 threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
 Wiki:http://backuppc.wiki.sourceforge.net
 Project: http://backuppc.sourceforge.net/




-- 
Steve Kieu
--
Live Security Virtual Conference
Exclusive live event will cover all the ways today's security and 
threat landscape has changed and how IT managers can respond. Discussions 
will include endpoint security, mobile security and the latest in malware 
threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] archive host tar and hardlink problems

2012-06-18 Thread Steve Kieu
 III days, but really, what else does your CPU have to do during the
 backup time?  You are mostly disk bound anyway -  unless maybe you are
 using IDE or USB drives that need CPU for the I/O work.


We have a server (nagios) that constantly has a load of around 15 to 20 -
only two cores. Before, we used rsync over ssh, and the backup usually took
forever to finish; that is no good, as a nightly backup would last the whole
next day, and sometimes it still had not finished by the next nightly
backup.

I therefore changed everything to use an rsyncd module, and backuppc has been
happy ever since.

Nevertheless, that problem needs to be fixed anyway - I hope I have some time
this weekend to look at the backuppc source code and see what I can do
about it. In the meantime, restore from a tar archive works, so it is
not a show stopper ...

Cheers






 --
   Les Mikesell
 lesmikes...@gmail.com


 --
 Live Security Virtual Conference
 Exclusive live event will cover all the ways today's security and
 threat landscape has changed and how IT managers can respond. Discussions
 will include endpoint security, mobile security and the latest in malware
 threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
 Wiki:http://backuppc.wiki.sourceforge.net
 Project: http://backuppc.sourceforge.net/




-- 
Steve Kieu
--
Live Security Virtual Conference
Exclusive live event will cover all the ways today's security and 
threat landscape has changed and how IT managers can respond. Discussions 
will include endpoint security, mobile security and the latest in malware 
threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] archive host tar and hardlink problems

2012-06-18 Thread Steve Kieu
I think the problem is that your rsync module name doesn't match the
 mountpoint.  I don't know if that is even possible for '/'.  But the
 error message still does not make sense to me.


I cannot use / as an rsyncd module name, so I must use something else.
I would like to emphasize that using an rsyncd module is perfectly legitimate
and good practice. The bug needs to be fixed - I will do it when I can
prioritize my time, though.

cheers




 --
Les Mikesell
 lesmikes...@gmail.com


 --
 Live Security Virtual Conference
 Exclusive live event will cover all the ways today's security and
 threat landscape has changed and how IT managers can respond. Discussions
 will include endpoint security, mobile security and the latest in malware
 threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
 Wiki:http://backuppc.wiki.sourceforge.net
 Project: http://backuppc.sourceforge.net/




-- 
Steve Kieu
--
Live Security Virtual Conference
Exclusive live event will cover all the ways today's security and 
threat landscape has changed and how IT managers can respond. Discussions 
will include endpoint security, mobile security and the latest in malware 
threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] archive host tar and hardlink problems

2012-06-16 Thread Steve Kieu
 What is the ./rootfs/ directory, and why is that not the place it is
 trying to write?   Do you actually have a hardlinked structure like
 that on the backup target?


I use rsyncd, and the module name for path = / is rootfs. Then I back up the
whole root (with some excludes, of course) using backuppc. Backuppc writes it
under the rootfs folder.

When doing an archive, backuppc puts all files under / into a folder rootfs.

No, the backup target is a normal CentOS 6 system, so there are no hardlinks
at all for things like /sbin/e2fsck, etc. When backuppc does the backup, since
there are two nearly identical target systems, it makes hardlinks for the
second one, I guess to save disk space. But when doing an archive it should
dereference the hardlinks -



--
   Les Mikesell
 lesmikes...@gmail.com


 --
 Live Security Virtual Conference
 Exclusive live event will cover all the ways today's security and
 threat landscape has changed and how IT managers can respond. Discussions
 will include endpoint security, mobile security and the latest in malware
 threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
 Wiki:http://backuppc.wiki.sourceforge.net
 Project: http://backuppc.sourceforge.net/




-- 
Steve Kieu


Re: [BackupPC-users] archive host tar and hardlink problems

2012-06-14 Thread Steve Kieu
 I don't think I've ever seen that.  Is there some simple way to reproduce
 it?



Yes, just choose the archive host, select a host, and archive it; choose tar.gz
as the format and type the path to the file. Then move the file to some other
box, probably a different OS than the archived host (in my case the archived
host is RHEL 5 and I moved it to my Ubuntu desktop). Then extract it using the
tar xf command.

For now I have to use the restore option and download as a tar archive,
which works.





 --
  Les Mikesell
 lesmikes...@gmail.com


 --
 Live Security Virtual Conference
 Exclusive live event will cover all the ways today's security and
 threat landscape has changed and how IT managers can respond. Discussions
 will include endpoint security, mobile security and the latest in malware
 threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
 ___
 BackupPC-users mailing list
 BackupPC-users@lists.sourceforge.net
 List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
 Wiki:http://backuppc.wiki.sourceforge.net
 Project: http://backuppc.sourceforge.net/




-- 
Steve Kieu


Re: [BackupPC-users] Update on BackupPC Development / Is backuppc dead?

2012-06-14 Thread Steve
On Thu, Jun 14, 2012 at 4:21 PM, Philipp Raschdorff
p.raschdo...@googlemail.com wrote:
 Questions:
 1) Is BackupPC development dead?

Just because a piece of software doesn't need to be patched every two
weeks doesn't mean no one is working on it :)

My novice understanding is that the next release is a whopper which
fundamentally changes many things about how BackupPC works...so it's
taking a long time.  The current version, in my opinion, enjoys
excellent support on this list.

 2) Windows 7 and BackupPC ... will it work?

Yes.

A.

-- 
The fun parts of life are mostly optional.



Re: [BackupPC-users] archive host tar and hardlink problems

2012-06-14 Thread Steve Kieu
Hello


That has always failed for me.

  For now I have to use the restore options and download as tar archive,
 which
  works.

 I think both approaches should do the same thing.  And the same for
 running Backuppc_tarCreate from the command line.  Not sure where to
 start to debug it, though.   Most oddball bugs are from having some
 old/wrong perl module on the system somewhere.   Have you installed
 any CPAN modules instead of .deb packaged versions?


It is running on an up-to-date Red Hat 5 system:

cat /etc/redhat-release
Red Hat Enterprise Linux Server release 5.8 (Tikanga)

[root@backup01 ~]# rpm -qa|grep perl
perl-DBI-1.52-2.el5
perl-Compress-Zlib-1.42-1.fc6
perl-libwww-perl-5.805-1.1.1
perl-BSD-Resource-1.28-1.fc6.1
newt-perl-1.08-9.2.2
perl-XML-Parser-2.34-6.1.2.2.1
perl-String-CRC32-1.4-2.fc6
perl-Convert-ASN1-0.20-1.1
mod_perl-2.0.4-6.el5
perl-Archive-Zip-1.16-1.2.1
perl-HTML-Tagset-3.10-2.1.1
perl-DBD-MySQL-3.0007-2.el5
perl-5.8.8-38.el5
perl-suidperl-5.8.8-38.el5
perl-URI-1.35-3
perl-HTML-Parser-3.55-1.fc6

I have just been able to reproduce it with a completely new test BackupPC
server on CentOS 6:

[root@catch9log tmp]# cat /etc/redhat-release
CentOS release 6.2 (Final)

[root@catch9log tmp]# rpm -qa|grep perl
perl-suidperl-5.10.1-119.el6_1.1.x86_64
perl-Params-Validate-0.92-3.el6.x86_64
perl-Compress-Zlib-2.020-119.el6_1.1.x86_64
perl-version-0.77-119.el6_1.1.x86_64
perl-XML-Parser-2.36-7.el6.x86_64
perl-CGI-3.51-119.el6_1.1.x86_64
perl-DateTime-Format-Mail-0.3001-6.el6.noarch
perl-Compress-Raw-Zlib-2.023-119.el6_1.1.x86_64
perl-URI-1.40-2.el6.noarch
perl-IO-Compress-Zlib-2.020-119.el6_1.1.x86_64
perl-Module-Pluggable-3.90-119.el6_1.1.x86_64
perl-Archive-Zip-1.30-2.el6.noarch
perl-Pod-Simple-3.13-119.el6_1.1.x86_64
perl-File-RsyncP-0.68-6.el6.x86_64
perl-5.10.1-119.el6_1.1.x86_64
perl-HTML-Tagset-3.20-4.el6.noarch
perl-libwww-perl-5.833-2.el6.noarch
perl-Time-modules-2006.0814-5.el6.noarch
perl-Net-FTP-RetrHandle-0.2-3.el6.noarch
perl-List-MoreUtils-0.22-10.el6.x86_64
perl-DateTime-Format-W3CDTF-0.04-8.el6.noarch
perl-XML-RSS-1.45-2.el6.noarch
perl-IO-Compress-Base-2.020-119.el6_1.1.x86_64
perl-Pod-Escapes-1.04-119.el6_1.1.x86_64
perl-libs-5.10.1-119.el6_1.1.x86_64
perl-HTML-Parser-3.64-2.el6.x86_64
perl-Class-Singleton-1.4-6.el6.noarch
perl-DateTime-0.5300-1.el6.x86_64
perl-Net-FTP-AutoReconnect-0.3-3.el6.noarch

I did not install any CPAN modules on the test server.

It has only two hosts to back up (for testing purposes), both CentOS 6 systems.
Do a full dump of host1 first, wait for it to finish, then dump host2, and then
archive host2. After that, extract the archive using tar from my Ubuntu 10.10
MacBook:

tar: ./rootfs/usr/sbin/named: Cannot hard link to `usr/sbin/lwresd': No
such file or directory
tar: ./rootfs/sbin/fsck.ext2: Cannot hard link to `sbin/e2fsck': No such
file or directory
tar: ./rootfs/sbin/fsck.ext3: Cannot hard link to `sbin/e2fsck': No such
file or directory
tar: ./rootfs/sbin/fsck.ext4: Cannot hard link to `sbin/e2fsck': No such
file or directory
tar: ./rootfs/sbin/fsck.ext4dev: Cannot hard link to `sbin/e2fsck': No such
file or directory
tar: ./rootfs/sbin/mkfs.ext2: Cannot hard link to `sbin/mke2fs': No such
file or directory
tar: ./rootfs/sbin/mkfs.ext3: Cannot hard link to `sbin/mke2fs': No such
file or directory
tar: ./rootfs/sbin/mkfs.ext4: Cannot hard link to `sbin/mke2fs': No such
file or directory
tar: ./rootfs/sbin/mkfs.ext4dev: Cannot hard link to `sbin/mke2fs': No such
file or directory
tar: ./rootfs/sbin/tune2fs: Cannot hard link to `sbin/e2label': No such
file or directory


Obviously it is reproducible here.

BackupPC installed from BackupPC-3.2.1.tar.gz into /usr/local/BackupPC


cheers,





 --
   Les Mikesell
  lesmikes...@gmail.com






-- 
Steve Kieu

[BackupPC-users] archive host tar and hardlink problems

2012-06-11 Thread Steve Kieu
Hello everyone,

I used the archive host feature and archived one host to a file. When
extracting the file I saw many errors like:

tar: ./rootfs/usr/lib/locale/fr_FR/LC_TELEPHONE: Cannot hard link to
`usr/lib/locale/br_FR/LC_TELEPHONE': No such file or directory

I guess when BackupPC runs the tar command it does not specify the
--hard-dereference option, so hard links are stored as links instead of the
actual file content. The system where I do the extraction does not have the
target of the link.

Is this intentional? If so, I would argue that the archive dump is not complete
and usable as offline storage, since it is not a self-contained file system. I
am currently testing the Restore option (downloading the tar restore file) to
see if it behaves the same way.

Is there any option to fix that?
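The failure mode described can be reproduced with GNU tar alone, independent of BackupPC. The sketch below (temporary paths, GNU tar assumed) shows that extracting only the hard-linked member fails unless the archive was created with --hard-dereference:

```shell
set -e
d=$(mktemp -d); cd "$d"
mkdir src && echo hello > src/a && ln src/a src/b   # b is a hard link to a

# Default: the second entry with the same inode is stored as a hard link to a.
tar -cf plain.tar -C src a b
# --hard-dereference: b is stored with its full content instead.
tar -cf deref.tar --hard-dereference -C src a b

mkdir out1 out2
tar -xf deref.tar -C out2 b                 # works: b is a regular file
tar -xf plain.tar -C out1 b 2>err.txt || true
grep 'Cannot hard link' err.txt             # same error as in the report
```

Extracting member b from plain.tar fails because its link target a is not present in the extraction directory, which is exactly the situation when a BackupPC archive is unpacked on a different system.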

Regards,




-- 
Steve Kieu


[BackupPC-users] Strange behaviour when download the restore tar archive

2012-05-18 Thread Steve Kieu
Hi all,

Today I did a test to see if I can restore the whole root / file system from
BackupPC. So I went to the CGI interface, browsed the host, and browsed the
/ path. I clicked the select-all checkbox and saw all folders were selected,
clicked Restore, and then chose download tar.

It started to download and reported that it was done with a size of only ~120M.
Checking inside the tarball, the whole content of /usr, /sbin, and several
other folders is empty. I can browse /usr using the web CGI, though.

The second time I did not click the select-all checkbox, but only checked the
folder /usr and did the same. Now it is downloading 580M already and
continuing...

Why is that? Is it a bug? Do I have to download each folder at the root level
separately?

Right, it has finished the usr folder. Checking inside, /usr/sbin is empty, so
even with that it still misses files.





-- 
Steve Kieu


Re: [BackupPC-users] Strange behaviour when download the restore tar archive

2012-05-18 Thread Steve Kieu
Sorry, false alarm. It is because of my proxy server; it probably has a bug.
Turning off the proxy in the browser fixed the problem.



On Sat, May 19, 2012 at 12:23 PM, Steve Kieu msh.comput...@gmail.comwrote:

 Hi all,

 Today I did a test to see if I can restore the whole root / file system
 from backup pc. SO go to the cgi interface, browse the host and browse the
 / path. Click select all check box and saw all folders are seelcted. Click
 Restore and then choose download tar.

 It started to download and inform that it is done with size only ~120M .
 Check inside the tar abll, the whole content of /usr ; /sbin and several
 folders are empty. I can browser /usr using the web cgi ...

 The second time I do not click select all check box, but only check the
 folder /usr and do the same . NOw it is downloading 580M already and
 continuing...

 Why is that? is it a bug? Should I have to download for each foder at the
 root level ?

 right it is done the usr folder. Check inside the /usr/sbin is empty so
 even with that it still miss files inside.





 --
 Steve Kieu




-- 
Steve Kieu


[BackupPC-users] Change Xfer type from ssh to rsyncd and can not browse backup

2012-05-15 Thread Steve Kieu
Hello everyone,

I have some hosts configured before using ssh (rsync over ssh), and the mount
point is / (root). It was backing up fine.

Now I changed to using an rsyncd module: I created a module named rootfs with
path / on the client. Then, from the existing host config, I clicked edit and
changed to rsyncd, using rootfs as the rsync share name. All good; the backup
runs fine.

But I can no longer browse the new run (the old run is still browsable,
though). The error is:

Error: Can't browse bad directory name /

If I go to the BackupPC pool, go to pc/HOSTNAME/RUNNUMBER, and rename the
folder named frootfs to the same name as in the runs before, which is f%2f,
I can browse it again.

It does not happen with a completely new host using rsyncd.

How can I fix it? Please help.

Many thanks in advance.
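The rename works because of how BackupPC mangles share names into on-disk directory names: each name is prefixed with f, and characters such as % and / are percent-encoded, so the share "/" becomes f%2f while the rsyncd module name rootfs becomes frootfs. A small shell sketch of that scheme (an illustration, not BackupPC's actual Perl code):

```shell
mangle() {
  # Prefix 'f' and percent-encode '%' and '/' (encode '%' first so the
  # replacement characters are not re-encoded).
  printf 'f%s\n' "$(printf '%s' "$1" | sed -e 's/%/%25/g' -e 's,/,%2f,g')"
}
mangle /        # a share named "/" is stored as f%2f
mangle rootfs   # an rsyncd module named "rootfs" is stored as frootfs
```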







-- 
Steve Kieu


Re: [BackupPC-users] TIMEOUT on SMB causes missed files and undetected partial.

2012-02-25 Thread Steve
On Wed, Feb 22, 2012 at 10:09 AM, Pedro M. S. Oliveira 
pmsolive...@gmail.com wrote:

 Hello,
 I would advise you to use the rsync method on Windows as it's less tricky
 than smb.
 There's an how to at
 www.linux-geex.com
 Just search for backuppc on the search field.
 Cheers
 Pedro Oliveira
 On Feb 22, 2012 2:29 PM, Steve jellosk...@gmail.com wrote:



 On Mon, Feb 13, 2012 at 2:07 PM, Les Mikesell lesmikes...@gmail.comwrote:

 On Mon, Feb 13, 2012 at 1:33 PM, Steve jellosk...@gmail.com wrote:

 I am backing a Windows XP PC that is remote (Internet) and it takes
 several hours. I am selectively backing up a small amount of data.
 Somewhere in the middle of the backup, I start getting lines like these:

 NT_STATUS_CONNECTION_INVALID listing \ramu\*

 NT_STATUS_CONNECTION_INVALID listing \temp\*

 NT_STATUS_CONNECTION_INVALID listing \temp_wks_5418\*

 Once I get these lines, they seem to be continues for every subfolder and 
 the files do not get backed up.

 Additionally, those lines are always preceded by a line like this:

 Error reading file 
 \backup\Canon_2GB_SDC\dvd-backup\sees\msp430\MSP430_Console_IAR_STD\timerb.c
  : NT_STATUS_IO_TIMEOUT

 Which is sometimes a large file but often a small file.

 The worst part of this is that the backup misses the remaining data AND 
 gets flagged as Full and not Partial. Therefore Backuppc won't try to 
 continue or do any more with that PC until next week. See the 14917 Xfer 
 errs below.


 I've seen that happen where there was a duplex mismatch between the
 target host and its switch connection (one configured for 'full', one for
 auto, but  full means don't negotiate and auto means assume half if
 there is no negotiation).  Any bad network connection could probably cause
 it.  I'd agree that it should probably be interpreted as a more fatal error.

 --
Les Mikesell
   lesmikes...@gmail.com


 No new news on this. It still happens and I haven't been able to figure
 it out. The network topology is this:

 Windows PC  - (10.10.0.?) SonicWallVPN Client - Internet - SonicWall
 VPN Router (10.10.0.1) - BackupPC on Linux-x64 (10.10.0.12)




Thanks. I will check timeouts and also check on using rsync. I think rsync
would also do better for continuing partials, correct?
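For the timeout side, the relevant knob is $Conf{ClientTimeout} in config.pl, or in a per-host override file. A hedged sketch; the host name and value are examples, not recommendations:

```perl
# Per-host override, e.g. pc/HOST.pl:
# seconds of Xfer inactivity before the backup is aborted
$Conf{ClientTimeout} = 72000;
```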


Re: [BackupPC-users] TIMEOUT on SMB causes missed files and undetected partial.

2012-02-22 Thread Steve
On Mon, Feb 13, 2012 at 2:07 PM, Les Mikesell lesmikes...@gmail.com wrote:

 On Mon, Feb 13, 2012 at 1:33 PM, Steve jellosk...@gmail.com wrote:

 I am backing a Windows XP PC that is remote (Internet) and it takes
 several hours. I am selectively backing up a small amount of data.
 Somewhere in the middle of the backup, I start getting lines like these:

 NT_STATUS_CONNECTION_INVALID listing \ramu\*

 NT_STATUS_CONNECTION_INVALID listing \temp\*

 NT_STATUS_CONNECTION_INVALID listing \temp_wks_5418\*

 Once I get these lines, they seem to be continues for every subfolder and 
 the files do not get backed up.

 Additionally, those lines are always preceded by a line like this:

 Error reading file 
 \backup\Canon_2GB_SDC\dvd-backup\sees\msp430\MSP430_Console_IAR_STD\timerb.c 
 : NT_STATUS_IO_TIMEOUT

 Which is sometimes a large file but often a small file.

 The worst part of this is that the backup misses the remaining data AND gets 
 flagged as Full and not Partial. Therefore Backuppc won't try to 
 continue or do any more with that PC until next week. See the 14917 Xfer 
 errs below.


 I've seen that happen where there was a duplex mismatch between the target
 host and its switch connection (one configured for 'full', one for auto,
 but  full means don't negotiate and auto means assume half if there is
 no negotiation).  Any bad network connection could probably cause it.  I'd
 agree that it should probably be interpreted as a more fatal error.

 --
Les Mikesell
   lesmikes...@gmail.com


No new news on this. It still happens and I haven't been able to figure it
out. The network topology is this:

Windows PC  - (10.10.0.?) SonicWallVPN Client - Internet - SonicWall VPN
Router (10.10.0.1) - BackupPC on Linux-x64 (10.10.0.12)


Re: [BackupPC-users] Full backup locks up computer.

2012-02-21 Thread Steve Blackwell
On Thu, 16 Feb 2012 10:12:23 -0500
Steve Blackwell zep...@cfl.rr.com wrote:

8  snip

 The problem I'm having is that whenever I try to do a full backup, the
 computer locks up. There are no messages in any of the logs to
 indicate what might have caused the problem. Interestingly,
 incremental backups work OK. 

8  snip

Huh! After months of having this problem, last night it did a full backup,
unattended, with no errors. Weird.

Steve

-- 
Changing lives one card at a time

http://www.send1cardnow.com




Re: [BackupPC-users] Remote Mgmt

2012-02-17 Thread Steve Willoughby
On 17-Feb-12 12:35, Les Mikesell wrote:
On Fri, Feb 17, 2012 at 2:24 PM, Zach Lanich zlan...@gmail.com wrote:
 How do I access my backuppc interface from outside the local
 network? I have a webserver set up through isp config and I can get
 to my website I built but isp gives me a 500 error when i try to
 access backuppc.


 The packaged apache config probably restricts access to the local host.
 In the EPEL rpm that would be in /etc/httpd/conf.d/BackupPC.conf, but
 the ubuntu version might be somewhere else.   If you can find it, change
 it to allow from all and restart apache (if you are sure you want to do
 that...).

If you are sure you want to do that...

Re-read Les' last sentence a few times and let it really sink in before 
going further with this.

BackupPC has essentially root-level access to all your backed-up 
systems, and certainly has access to all their file data (and can 
restore back onto them, probably).

Do you *really* feel confident that the way you log in to BackupPC over 
the web is secure?  Using only HTTPS?  Checking certificates properly? 
Even with that, you're 100% sure your webserver can't be compromised 
from the outside?

If you really need this, perhaps a better thing to do would be to SSH in 
to the host and set up a tunnel over that SSH connection to reach your 
BackupPC server.
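Such a tunnel could look like the following (user, host, and local port are placeholders):

```shell
# Forward local port 8080 to the web server on the BackupPC host, then
# browse http://localhost:8080/backuppc/ over the encrypted tunnel.
ssh -N -L 8080:localhost:80 user@backuppc.example.com
```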

-- 
Steve Willoughby / st...@alchemy.com
A ship in harbor is safe, but that is not what ships are built for.
PGP Fingerprint 4615 3CCE 0F29 AE6C 8FF4 CA01 73FE 997A 765D 696C



[BackupPC-users] Full backup locks up computer.

2012-02-16 Thread Steve Blackwell
I have a fairly old computer, a ~6yr old dual 3.4GHz Pentium 4 that is
running Fedora 12. It's (past) time for an upgrade. I want to do a clean
install, as the requirements for boot partition size have increased, and so I
need a good complete backup before I start.

The problem I'm having is that whenever I try to do a full backup, the
computer locks up. There are no messages in any of the logs to indicate
what might have caused the problem. Interestingly, incremental backups
work OK. Here's some output from the BackupPc log.

2012-02-16 01:00:00 Next wakeup is 2012-02-16 02:00:00
2012-02-16 01:00:10 Started full backup on steve (pid=5106, share=/)
2012-02-16 01:18:15 Finished  admin1  (BackupPC_nightly 128 255)
2012-02-16 01:18:19 BackupPC_nightly now running BackupPC_sendEmail
2012-02-16 01:18:42 Finished  admin  (BackupPC_nightly -m 0 127)
2012-02-16 01:18:42 Pool nightly clean removed 0 files of size 0.00GB
2012-02-16 01:18:42 Pool is 0.00GB, 0 files (0 repeated, 0 max chain, 0 max links), 1 directories
2012-02-16 01:18:42 Cpool nightly clean removed 0 files of size 0.00GB
2012-02-16 01:18:42 Cpool is 181.90GB, 799760 files (63 repeated, 27 max chain, 2403 max links), 4369 directories
2012-02-16 09:06:56 Reading hosts file
2012-02-16 09:06:56 BackupPC started, pid 4161
2012-02-16 09:06:57 Running BackupPC_trashClean (pid=4163)
2012-02-16 09:06:57 Next wakeup is 2012-02-16 10:00:00
2012-02-16 09:13:08 User steve requested backup of steve (steve)
2012-02-16 09:13:10 Started incr backup on steve (pid=4235, share=/)
2012-02-16 09:39:13 Finished incr backup on steve 
2012-02-16 09:39:13 Running BackupPC_link steve (pid=4373) 
2012-02-16 09:39:25 Finished steve (BackupPC_link steve) 
2012-02-16 10:00:01 Next wakeup is 2012-02-16 11:00:00

You can see that the full backup was initiated at 1am. At 9am the computer was
completely locked: screen on (i.e. the screensaver had not kicked in), no
mouse or keyboard response, so I rebooted and forced an incremental. I thought
it might be a problem with the HD, but I forced an e2fsck on boot and there
was no change.

Ideas?

Thanks,
Steve 

PS. I'm using tar as the backup method and full backups used to work.
-- 
Changing lives one card at a time

http://www.send1cardnow.com




[BackupPC-users] TIMEOUT on SMB causes missed files and undetected partial.

2012-02-13 Thread Steve
I am backing up a Windows XP PC that is remote (over the Internet), and it
takes several hours. I am selectively backing up a small amount of data.
Somewhere in the middle of the backup, I start getting lines like these:

NT_STATUS_CONNECTION_INVALID listing \ramu\*

NT_STATUS_CONNECTION_INVALID listing \temp\*

NT_STATUS_CONNECTION_INVALID listing \temp_wks_5418\*

Once I get these lines, they seem to continue for every subfolder, and the
files do not get backed up.

Additionally, those lines are always preceded by a line like this:

Error reading file
\backup\Canon_2GB_SDC\dvd-backup\sees\msp430\MSP430_Console_IAR_STD\timerb.c
: NT_STATUS_IO_TIMEOUT

Which is sometimes a large file but often a small file.

The worst part of this is that the backup misses the remaining data
AND gets flagged as Full and not Partial. Therefore Backuppc won't
try to continue or do any more with that PC until next week. See the
14917 Xfer errs below.

[Backup summary table for host nalini: backup #0 (full), with XferLOG and
Errors links; 14917 Xfer errors]

Thoughts?


Re: [BackupPC-users] Backuppc is Hammering My Clients

2012-01-31 Thread Steve
 On Tue, Jan 31, 2012 at 5:25 PM, Kimball Larsen quang...@gmail.com wrote:
 Is there anything I can to to have the backups run in a more transparent 
 manner?  We are not all that concerned with speed of backup process - we're 
 all here all day anyway, so as long as everyone gets a backup at least once 
 a day we're happy.

check the nice level during the backup (on the client) and see where
rsync is running.  If it's the same as the other user processes, maybe
change the $Conf{RsyncClientCmd} to include a nice level?  I know that
is suggested here:
http://backuppc.sourceforge.net/faq/ssh.html
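The FAQ's suggestion amounts to wrapping the remote rsync in nice; a hedged sketch of such a config.pl entry (the exact default command varies by BackupPC version, and the nice level is an example):

```perl
# Illustrative -- runs the client-side rsync under "nice -n 19".
$Conf{RsyncClientCmd} = '$sshPath -q -x -l root $host nice -n 19 $rsyncPath $argList+';
```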

A.

-- 
The universe is probably littered with the one-planet graves of
cultures which made the sensible economic decision that there's no
good reason to go into space--each discovered, studied, and remembered
by the ones who made the irrational decision. - Randall Munroe



Re: [BackupPC-users] backuppc incremental taking up a lot of bandwidth with no additional changes

2012-01-20 Thread Steve Willoughby
On 20-Jan-12 12:07, Jeffrey J. Kosowsky wrote:
 Stefan Peter wrote at about 21:00:22 +0100 on Friday, January 20, 2012:
 On 01/20/2012 08:49 PM, Jeffrey J. Kosowsky wrote:
   In summary, I think you are trying to solve a problem that may not
   need to be solved, using a tool that is not meant to solve it, without
   understanding what is causing your problems and without knowing how
   the tool actually works in the first place :)
   
 can I use this sentence for my own purposes or do you have a copyright
 on it?

 Sure use it to your hearts content... but if anyone asks you where it
 came from, I just made it up on the spur of the moment...

That's beautiful.  And applicable to so many situations I deal with.

-- 
Steve Willoughby / st...@alchemy.com
A ship in harbor is safe, but that is not what ships are built for.
PGP Fingerprint 4615 3CCE 0F29 AE6C 8FF4 CA01 73FE 997A 765D 696C



Re: [BackupPC-users] OT: NAS GUI for native Linux (preferably RHEL)

2012-01-16 Thread Steve Willoughby
On 16-Jan-12 14:31, Timothy J Massey wrote:
 Honest answer? My prejudice against non-Linux UNIX, especially with
 something as important as backup. I don't want to run into subtle issues

Fair enough, as long as you admit it's prejudice on your part.  There's 
a lot of history of Unix being proven stable in server applications for 
a lot longer than Linux has been around.  You have to set up and 
maintain the systems, though, so go with what you're comfortable with 
and know best how to use.

 that won't show themselves until I really, really need those backups...
 (That red text right up near the top of your link? *That* is what I'm
 talking about...)

Yeah, but don't rely on ANY platform to just work.  If you need the 
backups, always get in the habit of verifying them on a regular basis.

-- 
Steve Willoughby / st...@alchemy.com
A ship in harbor is safe, but that is not what ships are built for.
PGP Fingerprint 4615 3CCE 0F29 AE6C 8FF4 CA01 73FE 997A 765D 696C



Re: [BackupPC-users] The infamous backuppc_link got error -4

2012-01-06 Thread Steve
If you haven't stumbled on it yet, read this:
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Change_archive_directory

Particularly if you have a version prior to 3.2, look in the section
Changing the name of the archive directory.

A.
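Underlying all of this: error -4 typically means backuppc_link could not hardlink into the pool, and hardlinks never cross filesystems, which is exactly the condition BackupPC probes at startup. A minimal sketch, using throwaway temp directories in place of the real TopDir and cpool (all paths and names here are illustrative):

```shell
# Reproduce BackupPC's startup probe: hardlinks only work within one
# filesystem. Temp dirs stand in for the real pc/ and cpool/ locations.
topdir=$(mktemp -d)    # stand-in for e.g. /mnt/sdb1/pc
cpool=$(mktemp -d)     # stand-in for e.g. /home/users/backuppc/data/cpool

dev_pc=$(stat -c %d "$topdir")      # device number of each filesystem
dev_cpool=$(stat -c %d "$cpool")
echo "pc device: $dev_pc, cpool device: $dev_cpool"

# The same kind of test BackupPC performs: make a file, then hardlink it.
touch "$topdir/probe"
ln "$topdir/probe" "$cpool/probe" && echo "test hardlink ok"
```

If the two device numbers differ, the ln fails with "Invalid cross-device link", so the fix is to keep the pc tree and the pool on one filesystem (a symlinked TopDir only works if everything under it still resolves to the same filesystem).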

On Fri, Jan 6, 2012 at 10:59 AM, Jim Durand jdur...@hrsg.ca wrote:
 Hey guys! Doing my best to get up to speed with Backuppc, pretty impressive
 so far. Logs are filled with “backuppc_link got error -4” errors, and from
 the research I have done it seems that because my TopDir (/mnt/sdb1) and the
 cpool location (/home/users/backuppc/data/cpool) are on different
 filesystems, hard links will obviously fail. Is the answer to soft link “ln
 -s /mnt/sdb1 /home/users/backuppc/topdir” and change TopDir in config.pl to
 the “/home/users/backuppc/topdir”? It can’t be that easy, right?





 Thanks!

 Jim






-- 
The universe is probably littered with the one-planet graves of
cultures which made the sensible economic decision that there's no
good reason to go into space--each discovered, studied, and remembered
by the ones who made the irrational decision. - Randall Munroe



Re: [BackupPC-users] The infamous backuppc_link got error -4

2012-01-06 Thread Steve
On Fri, Jan 6, 2012 at 11:52 AM, Steve lepe...@gmail.com wrote:
 If you haven't stumbled on it yet, read this:
 http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Change_archive_directory

 Particularly if you have a version prior to 3.2, look in the section
 Changing the name of the archive directory.

 A.

Apologies about the top-post, that was a goofup on my part. It was in reply
to Jim's question below:

 On Fri, Jan 6, 2012 at 10:59 AM, Jim Durand jdur...@hrsg.ca wrote:
 Hey guys! Doing my best to get up to speed with Backuppc, pretty impressive
 so far. Logs are filled with “backuppc_link got error -4” errors, and from
 the research I have done it seems that because my TopDir (/mnt/sdb1) and the
 cpool location (/home/users/backuppc/data/cpool) are on different
 filesystems, hard links will obviously fail. Is the answer to soft link “ln
 -s /mnt/sdb1 /home/users/backuppc/topdir” and change TopDir in config.pl to
 the “/home/users/backuppc/topdir”? It can’t be that easy, right?

A.



Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Steve
On Fri, Dec 16, 2011 at 4:42 AM, Jean Spirat jean.spi...@squirk.org wrote:
 The issue is that we are now at 100GB of data
 and 8.030.000 files, so the backups take 48H and more (to help, the files
 are on an NFS share). I think I have come to the point where file backup is
 at its limit.

What about a script on that machine that uses tar to pack all these
little files (or some, or groups of them) into a few bigger archives,
stored in a separate directory?  Run your script a few times a day,
exclude the directories with gazillions of files, and back up the
directory you created that holds the tar archives.
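As a rough illustration of that idea (all paths here are scratch temp directories, not real BackupPC locations, and the file names are made up), the aggregation script could look like:

```shell
# Pack a directory full of many small files into one tar archive, and
# point BackupPC at the archive directory instead. Scratch paths only.
src=$(mktemp -d)        # the tree with gazillions of small files
archives=$(mktemp -d)   # the directory BackupPC would actually back up

for i in 1 2 3 4 5; do
    echo "data $i" > "$src/file$i.txt"
done

# One larger file instead of millions of small ones; run this from cron
# a few times a day.
tar -czf "$archives/smallfiles.tar.gz" -C "$src" .
tar -tzf "$archives/smallfiles.tar.gz" | grep -c 'file[0-9]\.txt'   # prints 5
```

The trade-off is restore granularity: you get back a whole archive, not a single small file, so group the tars by directory if you expect partial restores.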

Steve

-- 
The universe is probably littered with the one-planet graves of
cultures which made the sensible economic decision that there's no
good reason to go into space--each discovered, studied, and remembered
by the ones who made the irrational decision. - Randall Munroe



Re: [BackupPC-users] Upgrade on Ubuntu

2011-12-10 Thread Steve
On Sat, Dec 10, 2011 at 7:06 AM, Arnold Krille arn...@arnoldarts.de wrote:
 And then update that 8.04 to 10.04 to get new features, more security fixes 
 and
 less trouble when 12.04 arrives...

Honestly, once you get the current thing functional, get another
machine and clean-install everything yourself with the latest and
greatest LTS.  When that new install is up, running, and verified to
restore everything you want the right way, you can take the old one
offline (keep its archives as long as you need, though).

If BackupPC was this messed up, do you trust the rest of the machine?
I wouldn't.

A.



Re: [BackupPC-users] cpool always empty?

2011-11-21 Thread Steve
On Mon, Nov 21, 2011 at 3:45 PM, Bob Proulx b...@proulx.com wrote:
 Update: I just now found this page that I had not found previously
 even though I had looked for that type of page.  It doesn't seem to be
 very well linked into the upper documentation.

  http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Change_archive_directory

In Changing the name of the archive directory, the part about
changing Lib.pm is critical for the pre-3.2 version and easily missed.

A.
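For the pre-3.2 case, the Lib.pm change really is a one-line edit of the hardcoded archive path. A hedged sketch against a scratch file (the exact line and its location in your installed Lib.pm may differ, so check it first; the new path is an example):

```shell
# Simulate the pre-3.2 one-line Lib.pm edit on a scratch file; the real
# file lives in BackupPC's lib directory and its line may look different.
libpm=$(mktemp)
echo "    TopDir => '/var/lib/backuppc'," > "$libpm"

# Point the hardcoded TopDir at the new archive location (example path).
sed -i "s|'/var/lib/backuppc'|'/mnt/newdisk/backuppc'|" "$libpm"
grep TopDir "$libpm"   # shows the edited line
```

Remember to stop BackupPC before editing the real file, and keep a backup copy of the original Lib.pm.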



Re: [BackupPC-users] Exclusion not working with rsync

2011-11-17 Thread Steve M. Robbins
On Fri, Nov 18, 2011 at 12:08:23AM +0100, Holger Parplies wrote:

 Actually, it's
 
 $Conf{BackupFilesExclude} = {
   '/home' => [
   '/steve/Packages'
   ]
 };
 
 presuming you *only* want to exclude /home/steve/Packages and not [...]

Thanks -- that is indeed what I wanted.

-Steve




Re: [BackupPC-users] Tar exited with error 65280 () status

2011-11-02 Thread Steve
I've accidentally changed the universal backup settings instead of an
individual machine's before, when adding a new machine... I'd check that
first.  The new machine works fine, but some other host starts messing up
because it was still using the default settings, which I had messed up...
evets
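As an aside on the error itself: the 65280 in "Tar exited with error 65280" quoted below is a raw wait(2)-style status, not an exit code. The child's exit code is the high byte, and 255 is what ssh returns when the connection or remote command fails, which suggests the tar on the far end never ran at all:

```shell
# Decode a wait(2)-style status like the 65280 in the XferLOG.
status=65280
echo "exit code: $(( status >> 8 ))"    # prints 255 (ssh's generic failure code)
echo "signal:    $(( status & 127 ))"   # prints 0 (not killed by a signal)
```

So before touching the tar options, try the ssh command from the log by hand as the backuppc user and see why it fails.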

On Wed, Nov 2, 2011 at 9:57 AM, Joe Konecny jkone...@rmtohio.com wrote:
 I know people are tired of talking about this error from the responses I've 
 seen on google
 when researching it but it cropped up for me now.

 I have been running successfully for quite some time under Ubuntu 11.04 and 
 BackupPC 3.1.0.
 I back up an Ubuntu 11.04 server (named r4p17).

 Yesterday I added a windows client to using cygwin-rsync per the 
 documentation.  The backup
 worked fine.  I changed nothing on r4p17 and nothing on the backuppc server 
 configuration.
 Today I received an email...

 The following hosts had an error that is probably caused by a
 misconfiguration.  Please fix these hosts:
   - 10.0.0.8 (No files dumped for share /)

 Regards,
 PC Backup Genie

 ...and the error in the log is...

 Running: /usr/bin/ssh -q -x -n -l root 10.0.0.8 env LC_ALL=C /bin/tar -c -v 
 -f - -C / --totals --exclude=./sys
 --exclude=./proc .
 full backup started for directory /
 Xfer PIDs are now 9123,9122
 Tar exited with error 65280 () status
 tarExtract: Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 
 filesTotal, 0 sizeTotal
 Got fatal error during xfer (No files dumped for share /)
 Backup aborted (No files dumped for share /)
 Not saving this as a partial backup since it has fewer files than the prior 
 one (got 0 and 0 files versus 0)

 ...I'm not sure how adding a windows client affected backing up r4p17 and I'm 
 not totally sure that is the
 reason but I am suspicious.  Can anyone lend some insight on what to check?





-- 
The universe is probably littered with the one-planet graves of
cultures which made the sensible economic decision that there's no
good reason to go into space--each discovered, studied, and remembered
by the ones who made the irrational decision. - Randall Munroe



Re: [BackupPC-users] How to choose files to backup?

2011-10-30 Thread Steve M. Robbins
On Sun, Oct 30, 2011 at 12:03:52AM -0500, John Smith wrote:
 I have Backuppc running on Debian Lenny.  I can see that backuppc is backing 
 up /etc to /var/lib/backuppc
 but where in the configuration is /etc set as the directory to be backed up? 

The configuration is all in /etc/backuppc.

You can point your web browser to http://localhost/backuppc/ to
view/edit the config.

-Steve





[BackupPC-users] File Counts do not add up

2011-10-29 Thread Steve M. Robbins
Hi,

I'm puzzled by the File Size/Count Reuse Summary section of the web
interface.

This table includes Total Number of Files, Number of Existing
Files, and Number of New Files.  I had expected the first to be the
sum of the second two, but that's not the case.  The sum of Existing
and New files always exceeds Total Number of Files.

For example, a recent incremental claims:
Total: 680
Existing: 503
New: 258

How is this possible?
-Steve




Re: [BackupPC-users] Exclusion not working with rsync

2011-10-27 Thread Steve M. Robbins
On Wed, Oct 26, 2011 at 07:49:48PM -0500, Steve M. Robbins wrote:

 I'm backing up my local machine using rsync; see configuration below.
 Despite the exclusion, I still get /home/steve/Packages in my backup.
 I tried also '/home/steve/Packages/*' with the same result.  What's
 the magic?

Well, experimentation indicates that I need to strip off
the prefix; i.e. change from

 $Conf{BackupFilesExclude} = {
   '/home' => [
 '/home/steve/Packages'
   ]
 };

to

 $Conf{BackupFilesExclude} = {
   '/home' => [
 'steve/Packages'
   ]
 };


Regards,
-Steve




[BackupPC-users] Exclusion not working with rsync

2011-10-26 Thread Steve M. Robbins
Hi,

I'm backing up my local machine using rsync; see configuration below.
Despite the exclusion, I still get /home/steve/Packages in my backup.
I tried also '/home/steve/Packages/*' with the same result.  What's
the magic?


#
# Local server backup of /etc as user backuppc
#
$Conf{XferMethod} = 'rsync';

$Conf{RsyncShareName} = [
  '/etc',
  '/boot',
  '/home',
  '/var/mail',
  '/var/www',
  '/sound+vision'
];

$Conf{BackupFilesExclude} = {
  '/home' => [
'/home/steve/Packages'
  ]
};


Thanks,
-Steve




Re: [BackupPC-users] Unattended off-site replication

2011-10-25 Thread Steve M. Robbins
On Mon, Oct 24, 2011 at 11:21:26PM -0500, Les Mikesell wrote:
 On Mon, Oct 24, 2011 at 10:19 PM, Steve M. Robbins st...@sumost.ca wrote:
 
  I know many people have discussed how to achieve an offsite archive of
  backuppc pool.  During a discussion last February [1], Timothy Massey [2]
  and Jeffrey Kosowsky [3] summarized the options as follows:
 
  1) Run two BackupPC servers and have both back up the hosts
    directly.  No replication at all:  it just works.
  2) Use some sort of block-based method of replicating the data
  3) Scripts that understand the special structure of the pool and pc
    trees and efficiently create lists of all hard links in pc directory.
 
  I'll be replicating over a thin residential ISP connection (rules out
  option #1)
 
 Unless you have several hosts that hold duplicate data, after you get
 the initial fulls option #1 with rysnc transport over ssh or a vpn
 with compression enabled won't be moving more data than other ways you
 might attempt it.

At the risk of exposing my ignorance of BackupPC internals, I don't see
how this is possible.  For a full back-up, isn't it true that all the
files are transferred to the backup host, then compared to the pool?

One host of mine has 1.4M files totalling 550 GB, but the last full
backup recorded 1400 new files totalling 57GB.  Thus option #1 would
transfer all 550 GB, whereas my proposal would transfer a tenth of
that.  No?

-Steve





[BackupPC-users] Unattended off-site replication

2011-10-24 Thread Steve M. Robbins
Hi,

I know many people have discussed how to achieve an offsite archive of the
backuppc pool.  During a discussion last February [1], Timothy Massey [2]
and Jeffrey Kosowsky [3] summarized the options as follows:

1) Run two BackupPC servers and have both back up the hosts 
   directly.  No replication at all:  it just works.
2) Use some sort of block-based method of replicating the data
3) Scripts that understand the special structure of the pool and pc
   trees and efficiently create lists of all hard links in pc directory.

I'll be replicating over a thin residential ISP connection (rules out
option #1) and I want it to be completely unattended (no option #2).
As for Option #3, I tried J. Kosowsky's script BackupPC_copyPcPool but
stopped it after 12 hours without completing.

One thing that all these methods have in common is that they scan the 
entire pool filesystem.  I accept that I will have to do that at
least initially.  However, to send daily updates, it seems unnecessary
to re-scan the filesytem again when backuppc itself already computes the
information needed:

* the set of files added to the pool
* the set of hardlinks in __TOPDIR__/pc/$host/$backup
* the set of files expired

It strikes me that backuppc could be taught to write all this out to
one or more journal files that could be replayed on the remote system
after the new files are transferred.
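In toy form, replaying such a journal could be as simple as the sketch below. Temp directories stand in for the local pool and the remote copy; a real version would also have to carry the pc-tree hardlinks and expirations, which this ignores:

```shell
# Toy journal replay: the sender logs new pool files, the receiver
# imports exactly those files. All paths are illustrative scratch dirs.
pool=$(mktemp -d); remote=$(mktemp -d)
mkdir -p "$pool/cpool"
echo "chunk-a" > "$pool/cpool/a"
echo "chunk-b" > "$pool/cpool/b"

# Journal of files added since the last sync, relative to the pool root.
journal=$(mktemp)
printf 'cpool/a\ncpool/b\n' > "$journal"

# Ship only the journaled files; tar over a pipe stands in for the WAN
# transport (in practice this would run over ssh).
tar -cf - -C "$pool" -T "$journal" | tar -xf - -C "$remote"
ls "$remote/cpool"
```

The hard part the sketch skips is exactly what you identify: the hardlink set in pc/ and the expiry deletions, which is why the journal would need to come from BackupPC itself rather than from a filesystem scan.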

Does this make sense?  Has anyone investigated this approach?

Thanks,
-Steve


[1] 
http://www.mail-archive.com/backuppc-users@lists.sourceforge.net/msg20839.html
[2] 
http://www.mail-archive.com/backuppc-users@lists.sourceforge.net/msg20853.html
[3] 
http://www.mail-archive.com/backuppc-users@lists.sourceforge.net/msg20854.html




Re: [BackupPC-users] Search for File

2011-09-28 Thread Steve
Don't know if it's faster than your way or not, but I've used:
find . -type f -name '*thing_i_want'
note you can use wildcards; quote the pattern so the shell doesn't expand it first...
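One wrinkle when searching inside a BackupPC pc tree: stored names are mangled with a leading f on every path component, so the pattern needs it too. A throwaway sketch with a fake tree (directory and file names are illustrative):

```shell
# Mimic BackupPC's name mangling: every path component in the pc tree
# gets an f prefix. Fake directory and file names, purely illustrative.
pcdir=$(mktemp -d)
mkdir -p "$pcdir/fhome/fsteve"
touch "$pcdir/fhome/fsteve/freport.odt" "$pcdir/fhome/fsteve/attrib"

# To find report.odt: add the leading f, and quote the pattern so the
# shell doesn't expand the wildcards before find sees them.
find "$pcdir" -type f -name 'f*report*'
```

The attrib files sitting alongside the real files are BackupPC metadata, so a plain `find -type f` without the f-prefixed pattern will match those too.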

a.

On Wed, Sep 28, 2011 at 10:52 AM, Gerald Brandt g...@majentis.com wrote:

 Hi Tim,

 That's basically what I did, but I have a couple of BackupPC users that have 
 no clue about command line stuff, so I was hoping for a BackupPC web based 
 solution.

 Gerald


 

 From: Timothy J Massey tmas...@obscorp.com
 To: General list for user discussion, questions and support 
 backuppc-users@lists.sourceforge.net
 Sent: Wednesday, September 28, 2011 9:30:18 AM
 Subject: Re: [BackupPC-users] Search for File

 Gerald Brandt g...@majentis.com wrote on 09/28/2011 10:15:12 AM:

  I need to search for a specific file on a host, via backuppc.  Is
  there a way to search a host backup, so I don't have to manually go
  through all directories via the web interface?

 The easiest, most direct way of doing that would be:

 cd /path/to/host/pc/directory
 find . | grep ffilename

 I'm sure someone with more shell-fu will give you a much better command line 
 (and I look forward to learning something!).  I'm sure there's a way to do it 
 simply with the find command alone, but I've had limited success trying to 
 limit the find command to find specific files.  For me, it's easier to use 
 grep as above.  My way will work, if a bit slowly:  there's lots of files in 
 there...

 Don't forget the leading f in the filename:  BackupPC puts an f in front of 
 every filename in the directory structure.

 Tim Massey

 Out of the Box Solutions, Inc.
 Creative IT Solutions Made Simple!
 http://www.OutOfTheBoxSolutions.com
 tmas...@obscorp.com       22108 Harper Ave.
 St. Clair Shores, MI 48080
 Office: (800)750-4OBS (4627)
 Cell: (586)945-8796







--
The universe is probably littered with the one-planet graves of
cultures which made the sensible economic decision that there's no
good reason to go into space--each discovered, studied, and remembered
by the ones who made the irrational decision. - Randall Munroe



Re: [BackupPC-users] Too many links : where is the problem ?

2011-08-08 Thread Steve
On Mon, Aug 8, 2011 at 6:39 PM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:
 Finally, out of curiosity, I grepped the BackupPC code base for the
 error language too many links cited verbatim by the OP and found that such
 a phrase only occurs in the comments and hence is not even a valid
 error code...

That error is almost certainly directly from rsync, not backuppc.  I
saw it many times trying to use rsync to duplicate backuppc.  I never
figured out what caused it, since I never exceeded the number of
links.  I just gave up duplicating that way :)

Evets


-- 
The universe is probably littered with the one-planet graves of
cultures which made the sensible economic decision that there's no
good reason to go into space--each discovered, studied, and remembered
by the ones who made the irrational decision. - Randall Munroe



Re: [BackupPC-users] serial or parallel backups

2011-06-26 Thread Steve
It just depends on the particular setup and what the most limiting
factor is.  Sounds like for you it's bandwidth, but that may not always
be true, even for you, depending on how much your data changes from
backup to backup and things like that.

I would argue that if you have so few machines that you can run them
one-at-a-time, you should, and why not?  It's clearly faster for you
and in addition, keeps the server less loaded and the network less
loaded.  What possible advantage could there be to run them together?

E.

On Sun, Jun 26, 2011 at 4:48 PM, Chris Baker cba...@intera.com wrote:
 I have been wondering about this for a while. Am I better off having
 backups run parallel or in series?

 By running in series, I mean one backup runs at a time. When it finishes,
 another one starts.

 By running parallel, I mean that several backups run at once. It seems
 that when backups have to fight over bandwidth, they all end up running
 much more slowly. I have it set up to run four backups at once.

 A server that rarely runs more than one backup has achieved throughput as
 high as 24.13 MB/sec. However, the server with four backups has a maximum
 of only 5.71 MB/sec. Bottom line, the four when added up still don't get
 as good a throughput as the single backup.

 What does everyone here think?

 Chris Baker
 cba...@intera.com
 512-425-2006





-- 
It turns out there is considerable overlap between the smartest bears
and the dumbest tourists.



Re: [BackupPC-users] [newb] ssh rsync with restricted permissions

2011-04-05 Thread Steve
I'm deliberately top-posting to ask: did you set up everything the
standard way and get it working?  If not, try that first and then
start changing things.  The suggestion below may simply be
failing due to some other setup issue, not the security issue that
concerns you.  And I am not expert enough to diagnose much at all, and
certainly not a non-standard setup :)

 Can really nobody help me out, or should I start a new subject?

Uh, there were 6-7 suggestions/replies.  We're trying.

A.

On Wed, Mar 30, 2011 at 5:45 PM, yilam backuppc-fo...@backupcentral.com wrote:
 Well I tried your setup (need I say I am new to backuppc?) with on the client:

 * /etc/sudoers:
 Cmnd_Alias      BACKUP = /usr/bin/rsync --server --daemon *
 buclient          my-host = NOPASSWD: BACKUP

 * ~buclient/.ssh/authorized_keys2
 no-pty,no-agent-forwarding,no-X11-forwarding,no-port-forwarding,command=sudo 
 /usr/bin/rsync --server --daemon --config=/etc/rsyncd.conf . ssh-rsa 
 B

 * /etc/rsyncd.conf
 uid = root
 pid file = /var/lib/buclient/run/rsyncd.pid
 use chroot = no
 read only = true
 transfer logging = true
 log format = %h %o %f %l %b
 syslog facility = local5
 log file = /var/lib/buclient/log/rsyncd.log
 [fullbackup]
        path = /var/log/exim4
        comment = backup

 From the server (backuppc machine), I can do the following:

 /usr/bin/rsync -v -a -e /usr/bin/ssh -v -q -x -2 -l buclient -i 
 /var/lib/backuppc/.ssh/id_rsa backuppc@192.168.1.1::fullbackup /tmp/TEST

 However, I have not found the correct $RsyncClientCmd to use, for backuppc to 
 work. The following value
 $Conf{RsyncClientCmd} = '$sshPath -q -x -l buclient -i 
 /var/lib/backuppc/.ssh/id_rsa.backuppc_casiopei $host $rsyncPath $argList+';

 Gives me (using /usr/share/backuppc/bin/BackupPC_dump -v -f 192.168.1.1):
 [...]
 full backup started for directory fullbackup
 started full dump, share=fullbackup
 Error connecting to rsync daemon at 192.168.1.1:22: unexpected response 
 SSH-2.0-OpenSSH_5.1p1 Debian-5

 Got fatal error during xfer (unexpected response SSH-2.0-OpenSSH_5.1p1 
 Debian-5
 )
 [...]

 And on the client, I have, in /var/log/auth.log:
 Mar 30 23:35:22 my-host sshd[1389]: Bad protocol version identification 
 '@RSYNCD: 28' from 192.168.1.22

 Any ideas on how to get this to work (BTW, server is Debian/Squeeze, client 
 is Debian/Lenny).

 Thank you

 tom

 +--
 |This was sent by sneak...@gmx.net via Backup Central.
 |Forward SPAM to ab...@backupcentral.com.
 +--







-- 
It turns out there is considerable overlap between the smartest bears
and the dumbest tourists.



Re: [BackupPC-users] CPOOL, PC directories and backuppc statistics generation -- moving cpool possibly?

2011-03-18 Thread Steve
On Fri, Mar 18, 2011 at 1:37 PM, Les Mikesell lesmikes...@gmail.com wrote:
 On 3/18/2011 12:06 PM, Scott wrote:
 I am trying to figure out why my backuppc statistics are not generating
 (everything is basically all 0's in the status page -- see below).

 What version are you running?  I thought that before 3.2 you had to
 either reinstall from the sourceforge tarball (where a part of the
 install script lets you set the archive location where you want it) or
 you had to either mount or symlink a new partition into the old location
 (/var/lib/backuppc, for distribution-packaged versions).  There are some
 instructions for that here:
 http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Change_archive_directory

There is a way to edit the Perl on the old systems; it just takes a
one-line edit, and that will fix everything too. See the section
"Changing the name of the archive directory" at this link:
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Change_archive_directory#Changing_the_name_of_the_archive_directory

I thought this was fixed in new releases so the Perl did not need to
be edited, but I am still running the old version.
steve
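
[Editor's note: a hedged sketch of the one-line edit that wiki page
describes for BackupPC 3.x. The file path and surrounding code vary by
version and distro package; treat the fragment below as illustrative,
not as the exact upstream source.]

```perl
# Hedged sketch: in BackupPC 3.x the top-level archive directory is
# hardcoded at install time in lib/BackupPC/Lib.pm, roughly:
#
#     TopDir => '/var/lib/backuppc',
#
# and the one-line edit changes it in place to the new location, e.g.:
#
#     TopDir => '/mnt/backups/backuppc',
#
# Stop BackupPC before editing. The new location must be a single POSIX
# filesystem: at startup BackupPC creates a test hardlink between pc/
# and cpool/, which is exactly the check that fails across mount points.
```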



Re: [BackupPC-users] speed up backups

2010-05-26 Thread Steve
It's in the documentation: click on "Documentation" over on the left of
the web interface and search for "checksum";
the section is "Rsync checksum caching".
evets
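
[Editor's note: for reference, a hedged sketch of the setting that
documentation section describes. The seed value 32761 is the one given
in the BackupPC docs; the "existing arguments" placeholders stand in
for whatever is already in the lists.]

```perl
# Hedged sketch, per the "Rsync checksum caching" section of the
# BackupPC docs: adding a fixed checksum seed lets BackupPC cache block
# and file checksums, so full backups after the first two skip
# re-reading unchanged pool files.
$Conf{RsyncArgs} = [
    # ... existing arguments ...
    '--checksum-seed=32761',
];
$Conf{RsyncRestoreArgs} = [
    # ... existing arguments ...
    '--checksum-seed=32761',
];
```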

On Wed, May 26, 2010 at 9:34 AM, Sorin Srbu sorin.s...@orgfarm.uu.se wrote:
-Original Message-
From: Les Mikesell [mailto:lesmikes...@gmail.com]
Sent: Wednesday, May 26, 2010 2:55 PM
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] speed up backups

After the 1st 2 fulls, rsync should be better if you have enabled checksum
caching.  You do need plenty of RAM to hold the directory listing if you
 have a
large number of files.

 That was the checksum= 31thousandsomething to be added somewhere. I need to
 find that mail in the archives...

 --
 /Sorin








Re: [BackupPC-users] Syntax for excluding .gvfs

2010-05-13 Thread Steve Blackwell
On Wed, 12 May 2010 12:36:47 -0400
Steve Blackwell zep...@cfl.rr.com wrote:

 On Wed, 12 May 2010 10:48:03 -0500
 Les Mikesell lesmikes...@gmail.com wrote:
 
 
  A quick fix might be to add the --one-file-system option to the tar 
  command so it will ignore all mount points.  I always do that anyway
  to avoid picking up ad-hoc DVD/USB or network mounts that might
  happen to be active, but then you have to be sure you explicitly add
  every filesystem that you do want to back up.
  
 
 I have a dual boot system and I keep all my music on the Windows side
 so that I can access it from either. So I want to back up the Windows
 stuff too. As I understand it the .gvfs directory has to do with fuse
 filesystems, ie Windows in my case. The Windows drive is mounted
 on /mnt. If I use the --one-file-system option and add /mnt/c_drive to
 my TarShareName, am I not going to have the same problem?
 
 I guess I'll try it and see.
No luck. Ignoring the fact I didn't get all the shares/excludes right, I
still get the .gvfs error.

Running: /usr/bin/sudo /bin/tar -c -v -f - -C / --totals
--one-file-system --newer=2010-05-06 00:00:09 --exclude=./proc
--exclude=./sys --exclude=./tmp --exclude=./media --exclude=./selinux
--exclude=./misc --exclude=./home/*/.gvfs . 
incr backup started back to 2010-05-06 00:00:09 (backup #284) for
directory / 
Xfer PIDs are now 29228,29227 
/bin/tar: ./boot/: file is on a different filesystem; not dumped 
[ skipped 6882 lines ] 
/bin/tar: ./dev/: file is on a different filesystem; not dumped
[ skipped 134 lines ] 
/bin/tar: ./home/steve/.gvfs: Cannot stat: Permission denied 
[ skipped 4481 lines ] 
/bin/tar: ./mnt/c_drive/: file is on a different filesystem; not dumped
[ skipped 27693 lines ] 
/bin/tar: ./net/: file is on a different filesystem; not dumped
[ skipped 304 lines ] 
/bin/tar: Exiting with failure status due to previous errors Tar exited
with error 512 () status 
[ skipped 32 lines ] 
tarExtract: Done: 0 errors, 3843 filesExist, 18817093 sizeExist,
3858427 sizeExistComp, 10576 filesTotal, 7797716344 sizeTotal 
Got fatal error during xfer (Tar exited with error 512 () status)
Backup aborted (Tar exited with error 512 () status)
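
[Editor's note: the log shows the expected interaction: with
--one-file-system, /mnt/c_drive is a mount point and gets skipped, so
each filesystem to keep must be its own share. A hedged config sketch,
reusing the paths from this thread:]

```perl
# Hedged sketch: each share gets its own tar invocation, so with
# --one-file-system every mount point to back up must be listed
# explicitly, and excludes are keyed per share.
$Conf{TarShareName} = [ '/', '/mnt/c_drive' ];
$Conf{BackupFilesExclude} = {
    '/'            => [ '/proc', '/sys', '/tmp', '/media',
                        '/selinux', '/misc', '/home/*/.gvfs' ],
    '/mnt/c_drive' => [],   # back up everything on the Windows side
};
```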



Re: [BackupPC-users] Syntax for excluding .gvfs

2010-05-12 Thread Steve Blackwell
On Thu, 6 May 2010 17:38:46 -0400
Steve Blackwell zep...@cfl.rr.com wrote:

 I'm running F11 and I keep getting an error on my backups. I've
 tracked this down to a file, .gvfs, in users home directories. From
 the log file:
 
 /bin/tar: ./.gvfs: Cannot stat: Permission denied
 
 I use the web interface to edit the configuration but I cannot find
 the correct syntax to exclude these files. The TarShareName is / and
 I've tried the following for BackupFilesExclude
 
 .gvfs
 *.gvfs
 /home/*/.gvfs
 
 none of which work. How have others overcome this? And, yes, I do have
 the Override box checked.
 
 Thanks,
 Steve

I still don't have a solution for this, and because of it all my backups
fail. Now I think my HD may be on its way out because of errors I'm
seeing in logwatch:
 /dev/sdb :
Prefailure: Seek_Error_Rate (7) changed to 
  200, 100, 200, 100, 

I did learn that you can't use an unquoted * in the exclude list.

I really need help. For those who did not read the original thread, I'm
using

BackupPC-3.1.0-9.fc11.noarch

and my computer.pl file looks like this:

$Conf{TarClientCmd} = '/usr/bin/sudo $tarPath -c -v -f - -C $shareName
--totals'; 
$Conf{TarFullArgs} = '$fileList';
$Conf{TarClientRestoreCmd} = '/usr/bin/sudo $tarPath -x -p
--numeric-owner --same-owner -v -f - -C $shareName';
$Conf{TarClientPath} = '/bin/tar'; 
$Conf{BackupFilesExclude} = {
  '/' => [
'/proc',
'/sys',
'/tmp',
'/media',
'/selinux',
'/misc',
'/home/*/.gvfs'
  ]
};
$Conf{TarIncrArgs} = '--newer=$incrDate $fileList';
$Conf{TarShareName} = [
  '/'
];
$Conf{BlackoutPeriods} = [
  {
    'hourEnd' => '23.5',
    'weekDays' => [
      '0',
      '1',
      '2',
      '3',
      '4',
      '5',
      '6'
    ],
    'hourBegin' => '8.5'
  }
];

Thanks,
Steve



Re: [BackupPC-users] Syntax for excluding .gvfs

2010-05-12 Thread Steve Blackwell
On Wed, 12 May 2010 09:29:32 -0400
Bowie Bailey bowie_bai...@buc.com wrote:


 
 What version of tar do you have?
 
 From the manual:
 Note that GNU tar version >= 1.13.7 is required for the exclude
 option to work correctly.
 
[st...@steve ~]$ tar --version
tar (GNU tar) 1.22
...



Re: [BackupPC-users] exclude not working

2010-05-12 Thread Steve Blackwell
On Wed, 12 May 2010 10:01:15 -0400
Mark Maciolek m...@sr.unh.edu wrote:

 Hi,
 
 Backuppc 3.1 using rsync
 
 $Conf{RsyncShareName} = [
    '/raid1'
 ];
 
 $Conf{BackupFilesExclude} = {
    '*' => [
      '/raid1/*/tilecache',
      '/raid1/temp',
      '/raid1/osmplanet'
    ]
 };
 
 
 
 NewFileList still shows
   2a85606be4fcc25d294720dfb67d183b 3657 
 f%2fraid1/fcaribbean/ftilecache/fNOAA_ENCT/f14/f000/f005/f746/f000/f009/f048.png
 2a85606be4fcc25d294720dfb67d183b 3657 
 f%2fraid1/fcaribbean/ftilecache/fNOAA_ENCT
 
 
 Do I need to specify /raid1/caribbean/tilecache ?
 
 Mark
Hi Mark,

I, too, have been having problems with exclude. See the (long) thread
titled Syntax for excluding .gvfs. 

What I discovered is that if you use the wildcard, *, it must be quoted,
otherwise all the excludes are ignored: the unquoted raid1/*/tilecache
should be written as 'raid1/*/tilecache'.

Also, though I'm not 100% sure about this, 

$Conf{BackupFilesExclude} = {
    '*' => [
...

should be 

$Conf{BackupFilesExclude} = {
    '/raid1' => [
...

HTH
Steve
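
[Editor's note: the quoting point can be checked with plain GNU tar
outside BackupPC; the paths below are made up for the demo.]

```shell
# Stand-alone check of the exclude pattern with GNU tar. The single
# quotes keep the shell from expanding the *, so tar receives the glob
# itself and performs the match against archive member names.
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/raid1/caribbean/tilecache" "$tmp/raid1/caribbean/data"
touch "$tmp/raid1/caribbean/tilecache/t.png" "$tmp/raid1/caribbean/data/d.txt"
tar -cf "$tmp/backup.tar" -C "$tmp" --exclude='raid1/*/tilecache' raid1
tar -tf "$tmp/backup.tar" | grep -q tilecache \
    && echo "tilecache leaked" || echo "tilecache excluded"
rm -rf "$tmp"
# prints: tilecache excluded
```

Without the quotes, the shell expands the glob relative to the current
directory (usually to nothing useful), and the intended pattern never
reaches tar at all.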



Re: [BackupPC-users] Syntax for excluding .gvfs

2010-05-12 Thread Steve Blackwell
On Wed, 12 May 2010 10:48:03 -0500
Les Mikesell lesmikes...@gmail.com wrote:


 A quick fix might be to add the --one-file-system option to the tar 
 command so it will ignore all mount points.  I always do that anyway
 to avoid picking up ad-hoc DVD/USB or network mounts that might
 happen to be active, but then you have to be sure you explicitly add
 every filesystem that you do want to back up.
 

I have a dual boot system and I keep all my music on the Windows side
so that I can access it from either. So I want to back up the Windows
stuff too. As I understand it the .gvfs directory has to do with fuse
filesystems, ie Windows in my case. The Windows drive is mounted
on /mnt. If I use the --one-file-system option and add /mnt/c_drive to
my TarShareName, am I not going to have the same problem?

I guess I'll try it and see.

Thanks, 
Steve



Re: [BackupPC-users] Syntax for excluding .gvfs

2010-05-09 Thread Steve Blackwell
On Sat, 8 May 2010 23:26:40 +0100
Luis Paulo luis.bar...@gmail.com wrote:

 On Sat, May 8, 2010 at 7:49 PM, Steve Blackwell zep...@cfl.rr.com
 wrote:
  On Sat, 8 May 2010 17:35:47 +0100
 [..]
  Contents of file /media/disk/pc/steve/XferLOG.bad.z, modified
  2010-05-08 14:41:36
 
  Running: /usr/bin/sudo /bin/tar -c -v -f - -C /home/steve --totals
  --newer=2010-05-06 00:00:09 . incr backup started back to 2010-05-06
  00:00:09 (backup #284) for directory /home/steve Xfer PIDs are now
  30316,30315 /bin/tar: ./.gvfs: Cannot stat: Permission denied
  ...
  rest of the transfer log.
 [...]
 
 Well, something is not right

I won't disagree with you there!

 You should have an --exclude=./.gvfs on your Running: line
 as in (for /proc)
 Running: /usr/bin/env LC_ALL=C sudo /bin/tar -c -v -f - -C / --totals
 --newer=2010-05-07 04:00:03 --exclude=./proc/* ...

As I understand it, the variable $fileList is created based on the
includes and excludes or more specifically based on BackupFilesOnly or
BackupFilesExclude.

TarFullArgs is set to $fileList and 
TarIncrArgs is set to --newer=$incrDate $fileList 

and so you won't see a --exclude switch in the log. 

Someone, please correct me if I'm wrong.

Steve


