Re: [BackupPC-users] Backing up /home from a cPanel Server

2020-02-26 Thread Nick Bright
I know this isn't very helpful, but it's a data point: my cPanel servers 
back up /home without any special configuration in either BackupPC or 
the cPanel server.


On 2/20/2020 3:07 PM, Way3.com Info wrote:

Jeff,

Thanks but unfortunately, that did not work.  ☹

Any other suggestions?

Thanks!



-Original Message-
From: Jeffrey West 
Sent: Thursday, February 20, 2020 1:42 PM
To: i...@way3.com; General list for user discussion, questions and support 

Subject: RE: [BackupPC-users] Backing up /home from a cPanel Server

I am wondering if SELinux might be affecting the backup.  On the cPanel server
have you tried running

setenforce 0

Then try another backup and see if anything changes.   If not, then you
can turn it back on with

setenforce 1
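
If SELinux is the culprit, the denials should also show up in the audit
log; a quick check (assuming auditd is running and the log is in the
usual place) would be something like:

    grep -i denied /var/log/audit/audit.log | tail
    # or, with the audit tools installed:
    ausearch -m avc -ts recent

That way you can leave enforcement on and fix the policy instead of
running permissive.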

Jeff West
Systems Engineer
Riptide Software
Office 321-296-7724 ext 216
Direct 407-542-7697
Cell 407-925-7030
www.riptidesoftware.com

-Original Message-
From: Way3.com Info 
Sent: Thursday, February 20, 2020 2:38 PM
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] Backing up /home from a cPanel Server

My BackupPC application is running without issues.

I have recently added a new server to be backed up via BackupPC.  This new
server is a cPanel server.  All my other servers on BackupPC are not cPanel.

I am able to connect and back up all of the system files on the new
server, but it will not back up the contents of /home.

Here are the contents of the log file:

2020-02-20 13:35:21 incr backup started back to 2020-02-20 10:50:36 (backup
#2) for directory /
2020-02-20 13:38:34 incr backup started back to 2020-02-20 10:50:36 (backup
#2) for directory /home/
2020-02-20 13:38:34 incr backup 3 complete, 37 files, 2547998 bytes, 438 
xferErrs (0 bad files, 0 bad shares, 438 other)

I am using rsync to do the backup.
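
The per-file failures behind those 438 xferErrs should be recorded in the
XferLOG for that backup. Something along these lines - assuming $TopDir is
/var/lib/BackupPC and using "cpanel1" as a stand-in host name - would show
them:

    /usr/share/BackupPC/bin/BackupPC_zcat \
        /var/lib/BackupPC/pc/cpanel1/XferLOG.3.z | grep -iE 'err|denied|fail'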

Any suggestions?

Thanks!






--
Nick Bright - Vice President of Technology
P: 888.332.1616 | W: https://valnet.net | Facebook: https://www.facebook.com/ValnetISP/




Re: [BackupPC-users] Backup Server Hardware

2019-11-01 Thread Nick Bright
I run an HP DL160G6se with hardware RAID and 7200RPM LFF SAS, with dual 
power supplies & SSD for OS.


Backing up 34 linux hosts with a pool of 10.8TB prior to pooling & 
compression. It typically stays busy 24/7 between backups & pool 
cleanup, etc.


On 11/1/2019 9:17 AM, Greg Harris wrote:
How spec’d out do you make your backup server machines, as in the ones 
running your backup software to backup the production servers?  Are 
they barely above desktop class?  Software RAID or still require 
hardware RAID?  Still SAS drives or do you go SATA and expect to 
replace them?  Hot swappable or do you just expect to take down the 
machine?  Dual power supplies or just a single as its the backup server?


Thanks,

Greg Harris






--
Nick Bright
Vice President of Technology
520 W Sycamore, Independence, KS 67301
P: 888.332.1616 ext. 315
E: nick.bri...@valnet.net
W: https://valnet.net




Re: [BackupPC-users] offsite server

2018-04-10 Thread Nick Bright
I would agree. Backing up the backup doesn't help if the primary backup
fails in some of the ways described in this thread; a far better
solution (from a points-of-failure perspective) is to have two
independent backup systems backing up the client.


Just my $0.02; I'm not an expert in these things, just a normal user.

On 4/10/2018 9:35 AM, fr...@rams.colostate.edu wrote:
Depending on the size of your environment, it may be impractical to 
try to rsync the pool.


Our BackupPC server covers about 120 systems and uses >4TB (and 100M 
inodes) to do so. Such a file system would not rsync quickly to a 
remote location.


I believe the best thing to do is to have a local BackupPC server to
back up EVERYTHING (Prod, non-Prod, experimental, etc.)
and a remote BackupPC instance that independently backs up everything
you can't afford to lose (Prod).


--
Ray Frush | T: 970.491.5527
Colorado State University | IS | System Administrator
"Either you are part of the solution or part of the precipitate."


On Tue, 2018-04-10 at 10:06 +, Philip Parsons (Velindre - Medical 
Physics) wrote:


Dear list,

I’m sure this has been discussed previously, but I couldn’t see 
anything in recent archives that specifically related to v4 of BackupPC.


I am using BackupPC v4.1.5 to back up hospital data to an on-site
server.  We would, however, like to back up this data to another
off-site server.  Has anyone had any experience of this?


I was wondering if it would be a good idea to set up another instance 
of backuppc on the remote server, turn off all the backup functions, 
copy the config settings, rsync the pool and just have that instance 
as a restorative method (should something happen to our on-site 
copy).  Is this feasible?


I guess there are a number of ways that this could be achieved.
The cpool size is currently approximately 10TB.  Off-site network speed
is going to be pretty good (apologies for the vagueness here).


I’d be very interested in anyone’s thoughts, or experiences of 
setting up an off-site replication server with BackupPC v4.


Thanks,

Phil







--
Nick Bright
Vice President of Technology
Valnet -=- We Connect You -=-
Tel 888-332-1616 x 315 / Fax 620-331-0789
Web http://www.valnet.net/
Are your files safe? Valnet Vault - Secure Cloud Backup
More information & 30 day free trial at http://www.valnet.net/services/valnet-vault



Re: [BackupPC-users] Create Archive of old backup

2017-12-11 Thread Nick Bright

On 12/11/2017 12:49 PM, Gerald Brandt wrote:

Hi,

I have a request from Management to create an archive from a backup I 
did in 2015. Is there a way to archive an old backup?


Gerald
Through the web UI, you could "restore" the backup to a TAR file, and 
download that to your local machine. I don't know if that's the only 
way, but it's certainly a way to do it.
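
There is also a command-line route via BackupPC_tarCreate, which writes a
tar archive of a given backup to stdout. A sketch - the host name, backup
number, and install path here are examples, so adjust to taste:

    /usr/share/BackupPC/bin/BackupPC_tarCreate -h myhost -n 1234 -s / . \
        > /tmp/myhost-2015.tar

where -n is the number of the 2015 backup and -s is the share it was
taken from.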




Re: [BackupPC-users] Mark backup for permanent retention

2017-11-29 Thread Nick Bright

On 11/28/2017 11:03 AM, Alexander Kobel wrote:

On 11/28/2017 05:46 PM, Nick Bright wrote:

On 11/28/2017 10:38 AM, Nick Bright wrote:
Is there a way to mark a backup point (in this case, it's an 
incremental) so that the backup (and all backups it depends on) are 
permanently retained?


e.g. for a server that's failed or been decommissioned?

I may have already done so by disabling backups with 
$Conf{BackupsDisable} = 0;


"Disable all full and incremental backups. These settings are useful 
for a client that is no longer being backed up (eg: a retired 
machine), but you wish to keep the last backups available for 
browsing or restoring to other machines."


Sounds like that does what I'm looking for, and I just skimmed over 
the part about "keep the last backups available". I'm interpreting 
this as it'll retain all existing backups. Could anybody confirm?


Confirmed, except that 0 means "not disabled".

AFAIU, the host still participates in cleanup according to the usual 
settings (FullKeepCnt, FullKeepCntMin, IncrKeepCnt, IncrKeepCntMin), 
but since you certainly didn't set all of those to 0, you will be golden.



Cheers,
Alex

Ah yes, I copy/pasted from the documentation. I did in fact set the host
to "$Conf{BackupsDisable} = 2;" per the documentation, to "suspend all
backups".




Re: [BackupPC-users] Mark backup for permanent retention

2017-11-28 Thread Nick Bright

On 11/28/2017 10:38 AM, Nick Bright wrote:
Is there a way to mark a backup point (in this case, it's an 
incremental) so that the backup (and all backups it depends on) are 
permanently retained?


e.g. for a server that's failed or been decommissioned?

I may have already done so by disabling backups with 
$Conf{BackupsDisable} = 0;


"Disable all full and incremental backups. These settings are useful for 
a client that is no longer being backed up (eg: a retired machine), but 
you wish to keep the last backups available for browsing or restoring to 
other machines."


Sounds like that does what I'm looking for, and I just skimmed over the 
part about "keep the last backups available". I'm interpreting this as 
it'll retain all existing backups. Could anybody confirm?




[BackupPC-users] Mark backup for permanent retention

2017-11-28 Thread Nick Bright
Is there a way to mark a backup point (in this case, it's an 
incremental) so that the backup (and all backups it depends on) are 
permanently retained?


e.g. for a server that's failed or been decommissioned?



[BackupPC-users] Watching restore status

2017-11-21 Thread Nick Bright
Is there a way to watch the restore status? Like being able to tail -f 
an actual log file (not the LOG file in the web UI - that doesn't tell 
me anything).


I'm sitting here waiting for hours, not even knowing if the restore is
actually doing anything. I need to be able to see what the process is
doing. Is it working? Is it hung? Is it just taking a long time to
compile the rsync list?
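
For the archives, one partial answer: each restore writes a compressed
RestoreLOG under the host's pc directory, so - assuming $TopDir is
/var/lib/BackupPC and "myhost" as a stand-in - something like this can at
least confirm the restore is moving:

    watch -n 10 'ls -l /var/lib/BackupPC/pc/myhost/'
    /usr/share/BackupPC/bin/BackupPC_zcat \
        /var/lib/BackupPC/pc/myhost/RestoreLOG* | tail

A true tail -f isn't possible, though, since the log is compressed.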


Thanks,



Re: [BackupPC-users] Preparing pool for migration

2016-11-04 Thread Nick Bright

On 11/4/2016 6:05 PM, Nick Bright wrote:

On 11/2/2016 8:48 PM, Adam Goryachev wrote:

Next, manually check permissions for backuppc:
Thinking back on it, I don't think I dealt with permissions; that 
would have been the problem and does fit the symptoms.

That will verify that you can at least write there. If that works, then
you could do the following:
chown -R backuppc.backuppc /var/lib/backuppc to ensure all the contents
are also the correct permissions
I let a chown run for about 6 hours before realizing it was a bad
idea. It had made it through less than one host's backups before I
cancelled it (perhaps less than 0.05% of the pool).


I've cancelled that (and dispatched a find command to fix what
was changed), and instead changed the UID:GID on the system to match
the pool, also dispatching a script to fix ownership on everything
/other/ than the pool.
I was about to be disappointed when this didn't work at first, but it
turns out SELinux also had to be disabled - most likely, if I had to
guess, because I changed the UID numbers.


With SELinux disabled, the copied backups are now visible in the GUI.



Re: [BackupPC-users] Preparing pool for migration

2016-11-04 Thread Nick Bright

On 11/2/2016 8:48 PM, Adam Goryachev wrote:

Next, manually check permissions for backuppc:
Thinking back on it, I don't think I dealt with permissions; that would 
have been the problem and does fit the symptoms.

That will verify that you can at least write there. If that works, then
you could do the following:
chown -R backuppc.backuppc /var/lib/backuppc to ensure all the contents
are also the correct permissions
I let a chown run for about 6 hours before realizing it was a bad idea.
It had made it through less than one host's backups before I cancelled
it (perhaps less than 0.05% of the pool).


I've cancelled that (and dispatched a find command to fix what
was changed), and instead changed the UID:GID on the system to match the
pool, also dispatching a script to fix ownership on everything /other/
than the pool.
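
The "everything other than the pool" part prunes down to a single find;
a sketch, assuming $TopDir is /var/lib/backuppc and the new user and
group are both backuppc:

    find /var/lib/backuppc -mindepth 1 \
        \( -path /var/lib/backuppc/pool -o -path /var/lib/backuppc/cpool \
           -o -path /var/lib/backuppc/pc \) -prune \
        -o -exec chown backuppc:backuppc {} +

which skips the pool, cpool, and pc trees and fixes ownership on
everything else.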




Re: [BackupPC-users] Preparing pool for migration

2016-11-03 Thread Nick Bright
On 11/2/2016 8:48 PM, Adam Goryachev wrote:
> Don't use rsync, or anything else. If you can't mount the partition, 
> then it's not a BackupPC related issue, and is probably caused by 
> corruption during the copy process.
I was using rsync to attempt to convert from ext3 to xfs, but that
turned out to be a bad idea and I abandoned that process - it was also
going to take several months to complete.
>>> What are the error messages?
>> There were no discernible errors; BackupPC simply didn't work - it
>> showed no backups for any hosts, and when it tried to run a backup, the
>> backup would fail without logging the error - almost like it couldn't
>> write to the pool, but I re-verified permissions at least a dozen times.
>> I've reinstalled the OS since then, so any logs are lost.
> OK, so one step at a time... What FS is the source? When you say you
> "converted it to ext4" what exactly did you do?
Followed this document for converting the filesystem: 
https://docs.fedoraproject.org/en-US/Fedora/14/html/Storage_Administration_Guide/ext4converting.html
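
(Roughly, the steps there boil down to enabling the ext4 features and
then forcing a check - a sketch, assuming the pool partition is /dev/sdb1
and unmounted:

    tune2fs -O extents,uninit_bg,dir_index /dev/sdb1
    e2fsck -fD /dev/sdb1

after which the partition mounts as ext4.)
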
>   Did you run a fsck afterwards?
Yes, in fact it won't mount without doing an fsck.
> Next, manually check permissions for backuppc:
> 
> check /var/log/messages (syslog and daemon.log etc) for any related
> backuppc startup logs/etc.
I'll check that during the next attempt. My pool is copying again now.



Re: [BackupPC-users] Preparing pool for migration

2016-11-03 Thread Nick Bright
On 11/2/2016 7:44 PM, Holger Parplies wrote:
> Hi,
>
> Nick Bright wrote on 2016-11-02 17:41:17 -0500 [Re: [BackupPC-users] 
> Preparing pool for migration]:
>> On 11/2/2016 5:29 PM, Adam Goryachev wrote:
>>> Can you describe your "failed pool migrations"? ie, in what way did
>>> they fail?
>> In only one case did the transfer get far enough to actually run
>> BackupPC.
> that seems to rule out some things that spring to mind (like missing or
> incorrect configuration files in /etc/backuppc ...). Hmm, or does it? What
> do you mean by "run BackupPC"?
By "run BackupPC" I mean to start the service, and let it do backups.
>   Do the BackupPC installations on source and
> destination machine agree on where the configuration is stored, and did
> you copy it?
I copied the configuration files and updated paths as appropriate. In
both cases the configuration is in /etc, and $TopDir was the same, but
the installation path and the location of the CGI scripts were
different, so only a few paths needed to be changed.
> I believe it is (or used to be) possible to store the
> configuration below $TopDir/conf (which would be copied along with "the pool"
> without much thought), but package installations seem to prefer locations
> below /etc (and I agree with them), which you would have to copy separately.
> Well, missing that *would* explain what you are seeing ;-).
All of the machines were in the configuration (showing in the CGI
interface); they just showed 0 backups.
>> Another attempt, using dd, did result in a mountable partition, which I
>> converted to ext4. Backuppc simply couldn't see any backups in the pool,
>> and wouldn't successfully complete any new backups.
> SElinux?
Disabled.
> Did you try to write to the disk as backuppc user from a shell?
Only to its home directory, not to the pool.
> Did you try to write as root? 
Yes, that was fine.



Re: [BackupPC-users] Preparing pool for migration

2016-11-02 Thread Nick Bright
On 11/2/2016 5:29 PM, Adam Goryachev wrote:
> Can you describe your "failed pool migrations"? ie, in what way did 
> they fail?
In only one case did the transfer get far enough to actually run
BackupPC.

Several attempts, using an intermediary physical disk, resulted in an 
unmountable partition or took too long and were aborted (attempting to 
use rsync).

Another attempt, using dd, did result in a mountable partition, which I
converted to ext4. BackupPC simply couldn't see any backups in the pool,
and wouldn't successfully complete any new backups.

I posted to the list about that a few days ago on 10/24, but no one 
responded.

> What did you do (exact commands would help)? What happened? 
The most successful copying method (in terms of both speed and resulting 
in a mountable partition) is using DD over Netcat:

 dd bs=1M if=/dev/src | nc  }--network--{ nc | dd bs=1M of=/dev/dst

This runs reasonably fast, taking only about 8 to 10 hours to copy the data.
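
Spelled out on the two ends, it looks roughly like this - the port and
device names are examples, and the listen flags vary between netcat
variants:

    # on the destination host, listen and write the image:
    nc -l -p 1234 | dd bs=1M of=/dev/dst
    # on the source host, read the device and send it:
    dd bs=1M if=/dev/src | nc desthost 1234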

> What are the error messages?
There were no discernible errors; BackupPC simply didn't work - it
showed no backups for any hosts, and when it tried to run a backup, the
backup would fail without logging the error - almost like it couldn't
write to the pool, but I re-verified permissions at least a dozen times.
I've reinstalled the OS since then, so any logs are lost.



[BackupPC-users] Preparing pool for migration

2016-11-02 Thread Nick Bright
A thought occurred to me that perhaps my failed pool migrations have
been the result of not preparing the pool properly.

Previously, I had simply stopped any running backups, stopped the
backuppc service, unmounted the partition, and dd'd it to the new host.

Is there a proper preparation procedure that isn't documented on the 
website?

For example, disable all backups on all hosts, and wait for the 
overnight maintenance and cleanup processes to finish?



Re: [BackupPC-users] Version 4 vs 3

2016-10-28 Thread Nick Bright

On 10/28/2016 11:56 AM, Nicholas Hall wrote:


I've tried both Ext4 and XFS. My issue wasn't related to the
filesystem. It's BPC4 that is forcing a full fsck after each backup.


I think v4 has had enough real-world testing that it can be
disabled.  See Craig's comment:

https://github.com/backuppc/backuppc/issues/4

I've been running v4 for over a year now without problems.  My host 
summary:


There are 868 hosts that have been backed up, for a total of:

868 full backups of total size 89.02GiB (prior to pooling and 
compression),
22704 incr backups of total size 3103.35GiB (prior to pooling and 
compression).

Anybody running v4 with stats similar to this:

There are 21 hosts that have been backed up, for a total of:

 * 180 full backups of total size 8089.80GB (prior to pooling and
   compression),
 * 96 incr backups of total size 290.16GB (prior to pooling and
   compression).

 * Pool is 870.52GB comprising 11700700 files and 4182 directories (as
   of 10/28 03:30),
 * Pool hashing gives 656 repeated files with longest chain 26,

Those stats do seem a little... off, though, as the actual disk usage is
935GB.




Re: [BackupPC-users] Version 4 vs 3

2016-10-27 Thread Nick Bright
On 10/27/2016 10:16 AM, Bowie Bailey wrote:
> The BackupPC project has a single developer who tends to be rather busy
> most of the time, so development happens in bursts with months or years
> between releases.
That's a lot of work for one person! I for one am grateful for the 
software, though I don't think there's any way I can meaningfully 
contribute to help (I'm not a programmer).
> I have not used version 4 myself, but as I understand it, the main
> difference is the use of hard links in the pool.  Version 3 uses hard
> links in the backup pool.  Every file has a minimum of 2 hard links and
> frequently more.  This makes it very difficult to copy or move the pool
> using rsync or other normal methods.  The only good way to move a large
> BPC v3 pool directory is to do a block copy of the filesystem.
I can attest to this. I've been trying to move my pool to a new machine
for, quite literally, six weeks. Nothing seems to work reliably, and any
attempt other than dd|nc takes an estimated 80 days to complete - and I
don't think my pool is really that big: about 15 machines on 1.2TB
(all CentOS 6 or 7).

> Version 3 still works quite well and is what I use here, but I know
> there are some people on the list who have been using it version 4 for
> some time now.
Version 3 works *quite* well. I'm curious about v4, though - is it
stable enough for production use?



[BackupPC-users] Copying the pool / filesystem migration

2016-10-19 Thread Nick Bright
Greetings,

I'm in the process of migrating to a new BackupPC server, my old machine 
having software RAID5 on SATA; it was just getting a bit outdated and 
more than a bit starved for IOPS. The new machine (a VM, though the host 
is dedicated) is RAID10 SAS MDL (7.2krpm) across 8 spindles on a 
P410/512MB FBWC - a far superior build for IOPS.

The old machine is using ext3 on its filesystem, as it was a direct
filesystem move from the machine before that (which was CentOS 6).

So, I'm stuck with ext3 on slow hardware; trying to move to xfs on the 
new faster hardware. Getting the data to the new machine is easy enough 
- I've done it twice already; once with an intermediary disk physically 
moving it between machines, and once over the network. The network is 
just as fast as a physical disk, as the decrease in speed still 
outweighs having to copy the data twice.

The real problem I have is in converting the ext3 filesystem to xfs.

I've staged the copy as two different disks in the guest, one containing
the ext3 filesystem (which I can later dispose of), and one containing
the xfs filesystem. Using rsync -aH, the copy went to about 950/1200GB,
then slowed to a crawl, getting perhaps 2-4GB per day, because it's in
the hard-link territory of the BackupPC store.

I tried using BackupPC_tarPCCopy instead of rsync, but the command
refused to work. It reported an error about the pool root configuration,
even though the configuration was correct. I was unable to resolve the
error.

What strategies or suggestions could the community make? At this rate, 
it's going to take another THREE MONTHS to copy the pool between 
filesystems, a time during which this server isn't making backups.

The old server is, but at the end of it all I'm faced with trying to
merge the pools (probably functionally impossible given the performance
issues) or having a substantial gap in my backups. Neither option is
appealing. I'm OK with a gap in backups, but I'd like to contain it to a
week or two, not an entire quarter.



Re: [BackupPC-users] How to delete specific files from backups? (with BackupPC_deleteFile.pl)

2011-05-23 Thread Nick Bright
On 5/23/2011 10:14 AM, Jeffrey J. Kosowsky wrote:
>   >  The usage *clearly* gives a list of required input parameters,
>   >  including share name:
>   >   -s
>   >  Also, the usage says to use '-m' if you want to use unmangled paths.
>   >
>   >  So true there are not any specific examples, but the usage (-h) is
>   >  rather completely documented...

Just saying what's expected isn't the same as giving an example of valid 
input. My root question really came down to "What is a share? What goes 
there?", "How do I specify a filename?" and "What does mangled mean? How 
do I mangle?". If you already *know* these things, it's very simple. If 
you don't, then the lack of examples makes it very difficult to 
understand what to do. I understand what these mean now, thanks to your 
thorough explanation.

> To clarify, here is the quote from the usage:
>  -s Share name (or - for all) from which path is offset
>(don't include the 'f' mangle)
>NOTE: if unmangle option (-m) is not set then the share 
> name
>is optional and if not specified then it must instead be
>included in mangled form as part of the file/directory 
> names.
>
> So, you can do any of the following assuming your share name is '/'
>
> With mangling
> BackupPC_deleteFile.pl -h hostname -n - -d 4 f%2f/fvar/flog/fmaillog
> BackupPC_deleteFile.pl -h hostname -n - -s / -d 4 /fvar/flog/fmaillog
> BackupPC_deleteFile.pl -h hostname -n - -s %2f -d 4 /fvar/flog/fmaillog
>
> Without mangling:
> BackupPC_deleteFile.pl -h hostname -n - -s / -m -d 4 /var/log/maillog
> BackupPC_deleteFile.pl -h hostname -n - -s %2f -m -d 4 /var/log/maillog
>
> You could also use '-s -' to include any share and you could remove
> the leading slash before (f)var in all but the first example.

Thank you very much Jeffrey. The problem was my lack of understanding 
regarding what the "share name" is and represents. Having "set it and 
forget it" with BackupPC almost three years ago, I didn't recall those 
basic concepts.

I would suggest, to help out people such as myself who may not be strong
in all of the concepts of BackupPC, putting what you've replied to me
with onto the wiki page for the BackupPC_deleteFile script, as your
explanation is quite clear and concise.

Again, thank you for taking the time to reply with very helpful information.



Re: [BackupPC-users] How to delete specific files from backups? (with BackupPC_deleteFile.pl)

2011-05-22 Thread Nick Bright
On 5/22/2011 7:14 PM, Nick Bright wrote:
> Sounds to me like the BackupPC_deleteFile script is the way to go:
> http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=BackupPC_DeleteFile
>
> I found it some time after posting my original question.
>

I am having a great deal of difficulty getting this to work. It runs OK 
but the output is just:

[hostname][][472 483 507 531 541 551 562 574 584 589 594 595 596 597 598 
599 600 601][]
   ANTE[hostname]:
   BAKS[hostname]: 472 483 507 531 541 551 562 574 584 589 594 595 596 
597 598 599 600 601
   POST[hostname]:
   LOOKING AT: [hostname] [][472 483 507 531 541 551 562 574 584 589 594 
595 596 597 598 599 600 601][] **NO DELETIONS ON THIS HOST**


Files/directories deleted: 0(0) Files/directories copied: 0
Delete attrib set: 0Attributes cleared: 0
Empty attrib files deleted: 0   Errors: 0

No matter what options I give it, it just won't delete anything.

There is a complete void of examples, and there is no indication in the
documentation of what valid inputs for the arguments are, so I'm not
even sure if I'm doing it correctly.

I've tried:

BackupPC_deleteFile.pl -h hostname -n - -d 4 /var/log/maillog
BackupPC_deleteFile.pl -h hostname -n 600 -d 4 /var/log/maillog
BackupPC_deleteFile.pl -h hostname -n 600 -d 4 
/backup/backuppc/pc/hostname/600/f%2f/fvar/flog/fmaillog
BackupPC_deleteFile.pl -h hostname -n - -d 4 /fvar/flog/fmaillog

But they all just give the same output - nothing deleted.

The key thing that's missing from the documentation is how to specify 
the file/directory that you would like to delete. Do you specify the 
file on the local filesystem? Do you specify the file on the remote host 
that's been backed up? Do you have to specify the full path name, or the 
relative path name?

Any advice is appreciated.



Re: [BackupPC-users] How to delete specific files from backups?

2011-05-22 Thread Nick Bright
On 5/22/2011 5:48 PM, Jeffrey J. Kosowsky wrote:
> Nick Bright wrote at about 16:06:32 -0500 on Sunday, May 22, 2011:
>   >  On 5/22/2011 3:29 PM, Michael Stowe wrote:
>   >  >>  Recently I had a bit of an error condition that generated several 
> very,
>   >  >>  very large files on the file system of a server being backed up by
>   >  >>  BackupPC. This resulted in>200GB of files having been backed up to 
> the
>   >  >>  backuppc server that quite simply don't need to be there!
>   >  >>
>   >  >>  What can I do to remove these specific files from all backups on the
>   >  >>  backuppc server? It's just a waste of space so I really need to make
>   >  >>  them go away.
>   >  >>
>   >  >> - Nick
>   >  >  Probably the simplest way is to search for the file (and all its hard
>   >  >  links) in your backuppc tree and ... delete them.  If you have no 
> other
>   >  >  files that large, it's a fairly straightforward "find" command.
>   >  >
>   >  I thought of that, but I wasn't sure if it would "break" anything. Guess
>   >  I should have said so :)
>   >
>
> In general it *definitely* breaks things -- whether you care that it
> breaks things and whether that breakage is critical is a different
> story -- specifically it breaks the meta data stored in the attrib
> files and can cause problems with filling in incrementals...
>
Sounds to me like the BackupPC_deleteFile script is the way to go:

http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=BackupPC_DeleteFile

I found it some time after posting my original question.



Re: [BackupPC-users] How to delete specific files from backups?

2011-05-22 Thread Nick Bright
On 5/22/2011 3:29 PM, Michael Stowe wrote:
>> Recently I had a bit of an error condition that generated several very,
>> very large files on the file system of a server being backed up by
>> BackupPC. This resulted in>200GB of files having been backed up to the
>> backuppc server that quite simply don't need to be there!
>>
>> What can I do to remove these specific files from all backups on the
>> backuppc server? It's just a waste of space so I really need to make
>> them go away.
>>
>>- Nick
> Probably the simplest way is to search for the file (and all its hard
> links) in your backuppc tree and ... delete them.  If you have no other
> files that large, it's a fairly straightforward "find" command.
>
I thought of that, but I wasn't sure if it would "break" anything. Guess 
I should have said so :)



[BackupPC-users] How to delete specific files from backups?

2011-05-22 Thread Nick Bright
Recently I had a bit of an error condition that generated several very, 
very large files on the file system of a server being backed up by 
BackupPC. This resulted in >200GB of files having been backed up to the 
backuppc server that quite simply don't need to be there!

What can I do to remove these specific files from all backups on the 
backuppc server? It's just a waste of space so I really need to make 
them go away.

  - Nick



Re: [BackupPC-users] The dread "Unable to read 4 bytes" / "Read EOF: Connection reset by peer"

2010-05-11 Thread Nick Bright
On 5/11/2010 12:48 PM, Josh Malone wrote:
> On Tue, 11 May 2010 12:35:27 -0500, Nick Bright  wrote:
>
>> [backu...@backuppc tmp]$ /usr/bin/ssh -q -x -l root TargetServer
>> /usr/bin/rsync --version
>> rsync  version 3.0.7  protocol version 30
>> Copyright (C) 1996-2009 by Andrew Tridgell, Wayne Davison, and others.
>> Web site: http://rsync.samba.org/
>> Capabilities:
>>   64-bit files, 64-bit inums, 32-bit timestamps, 64-bit long ints,
>>   socketpairs, hardlinks, symlinks, IPv6, batchfiles, inplace,
>>   append, ACLs, xattrs, iconv, no symtimes
>>
>> rsync comes with ABSOLUTELY NO WARRANTY.  This is free software, and you
>> are welcome to redistribute it under certain conditions.  See the GNU
>> General Public Licence for details.
>>
>> I just checked the authorized_keys file, nothing out of the ordinary.
>> Just the RSA key and comment. I'm sure that the problem is with the
>> Target server's paranoia - I just can't *find* it.
>>  
> Okay - then it's got to be something with the weird SSH port. Let's see if
> you can make rsync do something on the target box. Try rsyncing from the
> directory you want to back up into /tmp/test/ or something. This might tell
> us if it's a messed-up rsync or something on the remote box.
>
> If that works then try removing the port option on your rsync command. It
> really should ne needed as rsync just sends the data stream over stdout...
> I think.
>
> -Josh
>
>
That's a good point. I really thought I was specifying that it should be 
using the alternative SSH port.

This is the root of the problem. I took the liberty of changing the SSHD
port back to 22, and it's backing up.

Search hits take note: this is why you have to be careful stripping 
"sensitive" information out of debugging posts!

Thanks very much for taking the time to reply!

  - Nick



Re: [BackupPC-users] The dread "Unable to read 4 bytes" / "Read EOF: Connection reset by peer"

2010-05-11 Thread Nick Bright
On 5/11/2010 12:45 PM, Les Mikesell wrote:
> On 5/11/2010 12:26 PM, Nick Bright wrote:
>
>> On 5/11/2010 9:53 AM, Les Mikesell wrote:
>>  
>>> On 5/10/2010 8:14 PM, Nick Bright wrote:
>>>
>>>
>>>> Let me start off by saying this: I know what I'm doing.
>>>>
>>>>
>>>>  
>>> [...]
>>>
>>>
>>>> full backup started for directory / (baseline backup #259)
>>>> started full dump, share=/
>>>> Running: /usr/bin/ssh -q -x -l root TargetServer /usr/bin/rsync --server
>>>> --sender --numeric-ids --perms --owner --group -D --links --hard-links
>>>> --times --block-size=2048 --recursive --port=38271 --ignore-times . /
>>>>
>>>>  
>>> What does --port=38271 mean here?  Isn't that supposed to be used for
>>> standalone rsyncd, not over ssh?
>>>
>>>
>>>
>> Thank you for taking the time to reply.
>>
>> It's the port that SSHD is listening on. I had been stripping that out
>> because the guy that runs the target server is a little paranoid about
>> his SSH access.
>>  
> In that case, shouldn't it be in the ssh arguments, not part of the
> rsync command passed to the remote?  I don't see how this would connect
> at all.
>
>

Precisely correct. I removed those port arguments and set the SSH server 
on the target machine back to port 22 and it's working. Now I need to 
figure out how to properly tell BackupPC that SSH is on a non-standard 
port. I thought that I was doing it properly, but clearly not. All my 
other boxes use the standard ports, which is why I've not had this issue 
before.
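
For the archives, the knob appears to be the ssh arguments inside the
client command rather than an rsync option; a sketch of the per-host
override, assuming BackupPC v3 and port 38271:

    $Conf{RsyncClientCmd}        = '$sshPath -q -x -p 38271 -l root $host $rsyncPath $argList+';
    $Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -p 38271 -l root $host $rsyncPath $argList+';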

Thank you very much for taking the time to reply and help find the 
resolution to the issue.

  - Nick



Re: [BackupPC-users] The dread "Unable to read 4 bytes" / "Read EOF: Connection reset by peer"

2010-05-11 Thread Nick Bright
On 5/11/2010 12:30 PM, Josh Malone wrote:
> On Tue, 11 May 2010 12:26:47 -0500, Nick Bright  wrote:
>
>> On 5/11/2010 9:53 AM, Les Mikesell wrote:
>>  
>>> On 5/10/2010 8:14 PM, Nick Bright wrote:
>>>
>>>
>>>> Let me start off by saying this: I know what I'm doing.
>>>>
>>>>
>>>>  
>>> [...]
>>>
>>>
>>>> full backup started for directory / (baseline backup #259)
>>>> started full dump, share=/
>>>> Running: /usr/bin/ssh -q -x -l root TargetServer /usr/bin/rsync
>>>>  
> --server
>
>>>> --sender --numeric-ids --perms --owner --group -D --links --hard-links
>>>> --times --block-size=2048 --recursive --port=38271 --ignore-times . /
>>>>
>>>>  
>>> What does --port=38271 mean here?  Isn't that supposed to be used for
>>> standalone rsyncd, not over ssh?
>>>
>>>
>>>
>> Thank you for taking the time to reply.
>>
>> It's the port that SSHD is listening on. I had been stripping that out
>> because the guy that runs the target server is a little paranoid about
>> his SSH access.
>>  
> Can you run:
>
>/usr/bin/ssh -q -x -l root TargetServer /usr/bin/rsync --version
>
> And get a proper output? If not then something is up with the remote
> server. You mention paranoia of the remote server admin -- is it possible
> that in his authorized_keys file he's limited the command that can be run
> via that key? If so, is it correct. That's a huge recipe for ssh disaster
> in my experience w/ backuppc.
>
>
>

[backu...@backuppc tmp]$ /usr/bin/ssh -q -x -l root TargetServer 
/usr/bin/rsync --version
rsync  version 3.0.7  protocol version 30
Copyright (C) 1996-2009 by Andrew Tridgell, Wayne Davison, and others.
Web site: http://rsync.samba.org/
Capabilities:
 64-bit files, 64-bit inums, 32-bit timestamps, 64-bit long ints,
 socketpairs, hardlinks, symlinks, IPv6, batchfiles, inplace,
 append, ACLs, xattrs, iconv, no symtimes

rsync comes with ABSOLUTELY NO WARRANTY.  This is free software, and you
are welcome to redistribute it under certain conditions.  See the GNU
General Public Licence for details.

I just checked the authorized_keys file, nothing out of the ordinary. 
Just the RSA key and comment. I'm sure that the problem is with the 
Target server's paranoia - I just can't *find* it.
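
(For anyone auditing the same thing: the authorized_keys restriction worth 
hunting for is a forced command. A hypothetical entry that would break 
BackupPC in exactly this silent way - the wrapper path and key text here 
are made up:

command="/usr/local/bin/restricted-shell",no-pty,no-port-forwarding ssh-rsa AAAA... backuppc@backuppcserver

With a forced command, sshd ignores whatever command the client requests 
and runs the configured one instead, so BackupPC's rsync never starts and 
the session drops without logging anything useful.)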

  - Nick

--

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] The dread "Unable to read 4 bytes" / "Read EOF: Connection reset by peer"

2010-05-11 Thread Nick Bright
On 5/11/2010 9:53 AM, Les Mikesell wrote:
> On 5/10/2010 8:14 PM, Nick Bright wrote:
>
>> Let me start off by saying this: I know what I'm doing.
>>
>>  
> [...]
>
>> full backup started for directory / (baseline backup #259)
>> started full dump, share=/
>> Running: /usr/bin/ssh -q -x -l root TargetServer /usr/bin/rsync --server
>> --sender --numeric-ids --perms --owner --group -D --links --hard-links
>> --times --block-size=2048 --recursive --port=38271 --ignore-times . /
>>  
> What does --port=38271 mean here?  Isn't that supposed to be used for
> standalone rsyncd, not over ssh?
>
>
Thank you for taking the time to reply.

It's the port that SSHD is listening on. I had been stripping that out 
because the guy that runs the target server is a little paranoid about 
his SSH access.

--

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] The dread "Unable to read 4 bytes" / "Read EOF: Connection reset by peer"

2010-05-11 Thread Nick Bright
On 5/11/2010 8:19 AM, Josh Malone wrote:
> On Mon, 10 May 2010 20:14:54 -0500, Nick Bright  wrote:
>
>> Let me start off by saying this: I know what I'm doing.
>>
>> I've been running backuppc for about two years on two servers, backing
>> up about 25 servers (mix of windows and linux).
>>
>> This is the ONLY machine I've ever had this problem with that wasn't SSH
>>  
>
>> authentication problems, and what's worse is that it worked for almost a
>>  
>
>> year before it stopped working. I'm convinced it's something about the
>> shell or environment of the client system, but I've been trying to
>> figure this out since last *November* and I'm just not getting anywhere
>> with it. Every single hit just says "your SSH keys are messed up", but
>> they most certainly are /not/ messed up, as evidenced below.
>>
>> All of the configuration is the same as my other linux servers. I can
>> find absolutely nothing preventing this from working, but it fails every
>>  
>
>> time!
>>  
> 
>
> Hi. I run BackupPC on RHEL5 and I've never had the slightest problem.
> That said, here's all I can think of:
>
> Have you checked the 'messages' and 'secure' logs on the target server?
> Are your target servers (the backed-up hosts) also running CentOS5?
>
Thank you for taking the time to reply.

Yes, I checked them; they don't say anything at all. No rejections, 
failures, or successes. Just blank.

The target server is also CentOS5.

> I would try running the actual rsync-over-ssh command as the backuppc user
> on the backuppc server:
>
> /usr/bin/ssh -q -x -l root TargetServer /usr/bin/rsync --server \
>--sender --numeric-ids --perms --owner --group -D --links \
>--hard-links --times --block-size=2048 --recursive --port=38271 \
>--ignore-times . /
>
> And see if you get anything from that command or in the logs of the target
> server.
>

Nothing at all. No errors, no success messages. Just blank.

> If all else fails, try starting sshd in the foreground (sshd -D) on the
> target server and watch the connection and process start up.
>
> -Josh
>
>
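
(A concrete sketch of that last suggestion, with 2222 as an arbitrary 
spare port so the production sshd is left alone. On the target server, 
as root:

/usr/sbin/sshd -D -d -p 2222

-D keeps sshd in the foreground and -d enables debug output. Then, from 
the BackupPC server:

/usr/bin/ssh -p 2222 -x -l root TargetServer /usr/bin/rsync --version

and watch the debug output on the target to see where the session dies.)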


--

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] The dread "Unable to read 4 bytes" / "Read EOF: Connection reset by peer"

2010-05-11 Thread Nick Bright
On 5/11/2010 5:42 AM, Dmitry Katsubo wrote:
> Maybe you have some tricky alias for ssh? For example I have:
>
> $ type ssh
> ssh is aliased to `ssh -C'
>
Thank you for taking the time to reply.

If you mean on the BackupPC server:

[backu...@backuppc~]$ type ssh
ssh is /usr/bin/ssh

> So try to run exactly the same command (with full path) as is noticed in
> log file from the shell:
>

I had tried that in the past as well, but I really wasn't sure what it 
ought to do, since it doesn't specify where to put the files. Regardless, 
when I run that command nothing happens; I'm instantly returned to a 
prompt with no delay whatsoever:

[backu...@backuppc tmp]$ /usr/bin/ssh -q -x -l root TargetServer 
/usr/bin/rsync --server --sender --numeric-ids --perms --owner --group 
-D --links --hard-links --times --block-size=2048 --recursive 
--ignore-times . /
[backu...@backuppc tmp]$
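
(When a command returns silently like that, the exit status and ssh's own 
verbosity are the quickest tells. A debugging sketch:

/usr/bin/ssh -x -l root TargetServer /usr/bin/rsync --version; echo "exit status: $?"
/usr/bin/ssh -vv -x -l root TargetServer /bin/true 2>&1 | tail -20

A non-zero exit with no output usually means the remote side ended the 
session before rsync ever ran.)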

> Nick Bright wrote on 11/05/2010 03:14:
>
>> Running: /usr/bin/ssh -q -x -l root TargetServer /usr/bin/rsync --server
>> --sender --numeric-ids --perms --owner --group -D --links --hard-links
>> --times --block-size=2048 --recursive --ignore-times . /
>> Xfer PIDs are now 2924
>> Read EOF: Connection reset by peer
>> Tried again: got 0 bytes
>> Done: 0 files, 0 bytes
>> Got fatal error during xfer (Unable to read 4 bytes)
>> Backup aborted (Unable to read 4 bytes)
>> Not saving this as a partial backup since it has fewer files than the
>> prior one (got 0 and 0 files versus 0)
>>  
>
>


--

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] The dread "Unable to read 4 bytes" / "Read EOF: Connection reset by peer"

2010-05-10 Thread Nick Bright
Let me start off by saying this: I know what I'm doing.

I've been running backuppc for about two years on two servers, backing 
up about 25 servers (mix of windows and linux).

This is the ONLY machine I've ever had this problem with that wasn't SSH 
authentication problems, and what's worse is that it worked for almost a 
year before it stopped working. I'm convinced it's something about the 
shell or environment of the client system, but I've been trying to 
figure this out since last *November* and I'm just not getting anywhere 
with it. Every single hit just says "your SSH keys are messed up", but 
they most certainly are /not/ messed up, as evidenced below.

All of the configuration is the same as my other linux servers. I can 
find absolutely nothing preventing this from working, but it fails every 
time!

Please. I'm _BEGGING_ for something here, I have NO idea where else to 
look! There is unfortunately no extended troubleshooting information 
available in the documentation located at 
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=ErrorMessages 
or http://backuppc.sourceforge.net/faq/ssh.html or 
http://backuppc.sourceforge.net/faq/BackupPC.html#step_5

The target machine is CentOS5 running cPanel, and I've updated rsync to 
rsync-3.0.7-1.el5.rf (from RPM Forge, had to do that on another box as 
well).

What else can I do to find extended debugging information?

The xferlog always says:

full backup started for directory / (baseline backup #259)
Running: /usr/bin/ssh -q -x -l root TargetServer /usr/bin/rsync --server 
--sender --numeric-ids --perms --owner --group -D --links --hard-links 
--times --block-size=2048 --recursive --ignore-times . /
Xfer PIDs are now 2924
Read EOF: Connection reset by peer
Tried again: got 0 bytes
Done: 0 files, 0 bytes
Got fatal error during xfer (Unable to read 4 bytes)
Backup aborted (Unable to read 4 bytes)
Not saving this as a partial backup since it has fewer files than the 
prior one (got 0 and 0 files versus 0)

Yet, from a shell on the BackupPC server, as the backuppc user, I have 
no problems SSHing to the target:

[backu...@backuppcserver ~]$ ssh r...@targetserver
Last login: Mon May 10 19:36:24 2010 from 192.168.50.1
-bash-3.2#
[backu...@backuppcserver ~]$ ssh r...@targetserver whoami
root
[backu...@backuppcserver ~]$ ssh -q -x -l root TargetServer
Last login: Mon May 10 19:42:03 2010 from 192.168.50.1
-bash-3.2#
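
(Note for anyone comparing tests: an interactive login like the above 
allocates a pseudo-tty, which is not the environment BackupPC runs in. A 
closer test disables tty allocation and runs a no-op command:

ssh -T -q -x -l root TargetServer /bin/true

which should print nothing at all if the remote shell startup is clean.)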

[backu...@backuppcserver ~]$ /usr/local/backuppc/bin/BackupPC_dump -v 
TargetServer
cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 TargetServer
cmdSystemOrEval: finished: got output PING TargetServer (192.168.50.2) 
56(84) bytes of data.
64 bytes from TargetServer (192.168.50.2): icmp_seq=1 ttl=64 time=0.133 ms

--- TargetServer ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.133/0.133/0.133/0.000 ms

cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 TargetServer
cmdSystemOrEval: finished: got output PING TargetServer (192.168.50.2) 
56(84) bytes of data.
64 bytes from TargetServer (192.168.50.2): icmp_seq=1 ttl=64 time=0.212 ms

--- TargetServer ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.212/0.212/0.212/0.000 ms

CheckHostAlive: returning 0.212
full backup started for directory / (baseline backup #259)
started full dump, share=/
Running: /usr/bin/ssh -q -x -l root TargetServer /usr/bin/rsync --server 
--sender --numeric-ids --perms --owner --group -D --links --hard-links 
--times --block-size=2048 --recursive --port=38271 --ignore-times . /
Xfer PIDs are now 3040
xferPids 3040
Read EOF: Connection reset by peer
Tried again: got 0 bytes
Done: 0 files, 0 bytes
Got fatal error during xfer (Unable to read 4 bytes)
cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 TargetServer
cmdSystemOrEval: finished: got output PING TargetServer (192.168.50.2) 
56(84) bytes of data.
64 bytes from TargetServer (192.168.50.2): icmp_seq=1 ttl=64 time=0.167 ms

--- TargetServer ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.167/0.167/0.167/0.000 ms

cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 TargetServer
cmdSystemOrEval: finished: got output PING TargetServer (192.168.50.2) 
56(84) bytes of data.
64 bytes from TargetServer (192.168.50.2): icmp_seq=1 ttl=64 time=0.165 ms

--- TargetServer ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.165/0.165/0.165/0.000 ms

CheckHostAlive: returning 0.165
Backup aborted (Unable to read 4 bytes)
Not saving this as a partial backup since it has fewer files than the 
prior one (got 0 and 0 files versus 0)
dump failed: Unable to read 4 bytes
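
(One knob worth knowing about for failures like this: per-host transfer 
log verbosity. A sketch, assuming the usual per-PC config file:

$Conf{XferLogLevel} = 9;    # default is 1; higher values log far more detail

It will not fix anything by itself, but it can show how far the rsync 
protocol got before the connection reset.)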

Any help or tips are greatly appreciated.

Thank you!

   - Nick

--

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/

Re: [BackupPC-users] Backup fails after running 8-10 hours

2009-11-24 Thread Nick Bright
Pursuing your suggestion, I executed "ssh -l root -T remote /bin/true" 
and was greeted with "stdin: is not a tty".

My googling resulted in

http://platonic.techfiz.info/2008/10/13/stdin-is-not-a-tty/

which indicates that the culprit is apparently "mesg y" in /etc/bashrc. 
After commenting out "mesg y" in my /etc/bashrc, "ssh -l root -T remote 
/bin/true" produces no output, which you indicate is correct.

I have kicked off a full backup and I'll see how it goes, but I'm sure 
that it will work properly now that the "stdin: is not a tty" error is gone.

For reference, for anyone hitting this from a search: "mesg y" allows 
'wall', 'talk', and other such commands to write to your TTY for 
intra-system chat.
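
For anyone who wants to keep intra-system messaging on interactive logins, 
a guard like this in /etc/bashrc should quiet non-interactive sessions 
instead of removing the line outright (a sketch; adjust to your 
distribution):

# only enable mesg when stdin is really a terminal, so non-interactive
# ssh sessions (like BackupPC's) produce no output
tty -s && mesg y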

Thank you everybody for your help in clearing this problem up!

  - Nick Bright

Adam Goryachev wrote:
> 
> Holger Parplies wrote:
>> Nick Bright wrote on 2009-11-09 17:57:11 -0600 [Re: [BackupPC-users] Backup 
>> fails after running 8-10 hours]:
>>> Les Mikesell wrote:
>>>> Nick Bright wrote:
>>>>> Got remote protocol 1768191091
>>>>> Fatal error (bad version): stdin: is not a tty
>> note the error message here. "stdin: is not a tty".
>>
>>> [...]
>>> Thank you for your reply. I checked in to it, and determined that there 
>>> isn't anything being output by logging in to the backuppc system and "su 
>>> backuppc" then "ssh r...@cpanel":
>> It's not output, it's probably an 'stty' or something else that expects its
>> stdin to be a tty. Note that if you try it like you did ("ssh -l root cpanel"
>> without command argument), that's not the same as BackupPC is doing. stdin
>> *will* be a tty in this case, so you won't get the error. You should try
>> something like "ssh -l root cpanel /bin/true" instead.
> 
> Or better would be:
> ssh -l root -T cpanel /bin/true
> and you should get this:
> backu...@host:~$ ssh -l root -T remotehost /bin/true
> backu...@host:~$
> 
> ie, no blank line, no error, no text, nothing at all. If you see
> *anything* then you need to fix that first.
> 
> BTW, the -T means "Disable pseudo-tty allocation."
> 
> Regards,
> Adam
> 
> --
> Adam Goryachev
> Website Managers
> www.websitemanagers.com.au
> 

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Backup fails after running 8-10 hours

2009-11-12 Thread Nick Bright
I didn't try it with rsyncd.

  - Nick

Shawn Perry wrote:
> Does it work with rsyncd?
> 
> On Tue, Nov 10, 2009 at 2:28 PM, Nick Bright  wrote:
>> The backup successfully completed with the "tar" method.
>>
>> Shawn Perry wrote:
>>> That sounds like a different sort of problem then.
>>>
>>> A deduplicator is a program that walks through a filesystem and finds
>>> identical files, and then hard links them together to save space.
>>>
>>> On Tue, Nov 10, 2009 at 12:57 PM, Nick Bright  
>>> wrote:
>>>> Shawn Perry wrote:
>>>>> Did you use a disk deduplicator on the drive? Is there a directory
>>>>> with a lot of files in it?  How many files are you backing up?
>>>> Sorry, I'm not familiar with a "deduplicator".
>>>>
>>>> There aren't any directories with "a lot" of files any more than any of
>>>> the other systems I'm backing up.
>>>>
>>>> There are 202,984 files on the system.
>>>>
>>>>> If you have MANY hardlinks on a file system, rsync with the
>>>>> --hard-links option has a tendency to croak, leaving tar as the best
>>>>> option.
>>>> find / -printf "%n %i %p\n" | sort -nr
>>>>
>>>> Doesn't seem to indicate that there is an unusually large number of hard
>>>> links. The only stuff listed with a sizable number of hard links appears
>>>> to be directories that are all system stuff that would exist on all
>>>> servers.
>>>>
>>>> The system itself is a cPanel hosting server, and hasn't had anything
>>>> special done to it. Let me put it this way - I didn't do anything to
>>>> knowingly create "a lot" of hardlinks. I'm sure there's some, but
>>>> probably not an unusually high number.
>>>>
>>>>> Dirvish has this same issue.
>>>>>
>>>>> To answer your question, find a directory or a couple of them that
>>>>> have a lot of files.  run "ls -l" or "ls -lR" (the latter is
>>>>> recursive) in that directory.  Look at the output.
>>>>>
>>>>> sample:
>>>>>
>>>>> -rw-r--r-- 79 shawn users   37888 2005-12-04 14:36 X-mas list.xls
>>>>>
>>>>> The first field after the permissions is the number of links to the
>>>>> data in that file, 79 in this case.  That means there are 79 hard
>>>>> links to that file.  There will always be at least one.
>>>> Similar to the output of my find command, which was telling me how many
>>>> hard links it found for each file/directory on the system. As I said,
>>>> nothing that seemed too unusual.
>>>>
>>>>> Shawn
>>>>>
>>>> You mentioned TAR being a better option on a system with lots of hard
>>>> links. I'll give that a try and see if it's able to perform a successful
>>>> backup.
>>>>
>>>> I will point out that I have a 2nd BackupPC server that is backing up a
>>>> *different* machine running the cPanel system, which has many, many,
>>>> many more files/domains on it; and that is successfully backing up.
>>>>
>>>> I'll also try backing the client in question up to said 2nd BackupPC
>>>> server and see if that works.
>>>>
>>>>> On Tue, Nov 10, 2009 at 10:05 AM, Nick Bright  
>>>>> wrote:
>>>>>>> Shawn Perry wrote:
>>>>>>> Does this host have a lot of hard links?
>>>>>>>
>>>>>> That's a good question that I have no idea how to answer.
>>>>>>
>>>>>>
>>>>> --
>>>>> ___
>>>>> BackupPC-users mailing list
>>>>> BackupPC-users@lists.sourceforge.net
>>>>> List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
>>

Re: [BackupPC-users] Backup fails after running 8-10 hours

2009-11-10 Thread Nick Bright
The backup successfully completed with the "tar" method.

Shawn Perry wrote:
> That sounds like a different sort of problem then.
> 
> A deduplicator is a program that walks through a filesystem and finds
> identical files, and then hard links them together to save space.
> 
> On Tue, Nov 10, 2009 at 12:57 PM, Nick Bright  wrote:
>> Shawn Perry wrote:
>>> Did you use a disk deduplicator on the drive? Is there a directory
>>> with a lot of files in it?  How many files are you backing up?
>> Sorry, I'm not familiar with a "deduplicator".
>>
>> There aren't any directories with "a lot" of files any more than any of
>> the other systems I'm backing up.
>>
>> There are 202,984 files on the system.
>>
>>> If you have MANY hardlinks on a file system, rsync with the
>>> --hard-links option has a tendency to croak, leaving tar as the best
>>> option.
>> find / -printf "%n %i %p\n" | sort -nr
>>
>> Doesn't seem to indicate that there is an unusually large number of hard
>> links. The only stuff listed with a sizable number of hard links appears
>> to be directories that are all system stuff that would exist on all
>> servers.
>>
>> The system itself is a cPanel hosting server, and hasn't had anything
>> special done to it. Let me put it this way - I didn't do anything to
>> knowingly create "a lot" of hardlinks. I'm sure there's some, but
>> probably not an unusually high number.
>>
>>> Dirvish has this same issue.
>>>
>>> To answer your question, find a directory or a couple of them that
>>> have a lot of files.  run "ls -l" or "ls -lR" (the latter is
>>> recursive) in that directory.  Look at the output.
>>>
>>> sample:
>>>
>>> -rw-r--r-- 79 shawn users   37888 2005-12-04 14:36 X-mas list.xls
>>>
>> The first field after the permissions is the number of links to the
>>> data in that file, 79 in this case.  That means there are 79 hard
>>> links to that file.  There will always be at least one.
>> Similar to the output of my find command, which was telling me how many
>> hard links it found for each file/directory on the system. As I said,
>> nothing that seemed too unusual.
>>
>>> Shawn
>>>
>> You mentioned TAR being a better option on a system with lots of hard
>> links. I'll give that a try and see if it's able to perform a successful
>> backup.
>>
>> I will point out that I have a 2nd BackupPC server that is backing up a
>> *different* machine running the cPanel system, which has many, many,
>> many more files/domains on it; and that is successfully backing up.
>>
>> I'll also try backing the client in question up to said 2nd BackupPC
>> server and see if that works.
>>
>>>
>>> On Tue, Nov 10, 2009 at 10:05 AM, Nick Bright  
>>> wrote:
>>>>> Shawn Perry wrote:
>>>>> Does this host have a lot of hard links?
>>>>>
>>>> That's a good question that I have no idea how to answer.
>>>>
>>>>
>>> --
>>> ___
>>> BackupPC-users mailing list
>>> BackupPC-users@lists.sourceforge.net
>>> List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
>>> Wiki:http://backuppc.wiki.sourceforge.net
>>> Project: http://backuppc.sourceforge.net/
>> --
>> ___
>> BackupPC-users mailing list
>> BackupPC-users@lists.sourceforge.net
>> List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
>> Wiki:http://backuppc.wiki.sourceforge.net
>> Project: http://backuppc.sourceforge.net/
>>

Re: [BackupPC-users] Backup fails after running 8-10 hours

2009-11-10 Thread Nick Bright
Shawn Perry wrote:
> Did you use a disk deduplicator on the drive? Is there a directory
> with a lot of files in it?  How many files are you backing up?

Sorry, I'm not familiar with a "deduplicator".

There aren't any directories with "a lot" of files any more than any of 
the other systems I'm backing up.

There are 202,984 files on the system.

> 
> If you have MANY hardlinks on a file system, rsync with the
> --hard-links option has a tendency to croak, leaving tar as the best
> option.

find / -printf "%n %i %p\n" | sort -nr

Doesn't seem to indicate that there is an unusually large number of hard 
links. The only stuff listed with a sizable number of hard links appears 
to be directories that are all system stuff that would exist on all 
servers.
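
(A tighter variant of that check restricts the search to regular files 
with more than one link and stays on one filesystem - a sketch:

find / -xdev -type f -links +1 -printf "%n %i %p\n" | sort -nr | head -50

Directories always carry extra link counts from their subdirectories, so 
filtering with -type f removes most of the noise the bare command prints.)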

The system itself is a cPanel hosting server, and hasn't had anything 
special done to it. Let me put it this way - I didn't do anything to 
knowingly create "a lot" of hardlinks. I'm sure there's some, but 
probably not an unusually high number.

> 
> Dirvish has this same issue.
> 
> To answer your question, find a directory or a couple of them that
> have a lot of files.  run "ls -l" or "ls -lR" (the latter is
> recursive) in that directory.  Look at the output.
> 
> sample:
> 
> -rw-r--r-- 79 shawn users   37888 2005-12-04 14:36 X-mas list.xls
> 
> The first field after the permissions is the number of links to the
> data in that file, 79 in this case.  That means there are 79 hard
> links to that file.  There will always be at least one.

Similar to the output of my find command, which was telling me how many 
hard links it found for each file/directory on the system. As I said, 
nothing that seemed too unusual.

> 
> Shawn
> 

You mentioned TAR being a better option on a system with lots of hard 
links. I'll give that a try and see if it's able to perform a successful 
backup.
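
(For reference, switching a single host to tar is a per-PC config change. 
A sketch, assuming BackupPC 3.x - the TarClientCmd shown is my 
recollection of the stock default and usually needs no editing:

$Conf{XferMethod}   = 'tar';
$Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host'
                    . ' env LC_ALL=C $tarPath -c -v -f - -C $shareName+'
                    . ' --totals';

Full backups over tar stream the whole share, so they cost more bandwidth 
than rsync fulls, but tar sidesteps rsync's in-memory hard-link tracking.)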

I will point out that I have a 2nd BackupPC server that is backing up a 
*different* machine running the cPanel system, which has many, many, 
many more files/domains on it; and that is successfully backing up.

I'll also try backing the client in question up to said 2nd BackupPC 
server and see if that works.

> 
> 
> On Tue, Nov 10, 2009 at 10:05 AM, Nick Bright  wrote:
>>> Shawn Perry wrote:
>>> Does this host have a lot of hard links?
>>>
>> That's a good question that I have no idea how to answer.
>>
>>
> 
> --
> ___
> BackupPC-users mailing list
> BackupPC-users@lists.sourceforge.net
> List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
> Wiki:http://backuppc.wiki.sourceforge.net
> Project: http://backuppc.sourceforge.net/

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Backup fails after running 8-10 hours

2009-11-10 Thread Nick Bright
> On Mon, Nov 9, 2009 at 4:57 PM, Nick Bright  wrote:
>> Les Mikesell wrote:
>>> Nick Bright wrote:
>>>> I've got a bit of a strange situation. My backuppc server, which
>>>> successfully backs up a half dozen or so machines, is unable to back up
>>>> one particular host.
>>>>
>>>> This host is configured the same as all of my other hosts which back up
>>>> successfully, but after running the rsync process for 8 to 10 hours, the
>>>> backup fails with this entry in the XferLOG:
>>>>
>>>> 
>>>> full backup started for directory /
>>>> Running: /usr/bin/ssh -q -x -l root cpanel /usr/bin/rsync --server
>>>> --sender --numeric-ids --perms --owner --group -D --links --hard-links
>>>> --times --block-size=2048 --recursive --ignore-times . /
>>>> Xfer PIDs are now 27778
>>>> Got remote protocol 1768191091
>>>> Fatal error (bad version): stdin: is not a tty
>>> I'm not sure it if causes this symptom, but one thing to check is that
>>> the remote shell for root can't output anything (like a
>>> message-of-the-day) before starting the specified program.
>>>
>> Thank you for your reply. I checked in to it, and determined that there
>> isn't anything being output by logging in to the backuppc system and "su
>> backuppc" then "ssh r...@cpanel":
>>
>> [backu...@backuppc ~]$ ssh r...@cpanel
>> Last login: Sat Sep 26 16:08:46 2009 from backuppc
>> r...@cpanel [~]#
>>
>> However, I did find that /etc/motd existed and was empty. I deleted it
>> to see what would happen on the next backup cycle. SSHing into the
>> client system under a normal user also does not result in any MOTD's or
>> warning messages.
>>
>>  - Nick
>>
> Shawn Perry wrote:
> Does this host have a lot of hard links?
> 

That's a good question that I have no idea how to answer.


--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Backup fails after running 8-10 hours

2009-11-09 Thread Nick Bright
Les Mikesell wrote:
> Nick Bright wrote:
>> I've got a bit of a strange situation. My backuppc server, which 
>> successfully backs up a half dozen or so machines, is unable to back up
>> one particular host.
>>
>> This host is configured the same as all of my other hosts which back up
>> successfully, but after running the rsync process for 8 to 10 hours, the 
>> backup fails with this entry in the XferLOG:
>>
>> 
>> full backup started for directory /
>> Running: /usr/bin/ssh -q -x -l root cpanel /usr/bin/rsync --server 
>> --sender --numeric-ids --perms --owner --group -D --links --hard-links 
>> --times --block-size=2048 --recursive --ignore-times . /
>> Xfer PIDs are now 27778
>> Got remote protocol 1768191091
>> Fatal error (bad version): stdin: is not a tty
> 
> I'm not sure it if causes this symptom, but one thing to check is that 
> the remote shell for root can't output anything (like a 
> message-of-the-day) before starting the specified program.
> 

Thank you for your reply. I checked in to it, and determined that there 
isn't anything being output by logging in to the backuppc system and "su 
backuppc" then "ssh r...@cpanel":

[backu...@backuppc ~]$ ssh r...@cpanel
Last login: Sat Sep 26 16:08:46 2009 from backuppc
r...@cpanel [~]#

However, I did find that /etc/motd existed and was empty. I deleted it 
to see what would happen on the next backup cycle. SSHing into the 
client system under a normal user also does not result in any MOTD's or 
warning messages.

  - Nick

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] Backup fails after running 8-10 hours

2009-11-09 Thread Nick Bright
I've got a bit of a strange situation. My backuppc server, which 
successfully backs up a half dozen or so machines, is unable to back up 
one particular host.

This host is configured the same as all of my other hosts which back up 
successfully, but after running the rsync process for 8 to 10 hours, the 
backup fails with this entry in the XferLOG:


full backup started for directory /
Running: /usr/bin/ssh -q -x -l root cpanel /usr/bin/rsync --server 
--sender --numeric-ids --perms --owner --group -D --links --hard-links 
--times --block-size=2048 --recursive --ignore-times . /
Xfer PIDs are now 27778
Got remote protocol 1768191091
Fatal error (bad version): stdin: is not a tty

Sent exclude: /proc
Sent exclude: /sys
Sent exclude: /mnt/nfs
fileListReceive() failed
Done: 0 files, 0 bytes
Got fatal error during xfer (fileListReceive failed)
Backup aborted by user signal
Not saving this as a partial backup since it has fewer files than the 
prior one (got 0 and 0 files versus 0)


I have been unable to locate any entries in any log files on the client 
system indicating what the problem may be.

I'm hoping someone will have a tip on where to look or thoughts as to 
determining why the rsync is failing.

-- 
---
- Nick Bright
   Network Administrator
   Valnet Telecommunications, LLC
   Tel 888-332-1616 x 315
   Fax 620-332-1201

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Problem with $Conf{BackupFilesExclude}

2009-01-21 Thread Nick Bright
Thanks, I'll give the link a read. I was more asking about the syntax 
than the contents. It doesn't matter what's in there if the syntax 
isn't correct!

---
- Nick Bright
   Network Administrator
   Valnet, LLC
   Tel 888-332-1616 x 315
   Fax 620-332-1201


Juergen Harms wrote:
> Have a look at
> http://backuppc.wiki.sourceforge.net/Common_backup_excludes
> 
> I find the Linux excludes somewhat spartan - different utilities place 
> backup and temporary files in well-hidden places. I have just started 
> getting things together; my initial setup for my home directory 
> illustrates that there may be a need for additional items to add to the 
> examples quoted above - the KDE Trash folder is a good example. And I have 
> not yet started hunting for temporary files produced by various Unix 
> utilities.
> 
> $Conf{BackupFilesExclude} = {
>harms => [
>   '/tmp',
>   '/.local/share/Trash',
>   '*~', '#*#'
>]
> };
> 
> 
> For Windows, I do not know enough to point out useless stuff. My initial 
> exclude lists are simply dictated by what I find in the error log - 
> files that BackupPC could not access because Skype, Thunderbird, etc. 
> are using them. I like to have 0 errors unless a backup really requires 
> attention. Starting from the example at the URL quoted above and adding 
> files I found in the error log, I added quite a few Skype files, and had 
> to deal with the fact that different Mozilla versions place the same kind 
> of file at /Application Data and at /Local Settings/Application Data; I 
> simply added pairs of items to catch both.
> 
> --
> ___
> BackupPC-users mailing list
> BackupPC-users@lists.sourceforge.net
> List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
> Wiki:http://backuppc.wiki.sourceforge.net
> Project: http://backuppc.sourceforge.net/

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Problem with $Conf{BackupFilesExclude}

2009-01-20 Thread Nick Bright
So, following that template I'd like to ask the list if I'm doing this 
right:

Linux Host:

$Conf{BackupFilesExclude} = {
   '/' => [
'/usr/local/vpopmail/domains',
'/usr/local/MPP/working',
'/proc',
'/var/qmail/queue',
'/backup'
   ]
};

Windows Host (using rsync configured to publish 'cDrive'):

$Conf{BackupFilesExclude} = {
   'cDrive' => [
      '/pagefile.sys',
      '/Documents and Settings/',
      '/WINNT/system32/config/',
      '/WINNT/system32/drivers/'
   ]
};
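
(Side note for the archives: if the same excludes should apply to every 
share on a host, BackupPC 3.x also accepts '*' as the share name, which 
saves repeating the list - a sketch:

$Conf{BackupFilesExclude} = {
   '*' => [
      '/proc',
      '/sys',
      '/tmp'
   ]
};

That applies one exclude list to every share the host defines.)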


---
- Nick Bright
   Network Administrator
   Valnet, LLC
   Tel 888-332-1616 x 315
   Fax 620-332-1201


Nick Bright wrote:
> That explains things! Seems like maybe the web interface needs a bit of 
> improvement in this regard.
> 
> In the web interface (which is how I added the exclusions), it simply 
> has a box for you to put a pathname to exclude and push "Add".
> 
> ---
> - Nick Bright
>Network Administrator
>Valnet, LLC
>Tel 888-332-1616 x 315
>Fax 620-332-1201
> 
> 
> Bowie Bailey wrote:
>> Nick Bright wrote:
>>> In my /etc/BackupPC/pc/host.pl file, I have the
>>> $Conf{BackupFilesExclude} directive configured as:
>>>
>>> $Conf{BackupFilesExclude} = {
>>>'/usr/local/vpopmail/domains' => [
>>>  ''
>>>],
>>>'/usr/local/MPP/working' => [
>>>  ''
>>>],
>>>'/proc' => [
>>>  ''
>>>],
>>>'/var/qmail/queue' => [
>>>  ''
>>>],
>>>'/backup' => [
>>>  ''
>>>]
>>> };
>>>
>>> This was configured through the web interface, on BackupPC v3.1.0.
>>> Backup method is "rsync".
>>>
>>> The backup runs, but it disregards my exclude directives and backs up
>>> the directories I'm telling it not to back up. Unfortunately, because
>>> of the first exclude line not being excluded, this is making the
>>> backup take about 20 hours to run!
>>>
>>> Any thoughts as to why these directories aren't being excluded?
>> Because you have not told it to exclude anything.  It works like this:
>>
>> $Conf{BackupFilesExclude} = {
>>'ShareName' => [
>>  'Exclusion'
>>],
>> };
>>
>> So you have a list of (probably invalid) share names, none of which have
>> any exclusions.  You probably want to do something like this:
>>
>> $Conf{BackupFilesExclude} = {
>>'/' => [
>>'/usr/local/vpopmail/domains',
>>'/usr/local/MPP/working',
>>'/proc',
>>'/var/qmail/queue',
>>'/backup',
>>]
>> };
>>
>> Assuming that you are backing up the root directory.
>>
> 
> --
> ___
> BackupPC-users mailing list
> BackupPC-users@lists.sourceforge.net
> List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
> Wiki:http://backuppc.wiki.sourceforge.net
> Project: http://backuppc.sourceforge.net/

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Problem with $Conf{BackupFilesExclude}

2009-01-19 Thread Nick Bright
That explains things! Seems like maybe the web interface needs a bit of 
improvement in this regard.

In the web interface (which is how I added the exclusions), it simply 
has a box for you to put a pathname to exclude and push "Add".

---
- Nick Bright
   Network Administrator
   Valnet, LLC
   Tel 888-332-1616 x 315
   Fax 620-332-1201


Bowie Bailey wrote:
> Nick Bright wrote:
>> In my /etc/BackupPC/pc/host.pl file, I have the
>> $Conf{BackupFilesExclude} directive configured as:
>>
>> $Conf{BackupFilesExclude} = {
>>'/usr/local/vpopmail/domains' => [
>>  ''
>>],
>>'/usr/local/MPP/working' => [
>>  ''
>>],
>>'/proc' => [
>>  ''
>>],
>>'/var/qmail/queue' => [
>>  ''
>>],
>>'/backup' => [
>>  ''
>>]
>> };
>>
>> This was configured through the web interface, on BackupPC v3.1.0.
>> Backup method is "rsync".
>>
>> The backup runs, but it disregards my exclude directives and backs up
>> the directories I'm telling it not to back up. Unfortunately, because
>> of the first exclude line not being excluded, this is making the
>> backup take about 20 hours to run!
>>
>> Any thoughts as to why these directories aren't being excluded?
> 
> Because you have not told it to exclude anything.  It works like this:
> 
> $Conf{BackupFilesExclude} = {
>'ShareName' => [
>  'Exclusion'
>],
> };
> 
> So you have a list of (probably invalid) share names, none of which have
> any exclusions.  You probably want to do something like this:
> 
> $Conf{BackupFilesExclude} = {
>'/' => [
>'/usr/local/vpopmail/domains',
>'/usr/local/MPP/working',
>'/proc',
>'/var/qmail/queue',
>'/backup',
>]
> };
> 
> Assuming that you are backing up the root directory.
> 

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] Problem with $Conf{BackupFilesExclude}

2009-01-19 Thread Nick Bright
In my /etc/BackupPC/pc/host.pl file, I have the 
$Conf{BackupFilesExclude} directive configured as:

$Conf{BackupFilesExclude} = {
   '/usr/local/vpopmail/domains' => [
 ''
   ],
   '/usr/local/MPP/working' => [
 ''
   ],
   '/proc' => [
 ''
   ],
   '/var/qmail/queue' => [
 ''
   ],
   '/backup' => [
 ''
   ]
};

This was configured through the web interface, on BackupPC v3.1.0. 
Backup method is "rsync".

The backup runs, but it disregards my exclude directives and backs up 
the directories I'm telling it not to back up. Unfortunately, because of 
the first exclude line not being excluded, this is making the backup 
take about 20 hours to run!

Any thoughts as to why these directories aren't being excluded?

-- 
---
- Nick Bright
   Network Administrator
   Valnet, LLC
   Tel 888-332-1616 x 315
   Fax 620-332-1201

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/