Nick Bright wrote on 2009-11-09 17:57:11 -0600 [Re: [BackupPC-users] Backup
fails after running 8-10 hours]:
> Les Mikesell wrote:
> > Nick Bright wrote:
> > > Got remote protocol 1768191091
> > > Fatal error (bad version): stdin: is not a tty

Note the error message here: "stdin: is not a tty".
[...]

Thank you for your reply.
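The nonsense protocol number and the tty message are the same symptom: the rsync client reads the first four bytes arriving from the remote command as a little-endian protocol version, so any stray shell output (here, the remote login printing "stdin: is not a tty") gets misread as the version. A minimal sketch, assuming only that byte order, shows the logged number is literally the first four characters of that message:

```python
import struct

# The remote shell printed "stdin: is not a tty" before rsync started,
# so the rsync client read those bytes as its peer's protocol version.
noise = b"stdin: is not a tty\n"

# Interpret the first 4 bytes as a little-endian unsigned 32-bit integer,
# the way rsync encodes integers on the wire.
(version,) = struct.unpack("<I", noise[:4])

print(version)    # 1768191091 -- the "remote protocol" from the log
print(noise[:4])  # b'stdi'
```

The usual culprit is a dot-file (e.g. a .bashrc) on the client that writes output during non-interactive ssh logins; once nothing is printed before rsync starts, the handshake sees real version bytes again.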
Hi,

Nick Bright wrote on 2009-11-09 17:57:11 -0600 [Re: [BackupPC-users] Backup
fails after running 8-10 hours]:
> Les Mikesell wrote:
> > Nick Bright wrote:
> > > [...]
> > > full backup started for directory /
> > > Running: /usr/bin [...]
Holger Parplies wrote:
> [...]
I didn't try it with rsyncd.
- Nick
Shawn Perry wrote:
> Does it work with rsyncd?
>
> On Tue, Nov 10, 2009 at 2:28 PM, Nick Bright nick.bri...@valnet.net wrote:
> > The backup successfully completed with the tar method.
> >
> > Shawn Perry wrote:
> > > That sounds like a different sort of problem then.
> > > A [...]
On Tuesday 10 November 2009 19:57:11 Nick Bright wrote:
> The system itself is a cPanel hosting server, and hasn't had anything
> special done to it. Let me put it this way - I didn't do anything to
> knowingly create a lot of hardlinks. I'm sure there are some, but
> probably not an unusually high [...]
>
> > Did you use a disk deduplicator on the drive? Is there a directory
> > with a lot of files in it? How many files are you backing up?

If you have MANY hardlinks on a file system, rsync with the
--hard-links option has a tendency to croak, leaving tar as the best
option. Dirvish has this same issue.
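Before blaming --hard-links, it can help to measure how many multiply-linked files a tree actually has. This is a hypothetical helper, not a command from the thread; `count_hardlinked` and its behavior are my own sketch, using only the link count that stat(2) reports:

```python
import os
import stat

def count_hardlinked(root):
    """Count regular files under `root` with more than one hard link,
    counting each inode only once (a file linked N times counts as 1)."""
    seen = set()
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            try:
                st = os.lstat(os.path.join(dirpath, name))
            except OSError:
                continue  # unreadable, or vanished mid-walk
            if stat.S_ISREG(st.st_mode) and st.st_nlink > 1:
                key = (st.st_dev, st.st_ino)  # dedupe by inode
                if key not in seen:
                    seen.add(key)
                    total += 1
    return total
```

A host where this number runs into the millions is exactly the case where rsync's hard-link tracking eats memory and tar starts looking attractive.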
Shawn Perry wrote:
> Did you use a disk deduplicator on the drive? Is there a directory
> with a lot of files in it? How many files are you backing up?

Sorry, I'm not familiar with a deduplicator.

There aren't any directories with a lot of files any more than any of
the other systems I'm backing up.
That sounds like a different sort of problem then.

A deduplicator is a program that walks through a filesystem and finds
identical files, and then hard links them together to save space.
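That description can be sketched in a few lines. This is an illustration only, not any particular tool from the thread; real deduplicators verify candidates byte-for-byte rather than trusting a hash, and stay within one filesystem (hard links cannot cross devices):

```python
import hashlib
import os

def dedupe(root):
    """Minimal deduplicator sketch: hash every regular file under `root`
    and replace later duplicates with hard links to the first copy seen."""
    first_by_hash = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if not os.path.isfile(path) or os.path.islink(path):
                continue
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in first_by_hash:
                os.remove(path)                       # drop the duplicate...
                os.link(first_by_hash[digest], path)  # ...and hard-link it
            else:
                first_by_hash[digest] = path
```

Run over a filesystem with many identical files, this is precisely how a host ends up with the forest of hard links that trips rsync's --hard-links tracking.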
On Tue, Nov 10, 2009 at 12:57 PM, Nick Bright nick.bri...@valnet.net wrote:
> Shawn Perry wrote:
> > Did you use [...]
I've got a bit of a strange situation. My backuppc server, which
successfully backs up a half dozen or so machines, is unable to back up
one particular host.

This host is configured the same as all of my other hosts which back up
successfully, but after running the rsync process for 8 to 10 hours [...]
Does this host have a lot of hard links?

On Mon, Nov 9, 2009 at 4:57 PM, Nick Bright nick.bri...@valnet.net wrote:
> Les Mikesell wrote:
> > Nick Bright wrote:
> > > I've got a bit of a strange situation. My backuppc server, which
> > > successfully backs up a half dozen or so machines, is unable to
> > > back up one [...]