On Thursday, 09 August 2007, Alex Zbyslaw wrote:
> Bram Schoenmakers wrote:
> ># /sbin/dump -0uan -L -h 0 -f - / | /usr/bin/bzip2 | \
> >  /usr/bin/ssh [EMAIL PROTECTED] dd of=/backup/webserver/root.0.bz2
> bzip2 is darned slow and not always much better than gzip -9. It might
> be that ssh is just timing out in some way (I've seen that but not with
> ethernet dumps specifically). Can you try the test using gzip -9
> instead of bzip2? If that works, then look for ssh options that affect
> timeouts, keepalives, etc. In particular, setting ServerAliveInterval 60 in
> .ssh/config stopped xterm sessions to certain hosts from dying on me. YMMV :-(
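For reference, the keepalive suggestion above would look like this in ~/.ssh/config (the Host pattern and count are placeholders; see ssh_config(5)):

```
# Send an application-level keepalive every 60 seconds so long-running
# sessions are not silently dropped; give up after 3 unanswered probes.
Host office.example.com
    ServerAliveInterval 60
    ServerAliveCountMax 3
```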
> If you have the disk space then you could try without any compression at
> all; or try doing the compression remotely:
> /sbin/dump -0 -a -C 64 -L -h 0 -f - / | \
> /usr/local/bin/ssh [EMAIL PROTECTED]
> "gzip -9 > /backup/webserver/root.0.gz"
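Before rerunning the full dump, a quick local sanity check (no network involved) can confirm the gzip leg of the pipe round-trips cleanly; the sizes here are arbitrary:

```shell
# Pipe 4 MB of zeroes through gzip -9 and back, then count the bytes
# that survive the round trip; the count should match the input size.
dd if=/dev/zero bs=1024k count=4 2>/dev/null | gzip -9 | gzip -d | wc -c
# should print 4194304
```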
> Nikos Vassiliadis wrote:
> >1) Can you dump the file locally?
> >2) Is scp working?
> If you can write (and compress if short of disk space) the dump locally and
> try an scp to your remote host as Nikos is suggesting, that will narrow
> down the problem a bit. Any other large file will do: it doesn't have to be a dump.
As I wrote in my initial mail:
* Downloading the very same big file over SCP causes problems too; below is some
SCP debug output. The connection drops quickly after it has gained a reasonable
speed:

Read from remote host office.example.com: Connection reset by peer
debug1: Transferred: stdin 0, stdout 0, stderr 77 bytes in 103.3
debug1: Bytes per second: stdin 0.0, stdout 0.0, stderr 0.7
debug1: Exit status -1
That was just a file generated with 'dd if=/dev/zero of=zeroes bs=1024k
count=200' . So no, SCP doesn't work.
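A minimal way to reproduce this without involving dump at all, assuming the account and destination path are substituted for the real ones:

```shell
# Recreate the 200 MB all-zeroes test file exactly as in the thread.
dd if=/dev/zero of=zeroes bs=1024k count=200 2>/dev/null
# Then retry the transfer verbosely, with keepalives enabled, to see
# exactly where it dies (user and destination are placeholders):
#   scp -v -o ServerAliveInterval=60 zeroes user@office.example.com:/tmp/
```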
I haven't tried gzip -9 yet, although that looks more like a workaround than a
solution to the real problem.
You can contact me directly on Jabber with [EMAIL PROTECTED]
email@example.com mailing list
To unsubscribe, send any mail to "[EMAIL PROTECTED]"