Rsync passing over atimes

2001-01-25 Thread Michael James
There was some discussion about rsync preserving atimes back in August last year, which made the point that you have to choose between ctime and atime, and that ctime is often more important. But I have a situation where the behaviour of the --backup and --checksum options is causing rsync to read f
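For reference, a minimal sketch of the option combination under discussion; the paths and host name are invented:

  # --checksum makes rsync read every file in full on both ends to
  # compare checksums (touching atimes even on unchanged files), and
  # --backup renames replaced destination files before writing.
  rsync -a --checksum --backup --suffix=.bak /src/ host:/dest/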

RE: The "out of memory" problem with large numbers of files

2001-01-25 Thread David Bolen
I previously wrote: > Well, as with any dynamic system, I'm not sure there's a totally > simple answer to the overall allocation, as the tree structure created Oops, this slipped through editing - as I wrote up the rest of the note I didn't actually find a tree structure (I earlier thought there

RE: The "out of memory" problem with large numbers of files

2001-01-25 Thread David Bolen
Dave Dykstra [[EMAIL PROTECTED]] writes: > No, that behavior should be identical with the --include-from/exclude '*' > approach; I don't believe rsync uses any memory for excluded files. Actually, I think there's an exclude_struct allocated somewhere per file (looks like 28 bytes or so), but the
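The --include-from/exclude '*' approach referred to here, as a minimal sketch (file names invented):

  # wanted.txt holds one include pattern per line; anything not
  # matched is dropped by the trailing --exclude='*'
  printf '%s\n' 'dirA/' 'dirA/file1' > wanted.txt
  rsync -av --include-from=wanted.txt --exclude='*' src/ dst/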

Re: Write error

2001-01-25 Thread Dave Dykstra
On Thu, Jan 25, 2001 at 06:42:33PM +0100, [EMAIL PROTECTED] wrote: > > On 25.01.2001 17:59:53 Dave Dykstra wrote: > > > On Wed, Jan 24, 2001 at 01:21:14PM +0100, [EMAIL PROTECTED] > wrote: > > > Hi! > > > When I try to sync the content of two directories I receive several > > > different errors

RE: The "out of memory" problem with large numbers of files

2001-01-25 Thread David Bolen
Lenny Foner [[EMAIL PROTECTED]] writes: > While we're discussing memory issues, could someone provide a simple > answer to the following three questions? Well, as with any dynamic system, I'm not sure there's a totally simple answer to the overall allocation, as the tree structure created on the

Re: The "out of memory" problem with large numbers of files

2001-01-25 Thread Dave Dykstra
On Thu, Jan 25, 2001 at 11:47:32AM -0500, Lenny Foner wrote: > While we're discussing memory issues, could someone provide a simple > answer to the following three questions? > (a) How much memory, in bytes/file, does rsync allocate? Andrew Tridgell said 10-14 bytes per file in http://lists.
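Back-of-envelope arithmetic with the figures in this digest: the 10-14 bytes/file estimate quoted here versus the ~20M allocation reported further down for 152,221 files:

  echo $((152221 * 14))                # ~2.1 MB at the quoted estimate
  echo $((20 * 1024 * 1024 / 152221))  # ~137 bytes/file actually observed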

Re: are redhat updates rsyncable

2001-01-25 Thread Alberto Accomazzi
In message <[EMAIL PROTECTED]>, Harry Putnam writes: > Sorry to reprint this request for information but I guess I want more > handholding here. > > [...] > > > > > "Michael H. Warfield" <[EMAIL PROTECTED]> writes: > > > > > rsync ftp.wtfo.com:: > > > > Getting this far ... works as adver

Re: Write error

2001-01-25 Thread Georg . Bischof
On 25.01.2001 17:59:53 Dave Dykstra wrote: > On Wed, Jan 24, 2001 at 01:21:14PM +0100, [EMAIL PROTECTED] wrote: > > Hi! > > When I try to sync the content of two directories I receive several > > different errors on recurrent attempts: > > > > rsync -al -e ssh --delete -v . user@x:/home/ww

Re: How to exclude binary executables?

2001-01-25 Thread Kurt Seel
Remko Scharroo wrote: First of all, I love rsync. After using mirror and rdist, rsync really does it well and fast! I second this; it's a gem that should be in every sysadmin's toolbox. What it has allowed me to accomplish here is nothing short of remarkable! But there is one feature I miss

Re: rsync exits with 'remsh' error from script

2001-01-25 Thread Dave Dykstra
On Wed, Jan 24, 2001 at 02:24:37PM -0600, Denmark B. Weatherburn wrote: > Hi Listers, > > I hope this posting qualifies for your acceptance. > I'm working on a Korn shell script that uses rsync to synchronize several Sun > hosts running Solaris 2.7. > Below is the error message that I get. I'm not
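rsync runs whatever remote shell was compiled in (rsh, or remsh on some platforms) unless told otherwise, so a script can pin it down explicitly. A hedged ksh fragment; the host names and ssh path are invented:

  #!/bin/ksh
  # Force a known-good remote shell instead of the default remsh/rsh
  RSYNC_RSH=/usr/local/bin/ssh; export RSYNC_RSH
  for host in sun1 sun2 sun3; do
      rsync -a /export/data/ "$host:/export/data/"
  done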

Re: are redhat updates rsyncable

2001-01-25 Thread Harry Putnam
[EMAIL PROTECTED] writes: > > > > > > Anyone here know if redhat linux updates can be rsynced? > > > > If so, is it necessary to have rsh installed. > > > > I guess what I really need is to see the commands necessary to connect > > to a redhat `updates' ftp site with rsync. If it is even pos

Re: are redhat updates rsyncable

2001-01-25 Thread Harry Putnam
Sorry to reprint this request for information but I guess I want more handholding here. [...] > > "Michael H. Warfield" <[EMAIL PROTECTED]> writes: > > > rsync ftp.wtfo.com:: > Getting this far ... works as advertised. But beyond that point, how to actually get to the files and collect t
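Picking up from "rsync ftp.wtfo.com::": the double colon lists the server's modules, and a module name then prefixes the remote path. A sketch, with the module name "redhat" invented for illustration:

  rsync ftp.wtfo.com::                        # list available modules
  rsync ftp.wtfo.com::redhat/                 # list files in a module
  rsync -av ftp.wtfo.com::redhat/updates/ ./updates/   # pull a subtree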

Re: Solaris 8 rsync problem

2001-01-25 Thread Dave Dykstra
On Wed, Jan 24, 2001 at 08:30:50AM -0800, Adam Wilson wrote: > > Hello, I am seeing the following problems when trying > to perform rsync between a Sun running Solaris 8 and a > Redhat Linux box. rsh is already set up to allow > remote logins, file copies, etc. > > > [cable@galadriel]{1348}% r

The "out of memory" problem with large numbers of files

2001-01-25 Thread Lenny Foner
While we're discussing memory issues, could someone provide a simple answer to the following three questions? (a) How much memory, in bytes/file, does rsync allocate? (b) Is this the same for the rsyncs on both ends, or is there some asymmetry there? (c) Does it matter whether pushing or pulli

Re: rsync 2.4.6 hangs in waitid() on Solaris 2.6 system

2001-01-25 Thread John Stoffel
Dave> Try using "-W" to disable the rsync rolling checksum algorithm Dave> when copying between two NFS mounts, because that causes extra Dave> NFS traffic. Rsync's algorithm is optimized for minimizing Dave> network traffic between its two halves at the expense of extra Dave> local access and i
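A minimal sketch of the advice above, with the mount points invented:

  # -W (--whole-file) skips the rolling-checksum delta step, which
  # would otherwise read the existing destination copy back over NFS
  rsync -avW /nfs/hostA/data/ /nfs/hostB/data/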

RE: The "out of memory" problem with large numbers of files

2001-01-25 Thread John Stoffel
Martin> Outstanding! Thank you very much, Dave. Martin> I'll discuss the upgrade with my boss tomorrow and give it a Martin> try then. Getting back to you guys afterwards to tell you how Martin> it went. I suspect that you'll still run into this problem with waitid() no matter what, since it s

2.4.6 hang -- strace info

2001-01-25 Thread Eric Whiting
I have a setup that is showing very predictable rsync hangs. SETUP - System: Linux 2.4.0+, rsync 2.4.6, 1 GHz T-bird (no OC), 256M, 45G IDE, 10G IDE. Rsyncing an old 3.6G vfat partition from the 10G disk to the 45G disk. The sync runs great until after it finishes a certain file. Same file ev
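One way to capture this kind of hang as it happens (the PID is invented):

  ps ax | grep rsync       # find the stuck rsync processes
  strace -f -tt -p 1234    # attach and watch which syscall it blocks in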

RE: The "out of memory" problem with large numbers of files

2001-01-25 Thread Schmitt, Martin
> Ah, rsync 2.4.1 had definite known problems with SSH > transport. Upgrade > to 2.4.6 and your problem should be solved. It's very easy to compile > from source. Get it from rsync.samba.org and run configure and make. > You can also use the solaris binary that I maintain on that web site; > I
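The build being recommended, spelled out (standard autoconf procedure; the tarball name assumes the 2.4.6 release downloaded from rsync.samba.org):

  gzip -dc rsync-2.4.6.tar.gz | tar xf -
  cd rsync-2.4.6
  ./configure
  make
  make install             # as root, or configure with --prefix=$HOME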

Re: The "out of memory" problem with large numbers of files

2001-01-25 Thread Dave Dykstra
On Thu, Jan 25, 2001 at 04:35:11PM +0100, Schmitt, Martin wrote: > Dave, > > thanks for your reply. > > > No, it's not an out of memory problem but it is like one of > > the numerous > > different kinds of hangs that people experience. Are you > > copying between > > two places on the same sy

RE: The "out of memory" problem with large numbers of files

2001-01-25 Thread Schmitt, Martin
Dave, thanks for your reply. > No, it's not an out of memory problem but it is like one of > the numerous > different kinds of hangs that people experience. Are you > copying between > two places on the same system or are you copying to another > system? What > kind of network transport is

Re: How to exclude binary executables?

2001-01-25 Thread Dave Dykstra
On Thu, Jan 25, 2001 at 04:08:56PM +0100, Remko Scharroo wrote: > First of all, I love rsync. After using mirror and rdist, rsync really > does it well and fast! > > But there is one feature I miss in rsync that rdist has: there seems to > be no way to exclude binary executables from being copied
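rsync matches names and patterns rather than file contents, so one workaround is to build the exclude list externally. A sketch that uses the executable bit as a stand-in for "binary executable" (an assumption, not an rsync feature):

  # list executables under src/, anchor each pattern at the transfer root
  find src -type f -perm -100 | sed 's|^src/|/|' > binaries.txt
  rsync -av --exclude-from=binaries.txt src/ dest/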

RE: The "out of memory" problem with large numbers of files

2001-01-25 Thread Schmitt, Martin
John, thanks for your reply. > Welcome to the club! Is this filesystem local or NFS mounted? And > how are you sending the data to another filesystem? Also, which > version of rsync are you using? I don't know if this is the correct wording, since I'm a clueless Solaris newbie, but judging

Re: The "out of memory" problem with large numbers of files

2001-01-25 Thread Dave Dykstra
On Thu, Jan 25, 2001 at 09:21:27AM -0500, John Stoffel wrote: > > Martin> I'm dealing with a big and ugly filesystem that looks like this: > > Martin> $ du -sk . > Martin> 1526500 . > Martin> $ find . -depth -print | wc -l > Martin> 152221 > > Welcome to the club! Is this filesystem local or

How to exclude binary executables?

2001-01-25 Thread Remko Scharroo
First of all, I love rsync. After using mirror and rdist, rsync really does it well and fast! But there is one feature I miss in rsync that rdist has: there seems to be no way to exclude binary executables from being copied. Of course, if you know the file names you can, but if you don't you can'

Re: The "out of memory" problem with large numbers of files

2001-01-25 Thread John Stoffel
Martin> I'm dealing with a big and ugly filesystem that looks like this: Martin> $ du -sk . Martin> 1526500 . Martin> $ find . -depth -print | wc -l Martin> 152221 Welcome to the club! Is this filesystem local or NFS mounted? And how are you sending the data to another filesystem? Also, w

The "out of memory" problem with large numbers of files

2001-01-25 Thread Schmitt, Martin
Hello everyone! I'm dealing with a big and ugly filesystem that looks like this: $ du -sk . 1526500 . $ find . -depth -print | wc -l 152221 rsync seems to run into some 20M limit on this Slowaris 2.6 machine. CPU usage goes down to zero, 20M memory allocation, no activity from rsync. This lo
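A flat ceiling like that is worth checking against per-process resource limits before blaming rsync itself; a quick check (the -d option depends on your shell):

  ulimit -a                # show current data/stack/address-space limits
  ulimit -d unlimited      # raise the data segment limit, if permitted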