Sten,

> > Thanks for the report, but I suspect that's been fixed.
> >
> >   [4.0s] 2000-05-20
> >   * rm no longer segfaults on certain very deep hierarchies
> >
> > Which version of rm are you using?
> I don't know but, here is what I get:

But you did know.  You said:

> # rm --version
> rm (GNU fileutils) 4.0.36

The version of rm you are using is 4.0.36 from the GNU fileutils
package.  That happens to be later than 4.0s (equivalent to 4.0.19 in
the new numbering), the release in which the fix was reported.  It is
possible that a glibc problem is occurring on your system.

> [Now we create the dirs]
> # seq 1 90000 |\
> > (while read nr; do mkdir z ; sync ; cd z ; echo -n "${nr}."; done)
> [NOTE: a lot of complaints from bash et al., totally breaks down at 19606]
>
> [And now we try to remove them]
> # rm -rf z
> Segmentation fault (core dumped)

An excellent test case!  Thank you very much for including it.

When I try it on linux-2.2.18 with glibc-2.1.92 I get the following
while creating the directories:

  cd: could not get current directory: getcwd: cannot access parent directories: Numerical result out of range

  find . -print | tail -n 1 | wc -c
  4202

On hpux 10.20 I get something similar, but at a different length, and
there the message seems more appropriate and correct:

  cd: could not get current directory: getcwd: cannot access parent directories: File name too long

  find . -print | tail -n 1 | wc -c
  1038

I believe I am running into the filesystem's maximum path length in
both cases; the loop simply cannot make the path any deeper at that
point.

But using rm version 4.1 I cannot produce any failures removing that
deep hierarchy on either system.

Bob

_______________________________________________
Bug-fileutils mailing list
[EMAIL PROTECTED]
http://mail.gnu.org/mailman/listinfo/bug-fileutils
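For anyone reproducing this, a minimal sketch of how one might confirm
which limit the creation loop runs into.  It assumes a POSIX getconf(1)
is available on the system; the awk pipeline is only an alternative to
the tail/wc measurement above, and none of it is taken from the
original report.

  # Ask for the maximum absolute path length that applies to the
  # filesystem under the current directory; getcwd starts failing once
  # the created hierarchy approaches this value (4096 on typical Linux
  # systems, and a much smaller value on HP-UX 10.20).
  getconf PATH_MAX .

  # Measure the longest path actually created by the mkdir/cd loop.
  find . -print | awk '{ if (length($0) > max) max = length($0) } END { print max }'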