scott comer <[EMAIL PROTECTED]> [2002-11-08 11:58:43 -0600]:
> generally, this might be a shell problem rather than an rm problem,
> but here it is:
Actually it is an operating system limitation.  It is not a problem
with bash, with rm, or with any other command.

> using bash on some recent version of cygwin build:
> ufind -name \*.class > x
> rm `cat x`
>
> this will fail if x is too large.

You can see why by doing:

  echo `cat x`

That will show you the command line that rm is getting.  Except that
the echo will also fail, since you are out of argument space.  :-)

> there is absolutely no reason for it to fail. :-)

Laughing at this one.  There is an old saying: spit in one hand and
wish in the other, and see which one fills up faster.

> maybe if x was, say, 50 meg or something.  but 49k is not
> unreasonable.

Actually, in the old days the limit was 20k.  That is all the
standards require of it, and they say nothing more about it.  Today
most operating systems make the limit either large or unlimited.

You mentioned cygwin, and I know nothing about cygwin.  But on a Unix
OS I would do this to see the limit:

  getconf ARG_MAX

On my Linux machine that is 131072, and on my HP-UX machine it is
2048000.

> short of fixing bash,

This is not a bash problem.  Please see this reference for details;
look for "Argument list too long":

  http://www.gnu.org/software/fileutils/doc/faq/

> perhaps a parameter to rm which says the following file is a list of
> files to be removed:
> rm -l x
>
> (that's dash-ell).
>
> this would be more efficient, anyway.  what i do now is horribly
> inefficient:
> ufind -name \*.class -exec rm \{} \;

[You shouldn't need to quote the {} unless you have a really strange
shell.]

But for a very efficient way to do this, use find plus xargs.  This
works efficiently for any filename:

  find ./where/ever -name '*.class' -print0 | xargs -0 rm

Bob
_______________________________________________
Bug-fileutils mailing list
[EMAIL PROTECTED]
http://mail.gnu.org/mailman/listinfo/bug-fileutils
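The find + xargs advice above can be seen end to end with a small demo.
This is a sketch, not part of the original message: the temporary
directory, the file count, and the filenames with spaces are all made
up for illustration, but the pipeline is the same one quoted above.

```shell
#!/bin/sh
# Demo: remove many files without hitting the ARG_MAX limit,
# even when filenames contain spaces.
demo=$(mktemp -d)

# Create 100 files whose names contain spaces -- the case that breaks
# a plain `rm `cat x`` even when you are under the argument limit.
i=1
while [ "$i" -le 100 ]; do
    : > "$demo/class file $i.class"
    i=$((i + 1))
done

# -print0 emits NUL-terminated names and -0 reads them back, so
# whitespace in filenames is handled safely; xargs also splits the
# list into command lines that each fit under the system's ARG_MAX.
find "$demo" -name '*.class' -print0 | xargs -0 rm

# Count what is left; tr strips the padding some wc implementations add.
remaining=$(find "$demo" -name '*.class' | wc -l | tr -d ' ')
echo "remaining: $remaining"
rmdir "$demo"
```

Because xargs batches the arguments itself, the same pipeline works
unchanged whether the file list is 49k or 50 meg.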