> I am trying to search a big dir (4299 files), each file name is very
> long like 
> 1076206852.M830570P10873V0000000000003A01I00000033_2.www.rout.co.nz,S=337:2,S
> 
> (it's a maildir)

Sure, it *had* to be a maildir (you basically have to use a script to
search through them).

It's also easy to reach the limit when using long paths, like
*/*/*/*/dir-with-many-files.
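
Typically it ends like this (made-up maildir path, but the error message
is the real one the shell prints when exec fails with E2BIG):

  $ grep -l 'pattern' ~/Maildir/cur/*
  bash: /usr/bin/grep: Argument list too long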

As pointed out, the only real solution is a find | xargs construction.
What nobody pointed out is that these commands are inherently unsafe: in
typical unix (posix?) we'll-live-in-the-sixties-forever fashion, xargs
chops its input at any whitespace, *NOT* just at newlines. Any useful
solution (IMHO) must also not barf on any character in filenames,
including such exciting things as ampersands, spaces, newlines, etc. GNU
find/xargs can be switched to use NULs as terminators (but then it's no
longer portable). A portable solution is a major PITA. Needing a
solution to this many years ago, I wrote a script fxargs which does this
at least on Solaris and Linux. Saves a lot of typing.
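
For illustration, a minimal sketch of the GNU variant (assuming GNU
find and xargs; the path and pattern are placeholders):

  find ~/Maildir -type f -print0 | xargs -0 grep -l 'some pattern'

-print0 terminates each filename with a NUL byte, the one character that
cannot appear in a path, and xargs -0 splits its input only at NULs, so
spaces, newlines and ampersands all pass through intact.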

Note that all this find|xargs business comes with a limitation: xargs
appends the batch of filenames to the end of the command line, so there
can be no further arguments following the list of files. The obvious
commands which fail this condition are cp and rsync, which both want the
destination last - if you need to copy files, you're SOOL (you have to
copy directories instead).
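
That said, if you're on GNU userland anyway (see above re portability),
there are a couple of escape hatches; the paths here are placeholders:

  # GNU cp takes the destination up front with -t, so xargs can
  # append the files at the end as usual:
  find ~/Maildir -type f -print0 | xargs -0 cp -t /some/backup/

  # rsync can read a NUL-separated file list on stdin:
  find . -type f -print0 | rsync -a --from0 --files-from=- . /some/backup/

The cp -t trick keeps xargs' batching; --files-from sidesteps the
command line entirely, which is the cleaner fix for rsync.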

> know whether this is to do with the number of arguments, or the total
> number of characters in the argument list?

There is a system-imposed limit on both: ARG_MAX caps the combined size
of the argument list and environment passed to exec(). Recompiling does
not solve the problem, it only postpones it.
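
You can ask the system for its value (getconf is POSIX; the figure in
the comment is just what older Linux kernels commonly report):

  getconf ARG_MAX    # e.g. 131072 on older Linux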

Volker

-- 
Volker Kuhlmann                 is possibly list0570 with the domain in header
http://volker.dnsalias.net/             Please do not CC list postings to me.
