[EMAIL PROTECTED] a]$ i=0
[EMAIL PROTECTED] a]$ while ((i<1000)); do touch $i; ((i++)); done
[EMAIL PROTECTED] a]$ time find . -type f -exec rm -f {} \;
real 0m0.949s
user 0m0.221s
sys 0m0.725s
[EMAIL PROTECTED] a]$ i=0
[EMAIL PROTECTED] a]$ while ((i<1000)); do touch $i; ((i++)); done
[EMAIL PROTECTED] a]$ time find . -type f | xargs rm -f
real 0m0.036s
user 0m0.003s
sys 0m0.033s
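For comparison, POSIX find can do the same batching itself: terminating `-exec` with `+` instead of `\;` packs many filenames into each rm invocation, much like xargs. A quick sketch (the mktemp scratch directory is just illustrative setup, not part of the original session):

```shell
# Recreate the 1000-file setup in a scratch directory,
# then delete everything with a single batched rm via find's "+" terminator.
cd "$(mktemp -d)"
i=0
while ((i<1000)); do touch $i; ((i++)); done
time find . -type f -exec rm -f {} +
```

This should land in the same ballpark as the xargs timing above, since both spawn only a handful of rm processes instead of a thousand.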
Forking and exec'ing isn't free.
[EMAIL PROTECTED] a]$ time ruby -e '1000.times { system("/bin/date >/dev/null") }'
real 0m2.660s
user 0m0.956s
sys 0m1.676s
[EMAIL PROTECTED] a]$ time ruby -e 'system("/bin/date")'
Thu Apr 13 11:02:36 EDT 2006
real 0m0.011s
user 0m0.008s
sys 0m0.003s
-b.
On 4/12/06, Jeff Waugh <[EMAIL PROTECTED]> wrote:
<quote who="Peter Rundle">
> Maybe, but... given that rm is the command in this case, the time to
> remove the files from disk is orders of magnitude greater than the time to
> load and execute the program.
You'd be surprised - benchmark it. (Hint: You're adding the overhead of
bringing up rm for *every* *single* file. Doesn't matter that the time to
remove files from disk takes longer - you're adding time to every single
cycle.)
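That per-invocation cost is easy to see by forcing xargs down to one argument per call, which mimics the overhead profile of `find -exec rm {} \;` (the mktemp scratch directory is just illustrative; absolute timings will vary by machine):

```shell
# One rm process per file -- same overhead as -exec rm {} \;
cd "$(mktemp -d)"
touch $(seq 1 1000)
time find . -type f | xargs -n 1 rm -f

# One rm process for (nearly) all files at once
touch $(seq 1 1000)
time find . -type f | xargs rm -f
```

The work rm does on disk is identical in both halves; only the number of fork/exec cycles changes.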
> Also what happens if find returns a million file names? xargs can't put
> them all on a single command line else the dreaded "too many args" error
> will occur. So xargs must have some smarts in it to call the command
> multiple times, passing it 1,2,...N args at a time?
> So it obviously breaks the args up into chunks and invokes the command
> multiple times in any case.
That's actually the main point of xargs. It just happens to do things faster
by reducing the number of times it invokes the target command. If you want
to do the same thing to 2000 files - in the majority of cases, you're better
off doing it 10 times to 200 files than 2000 times to 1 file. Of course, you
can specify how many files xargs will operate on in one invocation too, if
there is some kind of arbitrary limit you must adhere to.
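The per-invocation cap Jeff mentions is xargs's `-n` (a.k.a. `--max-args` in GNU xargs) flag; a minimal sketch with echo standing in for the target command:

```shell
# Six arguments, at most two per invocation: echo runs three times,
# printing two names per line.
printf '%s\n' a b c d e f | xargs -n 2 echo
# a b
# c d
# e f
```

Without `-n`, xargs simply fits as many arguments as the system's command-line length limit allows into each invocation.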
Also, you would have to write something extremely fiendishly clever to win a
SLUG Shell Scripting Smackdown that included find -exec.
- Jeff
--
LinuxWorldExpo: Johannesburg, South Africa http://www.linuxworldexpo.co.za/
"Gah. Out of coffee. Shall think whilst auto-caffeinating." - Telsa Gwynne
--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
