Matthew Hannigan wrote:
Yeah, not a bad call - it follows the 'correct first, fast later
/ premature optimization is the root of all evil' principle.
Obviously xargs is a "better" solution based on the Unix concept of only
doing one thing and doing it well: xargs adds the capability to any data
stream, whereas -exec is tied to find. I just reckon that people are
a bit quick to diss -exec. Consider these two commands:
find . -exec echo {} \;
find . -print0 | xargs -0 echo
They don't produce the same result: -exec runs echo once per file name,
while xargs packs as many names as it can onto each echo command line.
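A quick way to see the difference (the scratch directory and file names here are just for illustration):

```shell
# Make a scratch directory holding three empty files.
dir=$(mktemp -d); cd "$dir"
touch a b c

# -exec runs echo once per name: one output line each for ., ./a, ./b, ./c.
find . -exec echo {} \; | wc -l          # 4

# xargs packs all four names into a single echo invocation: one output line.
find . -print0 | xargs -0 echo | wc -l   # 1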
Still, it can be a lot slower to use -exec.
Maybe, but given that rm is the command in this case, the time to
remove the files from disk is orders of magnitude greater than the time
to load and execute the program.
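A rough way to put that claim to the test (the file count is made up, and timings will vary wildly with the machine and filesystem — this is only a sketch, in bash for the brace expansion):

```shell
# Create a batch of throwaway files to delete.
dir=$(mktemp -d); cd "$dir"
touch file{1..1000}

# One rm process per file: pays the fork/exec cost 1000 times.
time find . -type f -exec rm -f {} \;

touch file{1..1000}

# A handful of rm processes, each handed a large chunk of names.
time find . -type f -print0 | xargs -0 rm -f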
Also, what happens if find returns a million file names? xargs can't put
them all on a single command line, or the dreaded "Argument list too
long" error will occur. So xargs must have some smarts in it to call the
command multiple times, passing it a chunk of arguments each time.
Speculation on my part, I don't know this for a fact, but man xargs says:
If any invocation of the command exits with a status of 255,
xargs will stop immediately without reading any further input.
So it obviously breaks the args up into chunks and invokes the command
multiple times in any case.
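You can watch the chunking happen by forcing tiny batches with -n (seq here just generates dummy input; without -n, the real batch size comes from the system's argument-space limit):

```shell
# Ten tokens, at most four per command line: echo is invoked three times.
seq 1 10 | xargs -n 4 echo
# Output:
#   1 2 3 4
#   5 6 7 8
#   9 10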
The stopping on error could be a "good thing" or a "bad thing" depending
on what you are trying to do. In the original example the find statement
will pick up directory names, so passing them to "xargs rm -f" will
cause an error straight up, as the first name found ("." itself) is a
directory. That doesn't stop the find-with-xargs method from clearing
the smtpd logs as Terry intended, though: a failed rm exits with a small
non-zero status (1 on most systems), not 255, so xargs carries on and
the regular files still get removed. On the other hand, -exec will
continue on blindly, executing the given command for each filename
passed to it regardless of exit status, which may or may not be "a
good thing" depending on circumstance.
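A sketch of that exit-status behaviour (sh -c here is just a stand-in for rm, so the exit code can be controlled):

```shell
# Exit status 1 per invocation: xargs soldiers on through all three names.
printf 'a\nb\nc\n' | xargs -n 1 sh -c 'echo "$0"; exit 1'
# prints a, b and c (xargs itself then exits 123)

# Exit status 255: xargs aborts after the first invocation.
printf 'a\nb\nc\n' | xargs -n 1 sh -c 'echo "$0"; exit 255'
# prints only a, then a "... exited with status 255" message on stderr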
P.
--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html