On 12/4/13, Matt Hope <[email protected]> wrote:
> Random tips:
cp's -t (--target-directory) option is sometimes useful
rsync -avu src/ host:/dest/ may be preferable
when using find's -print0, also use:
[e]grep's -z and -Z (or -zZ) options
xargs's -0 option
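A toy demo of that NUL-delimited pipeline (the /tmp/nul-demo paths and file
names are invented for illustration; GNU find/grep/xargs assumed):

```shell
# Create file names that would break a newline-delimited pipeline.
mkdir -p /tmp/nul-demo
printf 'hello\n' > '/tmp/nul-demo/file with spaces.txt'
printf 'hello\n' > /tmp/nul-demo/plain.txt

# -print0 emits NUL-terminated names; grep -z treats both input and output
# records as NUL-terminated, so the filtered stream is still safe for xargs -0.
find /tmp/nul-demo -name '*.txt' -print0 \
    | grep -z 'spaces' \
    | xargs -0 grep -l 'hello'

# grep's -Z is for the other direction: with -l it NUL-terminates the file
# names grep itself prints, ready for the next xargs -0 stage.
grep -lZ 'hello' /tmp/nul-demo/*.txt | xargs -0 wc -l
```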
Something I cut and pasted off the internet years ago:
Way to solve it if you can't use GNU utilities:
find . -name "*.txt" | sed 's/"/\\"/g;s/.*/"&"/' | xargs grep "whatever"
This escapes out all the double quotes in the pathname and
then wraps the entire pathname in double quotes.
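A toy run of that trick (directory and file names invented; it relies on
plain xargs parsing double quotes, which GNU and BSD xargs both do):

```shell
# A file name with a space, which unquoted xargs input would split in two.
mkdir -p /tmp/quote-demo
printf 'whatever\n' > '/tmp/quote-demo/spaced name.txt'
printf 'nope\n'     > /tmp/quote-demo/plain.txt

# Escape any embedded double quotes, then wrap the whole path in double
# quotes so xargs's quote-aware parser keeps it as a single argument.
find /tmp/quote-demo -name '*.txt' \
    | sed 's/"/\\"/g;s/.*/"&"/' \
    | xargs grep -l 'whatever'
```

Note it still breaks on newlines embedded in file names; when GNU tools are
available, the -print0/-0 pair above is the robust route.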
And if your directory is excessively large (this was a reminder to
myself years ago, probably no longer applicable with the modern
filesystems and proper direntry hashing):
For very large directories (~12K files), piping find into sed can, for some
reason, cause each iteration (i.e. each file) to take a second or two.
Bizarre. So use the following:
ls > zzz; then:
cat zzz | xargs sed -i \
    -e "s%http://wiki.jboss.org/wiki/Wiki.jsp?page=%%g;" \
    -e "s%<base href=\"http://www.jboss.com/\"/>%%g;"
If you do end up on a special-character fishing expedition, you might prefer
the "ls > tmp.file" form, followed by some sed or perl.
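A minimal sketch of that snapshot-then-edit approach, reusing the JBoss URL
from the sed command above on made-up files (GNU sed assumed; BSD sed spells
in-place editing as -i ''):

```shell
# Invented sample file containing the URL prefix we want to strip.
mkdir -p /tmp/ls-demo && cd /tmp/ls-demo
printf 'see http://wiki.jboss.org/wiki/Wiki.jsp?page=Home here\n' > a.txt

# Snapshot the listing once, outside the directory so the snapshot file
# does not appear in its own listing, then hand the names to sed -i.
ls > /tmp/zzz
xargs sed -i -e 's%http://wiki.jboss.org/wiki/Wiki.jsp?page=%%g' < /tmp/zzz

cat a.txt   # the URL prefix is gone, leaving just the page name
```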
Good luck.
--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html