I'm trying to find all the HTML documents on my website and change a
line in them, replacing a link to a different site with an absolute
link on the current server. The Perl one-liner I wrote is (jeez, am I
proud of this):
find -iname "*.*htm*" -o -iname "*.stm" \
  | xargs egrep -l "centernet\.jhuccp\.org/cgi-bin/mail2friend|cgi\.jhuccp\.org/cgi-bin/mail2friend" \
  | xargs perl -pi~ -e "s?http://.*\.jhuccp\.org(/cgi-bin/mail2friend)?\1?g;"

It seems to run fine and changes many files, but when I go searching
for the string that was supposed to be changed, I keep finding more
files. Many were changed correctly, but some were not.

It strikes me that maybe perl can't take too many arguments at once.
There are options to the xargs command that limit how many arguments
are passed per invocation. Is this what's wrong? What should I set the
number of arguments to?
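The option I mean is -n. A toy illustration of what it does (the filenames here are made up; this just shows the batching behavior):

```shell
# xargs already splits an over-long argument list into several
# invocations of the command on its own; -n just caps the batch size
# explicitly. Here echo runs three times, two arguments per run.
printf '%s\n' a b c d e | xargs -n 2 echo
# prints:
# a b
# c d
# e
```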

Thanks for trying to help me with this.

-Kevin Zembower

-----
E. Kevin Zembower
Unix Administrator
Johns Hopkins University/Center for Communications Programs
111 Market Place, Suite 310
Baltimore, MD  21202
410-659-6139

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
