Hi all,

I was looking for a command-line program that takes an HTML 
file and lists all the URLs in it, one per line, in plain text. 
urlview almost does this, but its output is screen-oriented 
(i.e. it uses curses).

I thought this would be a very easy thing to find, but it took 
me a while to locate a package called html-xml-utils on w3.org,
which includes a program called wls to do this.

I must be missing something obvious. What do people commonly 
use when they want to extract URLs from a file?
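In case it helps frame the question: a quick-and-dirty approach I've seen is a grep/sed pipeline. This is only a rough sketch (it handles just double-quoted href attributes, so an HTML-aware tool is more robust), with a throwaway sample file for illustration:

```shell
# Create a tiny sample HTML file just to demonstrate (hypothetical path).
cat > /tmp/sample.html <<'EOF'
<a href="http://example.com/a">a</a>
<p><a href="http://example.com/b">b</a></p>
EOF

# Rough sketch: extract double-quoted href values, one URL per line.
# Misses single-quoted/unquoted attributes and non-href URLs.
grep -oE 'href="[^"]*"' /tmp/sample.html | sed -e 's/^href="//' -e 's/"$//'
```

Another option, if lynx is installed, is `lynx -dump -listonly file.html`, though its output includes link numbering that may need further cleanup.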

Thanks,
Mandar.

--
To unsubscribe, send mail to [EMAIL PROTECTED] with the body
"unsubscribe ilug-cal" and an empty subject line.
FAQ: http://www.ilug-cal.org/node.php?id=3