Re: Capture HTML Stream
On Wed, 9 Jul 2003, Gisle Vanem wrote:

> "Aaron S. Hawley" <[EMAIL PROTECTED]> said:
> > but wget could do
> >
> > wget -O /dev/stdout www.washpost.com
>
> On DOS/Windows too? I think not. There must be a better way.

I just checked this on the 1.8.2 port of wget to DOS with DJGPP.
'wget -O - "http://whatever.com/url"' works just fine.

                             Doug
--
Doug Kaufman
Internet: [EMAIL PROTECTED]
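A minimal sketch of the portable form confirmed above (the URL is the placeholder from the thread; `-q` is an optional extra to silence progress messages, assuming GNU wget):

```shell
# "-O -" makes wget itself write the document to standard output, so it
# works even where /dev/stdout does not exist (e.g. the DOS/DJGPP port).
# "-q" silences wget's progress messages (normally printed to stderr).
wget -q -O - "http://whatever.com/url"
```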
Re: Capture HTML Stream
try also:

wget -O - www.washpost.com

On Wed, 9 Jul 2003, Gisle Vanem wrote:

> "Aaron S. Hawley" <[EMAIL PROTECTED]> said:
> > but wget could do
> >
> > wget -O /dev/stdout www.washpost.com
>
> On DOS/Windows too? I think not. There must be a better way.
>
> --gv
Re: Capture HTML Stream
"Aaron S. Hawley" <[EMAIL PROTECTED]> said:

> but wget could do
>
> wget -O /dev/stdout www.washpost.com

On DOS/Windows too? I think not. There must be a better way.

--gv
Re: Capture HTML Stream
This should do the trick:

wget -O /dev/stdout http://blah.com | grep "whatever"

-Rob Hinst

----- Original Message -----
From: "Jerry Coleman" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, July 09, 2003 1:03 PM
Subject: Capture HTML Stream

> Is there a way to suppress the creation of a .html file, and instead
> redirect the output to stdout? I want to issue a wget command, and just
> grep for some data that is contained in the resulting html file.
>
> Maybe I am missing a command line option to do this, but I don't see one.
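The full pipeline suggested above can be sketched as follows ("blah.com" and "whatever" are the placeholders from the thread; `-q` is an added assumption to keep wget's progress chatter out of the way):

```shell
# Fetch the page straight to stdout and grep it; no .html file is created.
# "-O -" is the portable spelling of "-O /dev/stdout".
# "-q" silences wget's progress messages.
wget -q -O - http://blah.com | grep "whatever"
```

Only the page body flows through the pipe, so grep sees exactly what would otherwise have been written to the .html file.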
Re: Capture HTML Stream
I'd just use lynx or links to do

links -source www.washpost.com

but wget could do

wget -O /dev/stdout www.washpost.com

On Wed, 9 Jul 2003, Jerry Coleman wrote:

> Is there a way to suppress the creation of a .html file, and instead
> redirect the output to stdout? I want to issue a wget command, and just
> grep for some data that is contained in the resulting html file.
>
> Maybe I am missing a command line option to do this, but I don't see one.

--
GNU Press
www.gnupress.org
Capture HTML Stream
Is there a way to suppress the creation of a .html file, and instead redirect the output to stdout? I want to issue a wget command, and just grep for some data that is contained in the resulting html file.

Maybe I am missing a command line option to do this, but I don't see one.