Is there a way to suppress the creation of a .html file, and instead
redirect the output to stdout? I want to issue a wget command, and just
grep for some data that is contained in the resulting html file.
Maybe I am missing a command line option to do this, but I don't see one.
I'd just use lynx or links to do:
links -source www.washpost.com
but wget could do:
wget -O /dev/stdout www.washpost.com
On Wed, 9 Jul 2003, Jerry Coleman wrote:
> Is there a way to suppress the creation of a .html file, and instead
> redirect the output to stdout? I want to issue a wget
What's your problem?
That has to be the least informative email I've seen in a long time.
tjc
(apologies for top-posting in reply)
On Thu, Jul 03, 2003 at 03:20:28PM +0200, J K wrote:
> FUCK YOU!!!
From: Toby Corkindale [EMAIL PROTECTED]
To: Jim Ennis [EMAIL PROTECTED]
This should do the trick:
wget -O /dev/stdout http://blah.com | grep whatever
-Rob Hinst
- Original Message -
From: Jerry Coleman [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Wednesday, July 09, 2003 1:03 PM
Subject: Capture HTML Stream
> Is there a way to suppress the
we're all used to J K's personality, now.
On Wed, 9 Jul 2003, Toby Corkindale wrote:
> What's your problem?
> That has to be the least informative email I've seen in a long time.
> tjc
> (apologies for top-posting in reply)
> On Thu, Jul 03, 2003 at 03:20:28PM +0200, J K wrote:
> > FUCK
Aaron S. Hawley [EMAIL PROTECTED] said:
> but wget could do
> wget -O /dev/stdout www.washpost.com
On DOS/Windows too? I think not. There must be a better way.
--gv
Try also:
wget -O - www.washpost.com
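The "-O -" form is the portable answer to the DOS/Windows question: wget itself interprets "-" as standard output, so no /dev/stdout device is needed. A sketch of the full pipeline the thread is after (the URL and grep pattern are illustrative, and a canned page stands in for the network fetch here):

```shell
# Fetch a page straight to stdout (no .html file on disk) and grep it:
#   wget -q -O - http://www.washpost.com/ | grep -i 'headline'
# -q   : suppress wget's status/progress output so only the document reaches the pipe
# -O - : write the retrieved document to standard output (portable; wget handles "-" itself)
#
# Same pipeline shape, with a canned page standing in for the fetch:
html='<html><body><h1>Top Headline</h1></body></html>'
printf '%s\n' "$html" | grep -o '<h1>.*</h1>'
# prints: <h1>Top Headline</h1>
```

Without -q, wget's progress meter goes to stderr rather than stdout, so the grep still works, but -q keeps the terminal clean when the pipeline runs in a script.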
On Wed, 9 Jul 2003, Gisle Vanem wrote:
> Aaron S. Hawley [EMAIL PROTECTED] said:
> > but wget could do
> > wget -O /dev/stdout www.washpost.com
> On DOS/Windows too? I think not. There must be a better way.
> --gv