On Fri, 2005-09-16 at 14:03 +1200, Nick Rout wrote:
> On Fri, 16 Sep 2005 09:43:50 +0800
> Ow Mun Heng wrote:
>
> > That's what I just did..
> > $ cat cron_fetch_stock.sh
> > #!/bin/bash
> > #
> > TEMPFILE="/tmp/file.$$"
> > PORTFOLIO_SCRIPT=$HOME/scripts/portfolio.sh
> >
> > $PORTFOLIO_SCRIPT --html > $TEMPFILE 2>/dev/null && \
> >     mail -a "Content-Type: text/html" -s "Stock Quotes `date +%c`" gentoo < $TEMPFILE && \
> >     rm $TEMPFILE
>
>
> What's wrong with:
>
> $PORTFOLIO_SCRIPT --html 2>/dev/null | \
>     mail -a "Content-Type: text/html" -s "Stock Quotes `date +%c`"
>
> Assuming $PORTFOLIO_SCRIPT outputs to stdout (it should).
That worked too. I could've sworn I tried that but it didn't work. Oh
well, it works now anyway :-)
More than one way to skin a cat.
The other issue with my script is the part that actually screen-scrapes
and parses the page. The initial script only grabbed the "Last Price";
now I would like to add the "Day Range" as well.
last_price()
{
    value="$(lynx -dump "$url$symbol" | grep 'Last price:' | \
        awk -F: 'NF > 1 && $(NF) != "N/A" { print $(NF) }')"
}

day_range()
{
    day_range="$(lynx -dump "$url$symbol" | grep 'Low \& High:' | \
        awk -F: 'NF > 1 && $(NF) != "N/A" { print $(NF) }')"
}
The above is inefficient because I have to fetch the page twice.
Doing a
lynx -dump "$url$symbol" | egrep -i '(Last Price|Low \& High)' | \
    awk -F: 'NF > 1 && $(NF) != "N/A" { print $(NF) }'
will work, but then there will be two values in the output:
7.35
7.127 - 7.38
Can anyone help with a better awk script so that each value ends up
associated with its own variable?
eg:
last_price=7.35
day_range=7.127 - 7.38
without actually piping the lynx output to a file (which, admittedly,
would be the easy way)?
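One approach I can think of (an untested sketch; it assumes the lynx dump
really does contain "Last price:" and "Low & High:" lines as shown above)
is to fetch the page just once into a shell variable and then run the same
awk over the saved dump for each field:

# fetch the page only once
page="$(lynx -dump "$url$symbol")"

# same awk logic as before, but run against the saved dump
last_price="$(printf '%s\n' "$page" | \
    awk -F: '/Last price:/ && NF > 1 && $(NF) != "N/A" { print $(NF) }')"
day_range="$(printf '%s\n' "$page" | \
    awk -F: '/Low & High:/ && NF > 1 && $(NF) != "N/A" { print $(NF) }')"

That way there is only one lynx fetch, and each value lands in its own
variable without going through a temp file.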
On the other hand, how does one use awk for multigreps, like
egrep '(pop|test)'?
I've tried variations of
1. awk "/Low & High/"
2. awk "/Low & High/" && /Last/
3. awk '{"/Low & High/" && /Last/}'
none of which works except for No. 1 (see the sketch below).
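My guess (untested) is that variation 2 hands the && to the shell and
variation 3 just evaluates a string constant inside the action block, so
neither really combines the two patterns. In awk itself the alternation
would go inside a single regex, or the two patterns would be joined with
awk's || operator:

# alternation inside one regex, roughly egrep '(Last price|Low & High)'
awk '/Last price|Low & High/'

# or two separate patterns joined with awk's || operator
awk '/Last price/ || /Low & High/'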
--
Ow Mun Heng
Gentoo/Linux on DELL D600 1.4Ghz 1.5GB RAM
98% Microsoft(tm) Free!!
Neuromancer 10:47:05 up 2 days, 23:21, 7 users, load average: 1.43, 1.47, 1.15