On Wed, Feb 15, 2012 at 3:42 PM, David Korn <[email protected]> wrote:

> > how about using set -o pipefail ?
>
> I suspect that the user who posted the query wanted the pipeline to complete 
> as soon as there was a failure.

yes, pretty much. the real task i'm looking at scripting is to query a
database for a list of input tuples to process, then pass them off to
another process to handle. they require a fair amount of
pre-processing before the other process can accept them, so i've
wrapped that in a shell function. the database tool is very bad at
reporting errors clearly (it tends to print error text to stdout and
then exit 0), so i was trying to figure out how to abort the
processing cleanly. my initial code looked something like

    set -e -o pipefail
    function query {
        result=`sql "$@"`
        echo "$result" | grep -q error && return 1
        echo "$result"
    }
    query "query" | while read x y z; do handle $x $y $z; done

and i was sort of expecting set -e, possibly in combination with
pipefail, to cause the entire script to abort before ever calling
handle.
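
for anyone following along, here's a stripped-down sketch of what i
was actually seeing (the commands here are illustrative, not my real
script): the reader consumes whatever the writer managed to emit
before failing, and the pipefail status only surfaces once the whole
pipeline has finished.

    set -e -o pipefail
    { echo "a b c"; exit 1; } | while read x y z; do
        echo "handled $x $y $z"    # runs even though the writer fails
    done
    echo "not reached"             # errexit fires on the pipeline's status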

i guess my fundamental mistake (one i perpetually make whenever i get
back into serious shell scripting) was forgetting about the parallel
nature of pipes...

at the moment, BTW, i'm basically doing

    t=`mktemp`
    query "query" > "$t"
    while read x y z; do handle $x $y $z; done < "$t"
    rm "$t"

so that the errexit has a chance to see if something goes wrong with the query.
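
one possible tidy-up, just a sketch on top of the same approach: an
EXIT trap so the temp file gets cleaned up even if errexit fires
partway through (query and handle are the same placeholders as above).

    t=`mktemp`
    trap 'rm -f "$t"' EXIT
    query "query" > "$t"    # errexit can see query's exit status here
    while read x y z; do handle $x $y $z; done < "$t"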
-- 
Aaron Davies
[email protected]