Hi Arun,

Have you read https://www.gnu.org/software/parallel/parallel_tutorial.html#Saving-output-into-files ?

I'm not sure how you would collect one of those file handles (e.g., stderr) onto one terminal for monitoring, but maybe moreutils' `sponge` program would help. You might also want to try --ungroup. Should EXIT_VALUE control whether the job succeeded or failed?
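For example (untested, and the script names, joblog name, and output directory below are just placeholders for your generated scripts), something along these lines would let each job's output reach the terminal while it runs and still record the exit values:

  # --ungroup prints each job's output as soon as it is produced
  # (lines from different jobs may interleave); --joblog records
  # every job's exit value so you can see afterwards which ones failed.
  parallel --ungroup --joblog jobs.log bash ::: script1.sh script2.sh script3.sh

  # --results saves each job's stdout and stderr under a directory
  # tree, which is what the tutorial section linked above is about.
  parallel --results outdir --joblog jobs.log bash ::: script1.sh script2.sh script3.sh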
Joe

On Mon, Nov 6, 2017 at 8:13 PM, Arun Vimalathithen <[email protected]> wrote:
> Hi,
>
> I have some (generated) shell scripts that call a bunch of curl
> commands in a loop and accumulate the HTTP response values to
> decide the exit value. Something similar to:
>
> exec 3>&1
> EXIT_VALUE=0
> curl="curl XXXXXXXX"
> HTTP_STATUS=$($curl | parse and get HTTP status)
> MESSAGE="The cmd $curl executed with status $HTTP_STATUS"
>
> if [ $HTTP_STATUS -gt 204 ]; then
>     EXIT_VALUE=1
> fi
>
> echo $MESSAGE >&3
> exit $EXIT_VALUE
>
> I need the contents of $MESSAGE to be displayed while the script is
> running as well (for the users to troubleshoot, etc.), but I have
> lost them when I converted the script to be run through GNU parallel.
>
> Can someone please give me any pointers on where I am going wrong?
>
> Thanks,
>
> Arun
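P.S. If the extra file descriptor is only there to get $MESSAGE past whatever happens to stdout, writing it to stderr might be simpler. A rough, untested sketch of the loop body; the -w '%{http_code}' trick is just one way to get the status, and your real curl options go where I put the URL:

  EXIT_VALUE=0
  for url in "$@"; do
      # -s silences the progress meter, -o /dev/null discards the body,
      # -w '%{http_code}' prints only the HTTP status code.
      HTTP_STATUS=$(curl -s -o /dev/null -w '%{http_code}' "$url")
      MESSAGE="The cmd curl $url executed with status $HTTP_STATUS"
      if [ "$HTTP_STATUS" -gt 204 ]; then
          EXIT_VALUE=1
      fi
      # stderr reaches the terminal with --ungroup while stdout can
      # still be collected or redirected separately.
      echo "$MESSAGE" >&2
  done
  exit $EXIT_VALUE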
