Cool, that's exactly what I was looking for! Thanks!

How does one output the status into stdout? That is, how does one capture
the status output of the pipe() command?
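In case it helps anyone searching the archives, here is a minimal sketch of the "write the status to stdout" idea. The shell wrapper and the EXIT_STATUS marker line are illustrative assumptions on my part, not part of the original run.sh:

import os
from pyspark import SparkContext

sc = SparkContext(appName="pipe-exit-status-sketch")
rdd = sc.parallelize(["a", "b", "c"], 2)

# Wrap run.sh in a shell so its exit status is appended to stdout as a
# marker line (EXIT_STATUS:<code>) after the script finishes.
# The wrapper and the marker name are assumptions for illustration only.
wrapped = "sh -c './run.sh; echo \"EXIT_STATUS:$?\"'"

out = rdd.pipe(wrapped, env=os.environ).cache()

# pipe() runs the command once per partition, so expect one marker line
# per partition; split them out from the real output.
statuses = (out.filter(lambda line: line.startswith("EXIT_STATUS:"))
               .map(lambda line: int(line.split(":", 1)[1]))
               .collect())
payload = (out.filter(lambda line: not line.startswith("EXIT_STATUS:"))
              .collect())

print(statuses)  # one exit code per partition, e.g. [0, 0]
print(payload)   # whatever run.sh wrote to stdout

(A sketch of the checkCode route is included after the quoted thread below.)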

On Sat, Feb 11, 2017 at 9:50 AM, Felix Cheung <felixcheun...@hotmail.com>
wrote:

> Do you want the job to fail if there is an error exit code?
>
> You could set checkCode to True
> spark.apache.org/docs/latest/api/python/pyspark.html?highlight=pipe#pyspark.RDD.pipe
>
> Otherwise maybe you want to output the status into stdout so you could
> process it individually.
>
>
> _____________________________
> From: Xuchen Yao <yaoxuc...@gmail.com>
> Sent: Friday, February 10, 2017 11:18 AM
> Subject: Getting exit code of pipe()
> To: <user@spark.apache.org>
>
>
>
> Hello Community,
>
> I have the following Python code that calls an external command:
>
> rdd.pipe('run.sh', env=os.environ).collect()
>
> run.sh can exit with status 1 or 0; how could I get the exit code from
> Python? Thanks!
>
> Xuchen
>
>
>
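For reference, the checkCode option mentioned in the quoted reply would look roughly like this; it is only a sketch, using the parameter documented at the link above:

import os
from pyspark import SparkContext

sc = SparkContext(appName="pipe-checkcode-sketch")
rdd = sc.parallelize(["a", "b", "c"])

# With checkCode=True, pipe() raises an error during the action whenever
# run.sh exits with a non-zero status, which fails the job.
result = rdd.pipe("run.sh", env=os.environ, checkCode=True).collect()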
