Hi,
I am currently running pyspark jobs against Spark 1.1.0 on YARN. When I run
example Java jobs such as spark-pi, the following files get created:
bash-4.1$ tree spark-pi-1420624364958
spark-pi-1420624364958
├── APPLICATION_COMPLETE
├── EVENT_LOG_1
└── SPARK_VERSION_1.1.0
0 directories, 3 files
Thanks Andrew, simple fix ☺.
From: Andrew Ash [mailto:and...@andrewash.com]
Sent: 07 January 2015 15:26
To: England, Michael (IT/UK)
Cc: user
Subject: Re: FW: No APPLICATION_COMPLETE file created in history server log
location upon pyspark job success
Hi Michael,
I think you need to explicitly call sc.stop() on the SparkContext for it
to shut down properly (this doesn't happen automatically). See
https://issues.apache.org/jira/browse/SPARK-2972 for more details.
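For anyone hitting the same issue: the usual pattern is to wrap the job body in try/finally so sc.stop() always runs, even when the job raises, since only a stopped context finalizes the event log and writes APPLICATION_COMPLETE for the history server. The sketch below uses a stand-in FakeContext (an illustrative name, not a pyspark class) so it is self-contained; in a real script you would construct `sc = SparkContext(appName=...)` instead.

```python
# Sketch of the fix: guarantee sc.stop() runs on both success and failure.
# FakeContext stands in for pyspark's SparkContext so this example is
# self-contained; in PySpark, stop() is what flushes the event log and
# lets the history server mark the application as complete.
class FakeContext:
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True  # real SparkContext finalizes the event log here

def run_job(sc, fail=False):
    try:
        if fail:
            raise RuntimeError("job failed")
        return "ok"
    finally:
        sc.stop()  # always close the context, success or failure

sc = FakeContext()
run_job(sc)
print(sc.stopped)  # True
```

The same try/finally shape works unchanged with a real SparkContext; without the explicit stop(), a PySpark script on 1.1.0 exits with the event log left open, which is why APPLICATION_COMPLETE never appears.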
Andrew
On Wed, Jan 7, 2015 at 3:38 AM, michael.engl...@nomura.com wrote: