That's pretty unusual; normally the executor's stderr output would
contain a stacktrace and any other error messages from your Python
code. Is it possible that the PySpark worker crashed in C code or was
OOM killed?
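To illustrate the distinction: a worker that dies from a Python-level exception writes a traceback to stderr, but one that dies in native code or is killed by the kernel exits without producing any Python output, which would match the empty stderr you're seeing. A minimal sketch (simulating a hard crash with os._exit, standing in for a segfault or OOM kill):

```python
import subprocess
import sys

# A Python-level exception produces a traceback on stderr...
r1 = subprocess.run(
    [sys.executable, "-c", "raise ValueError('boom')"],
    capture_output=True, text=True,
)
print("Traceback" in r1.stderr)  # True

# ...but a hard exit (no interpreter shutdown, no traceback) leaves
# stderr empty, just like a native crash or an OOM kill would.
r2 = subprocess.run(
    [sys.executable, "-c", "import os; os._exit(1)"],
    capture_output=True, text=True,
)
print(r2.stderr == "")  # True
```

On Linux you can also check the kernel log (dmesg) on the executor host for OOM-killer messages to confirm that case.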

On Thu, Dec 19, 2013 at 11:10 AM, Sandy Ryza <[email protected]> wrote:
> Hey All,
>
> Where are Python logs in PySpark supposed to go?  My job is getting an
> org.apache.spark.SparkException: Python worker exited unexpectedly (crashed),
> but when I look at the stdout/stderr logs in the web UI, nothing interesting
> shows up (stdout is empty and stderr just has the Spark executor command).
>
> Is this the expected behavior?
>
> thanks in advance for any guidance,
> Sandy