Junfan Zhang created OOZIE-3404:
-----------------------------------
Summary: The SPARK_HOME environment variable needs to be set when running pySpark
Key: OOZIE-3404
URL: https://issues.apache.org/jira/browse/OOZIE-3404
Project: Oozie
Issue Type: Bug
Reporter: Junfan Zhang
Assignee: Junfan Zhang
When we run Spark on a cluster, we rely on the Spark jars stored on HDFS and do not
deploy Spark on the cluster nodes. In this setup, running pySpark as described in the
Oozie documentation fails.
I have added the {{SPARK_HOME}} environment variable to the {{SparkMain}} class, and
the pySpark job now runs successfully.
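For illustration only, here is a minimal sketch of that kind of change, assuming the launcher builds a {{spark-submit}} argument list and that exporting the variable through Spark's documented {{spark.yarn.appMasterEnv.*}} / {{spark.executorEnv.*}} properties is acceptable. The class and helper names below are hypothetical; this is not the actual patch.

{code:java}
// Sketch (not the Oozie patch): export SPARK_HOME into the YARN containers
// that run the pySpark driver and executors by appending Spark's documented
// env-var configuration properties to the spark-submit argument list.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SparkHomeArgsSketch {

    /** Appends --conf entries that set SPARK_HOME inside the YARN containers. */
    static List<String> withSparkHome(List<String> sparkSubmitArgs, String sparkHome) {
        List<String> args = new ArrayList<>(sparkSubmitArgs);
        args.add("--conf");
        args.add("spark.yarn.appMasterEnv.SPARK_HOME=" + sparkHome);
        args.add("--conf");
        args.add("spark.executorEnv.SPARK_HOME=" + sparkHome);
        return args;
    }

    public static void main(String[] ignored) {
        List<String> base = Arrays.asList("--master", "yarn", "--deploy-mode", "cluster");
        // "." is only a placeholder for a real Spark install; the assumption is that the
        // pyspark and py4j zips are already shipped with the job (e.g. via the sharelib),
        // so SPARK_HOME merely has to be set to something.
        System.out.println(withSparkHome(base, "."));
    }
}
{code}

Exporting the variable through these Spark properties reaches the containers where the pySpark driver actually runs in cluster mode, rather than only the launcher JVM.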