Hi Kevin,

Have you tried running your Spark application locally, without Oozie, and did
it work? I believe so.

Looking at the exit code and SparkSubmit, it turns out 101 means the class
was not found, see
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L87
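
(For reference, the constant behind that exit code looks something like
val CLASS_NOT_FOUND_EXIT_STATUS = 101 - I'm quoting from memory, so double
check against the link above.)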

I saw you mentioned external jars in your spark application (parquet
related).
- Prior to execution, try uploading the dependency jar files to a directory
on HDFS and refer to that directory in *job.properties* using
oozie.libpath=${nameNode}/path/to/DEPENDENCY_JARS_DIR/ (see the sketch below)
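
Something along these lines, where the HDFS directory and the jar name are
only placeholders for your actual dependencies:

  hdfs dfs -mkdir -p /user/kpeng/spark-deps
  hdfs dfs -put my-parquet-dependency.jar /user/kpeng/spark-deps/

  # job.properties
  oozie.use.system.libpath=true
  oozie.libpath=${nameNode}/user/kpeng/spark-deps

Keeping oozie.use.system.libpath=true also pulls in the Oozie Spark sharelib,
so only your extra jars need to go into that directory.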

To hunt down this kind of issue, the following could also help:
- pass -verbose:class to your driver and executors (e.g. --conf
spark.driver.extraJavaOptions="-verbose:class" --conf
spark.executor.extraJavaOptions="-verbose:class"), as sketched below
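
If you are configuring the action in workflow.xml, those flags would go into
the <spark-opts> element of the spark action, roughly like this:

  <spark-opts>--conf spark.driver.extraJavaOptions=-verbose:class --conf spark.executor.extraJavaOptions=-verbose:class</spark-opts>

The class-loading output then shows up in the YARN container logs for the
driver and executors, which you can pull with
yarn logs -applicationId <application_id>.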

Hope this helps,
- Attila

This might also be related:
https://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/ClassNotFoundException-org-apache-htrace-Trace-exception-in/m-p/29253/highlight/true#M915


On Wed, Aug 31, 2016 at 10:50 PM, Peter Cseh <[email protected]> wrote:

> Hi,
> Could you find the logs for the Spark job that had been running?
> This log has only this in info:
>
> >>> Invoking Spark class now >>>
>
> Intercepting System.exit(101)
>
> <<< Invocation of Main class completed <<<
>
> Failing Oozie Launcher, Main class
> [org.apache.oozie.action.hadoop.SparkMain], exit code [101]
>
> Oozie Launcher failed, finishing Hadoop job gracefully
>
> Which is not that useful.
>
> Thank you,
>
> Gp
>
>
> On Wed, Aug 31, 2016 at 10:46 PM, Kevin Peng <[email protected]> wrote:
>
> > Satish,
> >
> > I have attached the entire log (which I believe contains the stack trace)
> > to this email.
> >
> > Thanks,
> >
> > KP
> >
> > On Wed, Aug 31, 2016 at 12:48 PM, satish saley <[email protected]>
> > wrote:
> >
> >> Hi,
> >> Can you share the stacktrace you see in the actual spark job (the child
> >> job)?
> >>
> >> On Wed, Aug 31, 2016 at 10:40 AM, Kevin Peng <[email protected]> wrote:
> >>
> >> > Hi All,
> >> >
> >> > I am trying to get Spark Action to work in Oozie.  I am currently using
> >> > CDH 5.7.2 and the Hue 3.9 bundled with that CDH version to create a test
> >> > workflow.  I created a spark jar that has all its needed dependencies
> >> > compiled into the jar.  All this jar does is read a parquet file from
> >> > hdfs and write it back to a different directory in hdfs.  When running
> >> > this workflow I am getting a 101 error:
> >> >
> >> > >>> Invoking Spark class now >>>
> >> >
> >> >
> >> > Intercepting System.exit(101)
> >> >
> >> >
> >> > <<< Invocation of Main class completed <<<
> >> >
> >> >
> >> > Failing Oozie Launcher, Main class
> >> > [org.apache.oozie.action.hadoop.SparkMain], exit code [101]
> >> >
> >> >
> >> > Oozie Launcher failed, finishing Hadoop job gracefully
> >> >
> >> >
> >> > Oozie Launcher, uploading action data to HDFS sequence file:
> >> > hdfs://nameservice1/user/kpeng/oozie-oozi/0002825-
> >> > 160817212356434-oozie-oozi-W/spark-b023--spark/action-data.seq
> >> >
> >> >
> >> > Oozie Launcher ends
> >> >
> >> >
> >> > Any help would be appreciated.
> >> >
> >> >
> >> > Thanks,
> >> >
> >> >
> >> > KP
> >> >
> >>
> >
> >
>
>
> --
> Peter Cseh
> Software Engineer
> <http://www.cloudera.com>
>
