[
https://issues.apache.org/jira/browse/SPARK-18898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Raphael updated SPARK-18898:
----------------------------
Description:
I am submitting my Scala applications with SparkLauncher, which uses the spark-submit project.
When I throw an error in my Spark job, the final status of the job in YARN is FINISHED. Looking at the source code of SparkSubmit:
{code:title=SparkSubmit.scala|borderStyle=solid}
...
try {
  mainMethod.invoke(null, childArgs.toArray)
} catch {
  case t: Throwable =>
    findCause(t) match {
      case SparkUserAppException(exitCode) =>
        System.exit(exitCode)
      case t: Throwable =>
        throw t
    }
}
...
{code}
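For reference, a minimal sketch of the kind of driver that hits this path is shown below; the class and application names are illustrative only, not taken from the original report:
{code:title=FailingJob.scala (illustrative)|borderStyle=solid}
// Minimal driver sketch. The RuntimeException thrown here does not match
// SparkUserAppException, so SparkSubmit re-throws it instead of calling
// System.exit with a non-zero code, and (per this report) YARN records the
// application as FINISHED rather than FAILED.
import org.apache.spark.sql.SparkSession

object FailingJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("failing-job").getOrCreate()
    try {
      throw new RuntimeException("business error that should fail the YARN application")
    } finally {
      spark.stop()
    }
  }
}
{code}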
It seems our code has to throw SparkUserAppException to reach the System.exit branch, but this
exception is a private case class declared next to SparkException (in SparkException.scala).
Please make this case class available, or give us another way to signal a failure from inside
our applications.
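If the class were made public, the usage being asked for might look like the sketch below (assuming SparkUserAppException keeps its current SparkUserAppException(exitCode: Int) shape; today it is private to Spark, so this does not compile from user code):
{code:title=MyJob.scala (hypothetical)|borderStyle=solid}
// Hypothetical usage, assuming SparkUserAppException(exitCode: Int) were made public.
import org.apache.spark.SparkUserAppException

object MyJob {
  def main(args: Array[String]): Unit = {
    try {
      runJob(args)
    } catch {
      case _: Exception =>
        // SparkSubmit would translate this into System.exit(1) (see the catch
        // block quoted above) instead of re-throwing the original exception.
        throw SparkUserAppException(exitCode = 1)
    }
  }

  // Placeholder for the real application logic.
  private def runJob(args: Array[String]): Unit = ???
}
{code}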
More details at:
http://stackoverflow.com/questions/41184158/how-to-throw-an-exception-in-spark
In the past, the same issue was reported for PySpark here:
https://issues.apache.org/jira/browse/SPARK-7736
and resolved here:
https://github.com/apache/spark/pull/8258
Best Regards
Raphael.
was:
I am submitting my Scala applications with SparkLauncher, which uses the spark-submit project.
When I throw an error in my Spark job, the final status of the job in YARN is FINISHED. Looking at the source code of SparkSubmit:
{code:title=SparkSubmit.scala|borderStyle=solid}
...
try {
  mainMethod.invoke(null, childArgs.toArray)
} catch {
  case t: Throwable =>
    findCause(t) match {
      case SparkUserAppException(exitCode) =>
        System.exit(exitCode)
      case t: Throwable =>
        throw t
    }
}
...
{code}
It seems our code has to throw SparkUserAppException to reach the System.exit branch, but this
exception is a private case class declared next to SparkException (in SparkException.scala).
More details at:
http://stackoverflow.com/questions/41184158/how-to-throw-an-exception-in-spark
In the past, the same issue was reported for PySpark here:
https://issues.apache.org/jira/browse/SPARK-7736
and resolved here:
https://github.com/apache/spark/pull/8258
> Exception not failing Scala applications (in yarn)
> --------------------------------------------------
>
> Key: SPARK-18898
> URL: https://issues.apache.org/jira/browse/SPARK-18898
> Project: Spark
> Issue Type: Bug
> Components: Spark Submit, YARN
> Affects Versions: 2.0.2
> Reporter: Raphael
>
> I am submitting my Scala applications with SparkLauncher, which uses the spark-submit project.
> When I throw an error in my Spark job, the final status of the job in YARN is FINISHED. Looking at the source code of SparkSubmit:
> {code:title=SparkSubmit.scala|borderStyle=solid}
> ...
> try {
>   mainMethod.invoke(null, childArgs.toArray)
> } catch {
>   case t: Throwable =>
>     findCause(t) match {
>       case SparkUserAppException(exitCode) =>
>         System.exit(exitCode)
>       case t: Throwable =>
>         throw t
>     }
> }
> ...
> {code}
>
> It seems our code has to throw SparkUserAppException to reach the System.exit branch, but this
> exception is a private case class declared next to SparkException (in SparkException.scala).
> Please make this case class available, or give us another way to signal a failure from
> inside our applications.
> More details at:
> http://stackoverflow.com/questions/41184158/how-to-throw-an-exception-in-spark
> In the past, the same issue was reported for PySpark here:
> https://issues.apache.org/jira/browse/SPARK-7736
> and resolved here:
> https://github.com/apache/spark/pull/8258
> Best Regards
> Raphael.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]