[
https://issues.apache.org/jira/browse/SPARK-4492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14220282#comment-14220282
]
Sean Owen commented on SPARK-4492:
----------------------------------
As I said, I think you can actually embed Spark. I'm doing it now, and I think
it's a legitimate use case. The scripts take care of a lot of corner-case setup
that may be irrelevant for you, in which case they are indeed not required. I
don't think embedding is formally supported or documented, but Spark is
designed sensibly enough that you can do it with a little know-how.
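For illustration, a minimal sketch of what embedding looks like: constructing a SparkContext directly in your own main method instead of going through the bin scripts. The object and app names here are hypothetical, and it assumes a Spark 1.x spark-core on the classpath; a yarn-client master would additionally need the YARN support classes (see the exception quoted below).

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical embedded driver: builds its own SparkContext rather than
// being launched via spark-submit or the other shipped scripts.
object EmbeddedApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("EmbeddedApp")
      .setMaster("local[2]") // "yarn-client" would need spark-yarn on the classpath
    val sc = new SparkContext(conf)
    try {
      // Trivial job to confirm the embedded context works.
      val evens = sc.parallelize(1 to 100).filter(_ % 2 == 0).count()
      println(s"even numbers: $evens")
    } finally {
      sc.stop() // always shut the context down when embedding
    }
  }
}
```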
Yes, the point of not packaging Spark's dependencies into your application is
precisely to avoid the version-matching headache you are describing. You don't
have to worry about any of that then.
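Concretely, that approach amounts to declaring Spark with "provided" scope so it stays out of your assembled jar and the cluster's own installation supplies it at run time. A minimal build.sbt along those lines (the version numbers are examples matching the 1.0.x era discussed here, not a recommendation):

```scala
name := "simple-app"

version := "1.0"

scalaVersion := "2.10.4"

// "provided" keeps spark-core out of the assembly; the cluster's Spark
// installation supplies it at run time, so there is nothing to version-match.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.2" % "provided"
```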
Have you checked out the quick start? SBT apps are covered there.
http://spark.apache.org/docs/latest/quick-start.html
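On the exception quoted below: it is consistent with launching the driver via plain "java -cp" against a build whose YARN support classes are not on the classpath. Launching through spark-submit, which sets up the YARN classpath for you, is the documented path. A sketch, reusing the jar and class names from the report:

```shell
# Launch via spark-submit so the YARN support classes
# (e.g. org.apache.spark.deploy.yarn.YarnSparkHadoopUtil) are on the classpath.
./bin/spark-submit \
  --master yarn-client \
  --class my.main.Class \
  my.jar
```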
> Exception when following SimpleApp tutorial java.lang.ClassNotFoundException:
> org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
> ------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-4492
> URL: https://issues.apache.org/jira/browse/SPARK-4492
> Project: Spark
> Issue Type: Bug
> Reporter: sam
>
> When I follow the example here
> https://spark.apache.org/docs/1.0.2/quick-start.html and run with "java -cp
> my.jar my.main.Class" with master set to "yarn-client" I get the below
> exception.
> Exception in thread "main" java.lang.ExceptionInInitializerError
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:228)
>         at com.barclays.SimpleApp$.main(SimpleApp.scala:11)
>         at com.barclays.SimpleApp.main(SimpleApp.scala)
> Caused by: org.apache.spark.SparkException: Unable to load YARN support
>         at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:106)
>         at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:101)
>         at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
>         ... 3 more
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:169)
>         at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:102)
>         ... 5 more