[
https://issues.apache.org/jira/browse/SPARK-4492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14219347#comment-14219347
]
sam commented on SPARK-4492:
----------------------------
// OK, it may then be that you package some of spark, but not the YARN module.//
If I "CMD + F" on the (well hidden) list of Maven coordinates on the Cloudera
website [1] for "yarn" I get 133 results ... please could you say which one? My
Spark deps are "1.0.0-cdh5.1.3"
// You can package Spark too but then you have trouble matching versions
exactly.//
There should really be a versioning handshake between the driver and the master
... it once took me 5 LOC in C with RPC, so it should take half a LOC in Scala :)
I.e. if you package with 1.0.0-cdh5.1.3 but the cluster has 1.1.0-cdh5.2.0 then
you get an error saying:
"you packaged with 1.0.0-cdh5.1.3 but the cluster has 1.1.0-cdh5.2.0, change
your dependency coordinates to use 1.1.0-cdh5.2.0"
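To be concrete, the handshake I have in mind is just a string comparison with a helpful error. This is a hypothetical sketch, not anything Spark actually does; the object and method names are made up for illustration:

```scala
// Hypothetical sketch of the versioning handshake described above: the
// driver reports its build version, the master reports the cluster's, and
// a mismatch fails fast with an error naming both versions and the fix.
// VersionHandshake and checkVersions are illustrative names, not Spark API.
object VersionHandshake {
  def checkVersions(driverVersion: String, clusterVersion: String): Unit =
    if (driverVersion != clusterVersion)
      throw new IllegalStateException(
        s"you packaged with $driverVersion but the cluster has " +
        s"$clusterVersion, change your dependency coordinates to use " +
        clusterVersion)
}
```

Exact string equality is the crudest possible check; a real version would presumably allow compatible patch releases, but even this would have saved me hours.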
I've written some amazing Machine Learning algorithms in Spark that really do
run lightning fast, but every single time I move to a new cluster I spend MORE
time working out how to build and run them than I do writing the algorithms.
Have Cloudera/Databricks thought about providing a GitHub repo of template SBT
files?
[1]
http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cdh_vd_cdh5_maven_repo.html
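For what it's worth, the sort of template I mean might look like the build.sbt below. The artifact name, Scala version, and Cloudera repository URL are assumptions on my part; the real coordinates need to be checked against the repo listing in [1]:

```scala
// build.sbt -- hypothetical minimal template for a Spark app on CDH.
// Artifact names, versions, and the resolver URL are illustrative only;
// verify them against the Cloudera Maven repository before use.
name := "simple-app"

scalaVersion := "2.10.4"

resolvers += "cloudera" at
  "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies ++= Seq(
  // "provided" keeps Spark out of the assembled jar so the cluster's own
  // Spark/YARN jars are used at runtime, which avoids exactly the
  // version-mismatch and missing-YARN-module problems described above.
  "org.apache.spark" %% "spark-core" % "1.0.0-cdh5.1.3" % "provided"
)
```

With the dependency marked "provided", the app is launched via spark-submit on the cluster rather than plain "java -cp", so the cluster supplies its own (matching) Spark classes.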
> Exception when following SimpleApp tutorial java.lang.ClassNotFoundException:
> org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
> ------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-4492
> URL: https://issues.apache.org/jira/browse/SPARK-4492
> Project: Spark
> Issue Type: Bug
> Reporter: sam
>
> When I follow the example here
> https://spark.apache.org/docs/1.0.2/quick-start.html and run with "java -cp
> my.jar my.main.Class" with master set to "yarn-client" I get the below
> exception.
> Exception in thread "main" java.lang.ExceptionInInitializerError
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:228)
> at com.barclays.SimpleApp$.main(SimpleApp.scala:11)
> at com.barclays.SimpleApp.main(SimpleApp.scala)
> Caused by: org.apache.spark.SparkException: Unable to load YARN support
> at
> org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:106)
> at
> org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:101)
> at
> org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
> ... 3 more
> Caused by: java.lang.ClassNotFoundException:
> org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:169)
> at
> org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:102)
> ... 5 more
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)