[ https://issues.apache.org/jira/browse/SPARK-3452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14267469#comment-14267469 ]

Aniket Bhatnagar commented on SPARK-3452:
-----------------------------------------

Here is the exception I am getting while triggering a job that creates a 
SparkContext with master set to yarn-client. A quick look at the 1.2.0 source 
code suggests I should depend on the spark-yarn module, which I can't do as it 
is no longer published. Do you want me to log a separate defect for this and 
submit an appropriate pull request? 
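
For reference, the context is created along these lines (a minimal sketch; the 
object and app name are placeholders, not the actual code in com.myimpl.Server):

    import org.apache.spark.{SparkConf, SparkContext}

    object YarnClientRepro {
      def main(args: Array[String]): Unit = {
        // Setting the master to yarn-client makes the driver-side BlockManager
        // call Utils.getSparkOrYarnConfig, which in turn initializes
        // SparkHadoopUtil and reflectively loads
        // org.apache.spark.deploy.yarn.YarnSparkHadoopUtil (see trace below).
        val conf = new SparkConf()
          .setMaster("yarn-client")
          .setAppName("yarn-client-repro") // placeholder app name
        val sc = new SparkContext(conf)    // throws ExceptionInInitializerError
        sc.stop()
      }
    }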

2015-01-07 14:39:22,799 [pool-10-thread-13] [info] o.a.s.s.MemoryStore - MemoryStore started with capacity 731.7 MB
Exception in thread "pool-10-thread-13" java.lang.ExceptionInInitializerError
        at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
        at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
        at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
        at com.myimpl.Server:23)
        at scala.util.Success$$anonfun$map$1.apply(Try.scala:236)
        at scala.util.Try$.apply(Try.scala:191)
        at scala.util.Success.map(Try.scala:236)
        at com.myimpl.FutureTry$$anonfun$1.apply(FutureTry.scala:23)
        at com.myimpl.FutureTry$$anonfun$1.apply(FutureTry.scala:23)
        at scala.util.Success$$anonfun$map$1.apply(Try.scala:236)
        at scala.util.Try$.apply(Try.scala:191)
        at scala.util.Success.map(Try.scala:236)
        at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
        at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Unable to load YARN support
        at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:199)
        at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:194)
        at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
        ... 27 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:190)
        at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:195)
        ... 29 more
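
The missing class lives in the spark-yarn module; the dependency the 1.2.0 code 
path appears to call for would look like this in sbt (a sketch; this coordinate 
resolved for earlier releases but is exactly what is no longer published for 
1.2.0):

    // build.sbt: spark-yarn provides org.apache.spark.deploy.yarn.YarnSparkHadoopUtil,
    // but the artifact is no longer published, so resolution fails against 1.2.0.
    libraryDependencies += "org.apache.spark" %% "spark-yarn" % "1.2.0"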


> Maven build should skip publishing artifacts people shouldn't depend on
> -----------------------------------------------------------------------
>
>                 Key: SPARK-3452
>                 URL: https://issues.apache.org/jira/browse/SPARK-3452
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.0.0, 1.1.0
>            Reporter: Patrick Wendell
>            Assignee: Prashant Sharma
>            Priority: Critical
>             Fix For: 1.2.0
>
>
> I think it's easy to do this by just adding a skip configuration somewhere. 
> We shouldn't be publishing repl, yarn, assembly, tools, repl-bin, or examples.
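
(For context: the usual Maven mechanism for this is the maven-deploy-plugin's 
skip flag; a minimal sketch of a per-module configuration, placement 
hypothetical and not necessarily the exact fix that shipped in 1.2.0:)

    <!-- in a module's pom.xml, e.g. yarn/pom.xml: keep `mvn deploy` from publishing it -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-deploy-plugin</artifactId>
      <configuration>
        <skip>true</skip>
      </configuration>
    </plugin>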


