[ https://issues.apache.org/jira/browse/SPARK-1835?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14047070#comment-14047070 ]

Stephen Boesch commented on SPARK-1835:
---------------------------------------


I followed the directions in the bug and deleted mesos-0.18.1.jar, but the 
following errors now occur. Note that this error has also been reported on the 
spark dev mailing list, so the following merely corroborates what others have 
already noted.

Error:scalac: 
     while compiling: 
C:\apps\incubator-spark\sql\core\src\main\scala\org\apache\spark\sql\test\TestSQLContext.scala
        during phase: jvm
     library version: version 2.10.4
    compiler version: version 2.10.4
  reconstructed args: -classpath <long classpath . -bootclasspath 
C:\apps\jdk1.7.0_51\jre\lib\resources.jar;C:\apps\jdk1.7.0_51\jre\lib\rt.jar;C:\apps\jdk1.7.0_51\jre\lib\sunrsasign.jar;C:\apps\jdk1.7.0_51\jre\lib\jsse.jar;C:\apps\jdk1.7.0_51\jre\lib\jce.jar;C:\apps\jdk1.7.0_51\jre\lib\charsets.jar;C:\apps\jdk1.7.0_51\jre\lib\jfr.jar;C:\apps\jdk1.7.0_51\jre\classes;C:\Users\s80035683\.m2\repository\org\scala-lang\scala-library\2.10.4\scala-library-2.10.4.jar
 -deprecation -feature -unchecked -language:postfixOps
  last tree to typer: 
Literal(Constant(org.apache.spark.sql.catalyst.types.PrimitiveType))
              symbol: null
   symbol definition: null
                 tpe: 
Class(classOf[org.apache.spark.sql.catalyst.types.PrimitiveType])
       symbol owners: 
      context owners: object TestSQLContext -> package test
== Enclosing template or block ==
Template( // val <local TestSQLContext>: <notype> in object TestSQLContext, 
tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
  "org.apache.spark.sql.SQLContext" // parents
  ValDef(
    private
    "_"
    <tpt>
    <empty>
  )
  // 2 statements
  DefDef( // private def readResolve(): Object in object TestSQLContext
    <method> private <synthetic>
    "readResolve"
    []
    List(Nil)
    <tpt> // tree.tpe=Object
    test.this."TestSQLContext" // object TestSQLContext in package test, 
tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
  )
  DefDef( // def <init>(): org.apache.spark.sql.test.TestSQLContext.type in 
object TestSQLContext
    <method>
    "<init>"
    []
    List(Nil)
    <tpt> // tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
    Block( // tree.tpe=Unit
      Apply( // def <init>(sparkContext: org.apache.spark.SparkContext): 
org.apache.spark.sql.SQLContext in class SQLContext, 
tree.tpe=org.apache.spark.sql.SQLContext
        TestSQLContext.super."<init>" // def <init>(sparkContext: 
org.apache.spark.SparkContext): org.apache.spark.sql.SQLContext in class 
SQLContext, tree.tpe=(sparkContext: 
org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
        Apply( // def <init>(master: String,appName: String,conf: 
org.apache.spark.SparkConf): org.apache.spark.SparkContext in class 
SparkContext, tree.tpe=org.apache.spark.SparkContext
          new org.apache.spark.SparkContext."<init>" // def <init>(master: 
String,appName: String,conf: org.apache.spark.SparkConf): 
org.apache.spark.SparkContext in class SparkContext, tree.tpe=(master: String, 
appName: String, conf: org.apache.spark.SparkConf)org.apache.spark.SparkContext
          // 3 arguments
          "local"
          "TestSQLContext"
          Apply( // def <init>(): org.apache.spark.SparkConf in class 
SparkConf, tree.tpe=org.apache.spark.SparkConf
            new org.apache.spark.SparkConf."<init>" // def <init>(): 
org.apache.spark.SparkConf in class SparkConf, 
tree.tpe=()org.apache.spark.SparkConf
            Nil
          )
        )
      )
      ()
    )
  )
)
== Expanded type of tree ==
ConstantType(
  value = Constant(org.apache.spark.sql.catalyst.types.PrimitiveType)
)
uncaught exception during compilation: java.lang.AssertionError

> sbt gen-idea includes both mesos and mesos with shaded-protobuf into 
> dependencies
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-1835
>                 URL: https://issues.apache.org/jira/browse/SPARK-1835
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Xiangrui Meng
>            Priority: Minor
>
> gen-idea includes both mesos-0.18.1 and mesos-0.18.1-shaded-protobuf in the 
> dependencies. This generates a compile error because mesos-0.18.1 comes first 
> and there is no protobuf jar in the dependencies.
> A workaround is to delete mesos-0.18.1.jar manually from the IntelliJ IDEA 
> project. Another solution is to publish the shaded jar as a separate version 
> instead of using a classifier.
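For context, the two solutions mentioned in the issue differ in how the shaded 
artifact is addressed in the build definition. A minimal sbt sketch of the two 
patterns (the coordinates below illustrate the pattern only and are not 
verified against Spark's actual build files):

```scala
// Current approach: same version, distinguished only by a Maven classifier.
// Tools such as sbt gen-idea may resolve BOTH the base artifact and the
// classified one, which is how the duplicate jars end up on the classpath.
libraryDependencies += "org.apache.mesos" % "mesos" % "0.18.1" classifier "shaded-protobuf"

// Alternative suggested in the issue: publish the shaded jar under a distinct
// version string, so only one mesos artifact can ever be resolved.
libraryDependencies += "org.apache.mesos" % "mesos" % "0.18.1-shaded-protobuf"
```

With a distinct version there is no base/classifier pair for dependency 
resolvers to confuse, at the cost of publishing an extra artifact.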



--
This message was sent by Atlassian JIRA
(v6.2#6252)
