[
https://issues.apache.org/jira/browse/SPARK-12219?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15049044#comment-15049044
]
Rodrigo Boavida commented on SPARK-12219:
-----------------------------------------
Hi Sean,
I've proceeded as instructed in the documentation. I applied the script, which sets
<scala.version>2.11.7</scala.version>
I also changed the Scala version in the main pom.xml to 2.11.7, which is not
done by the script:
<scala.version>2.11.7</scala.version>
The errors I get when I run: "build/sbt -Pyarn -Phadoop-2.3 -Dscala-2.11
assembly"
[error] /home/spark/sbt_spark-1.5.2/core/src/main/scala/org/apache/spark/rdd/UnionRDD.scala:40: no valid targets for annotation on value rdd - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[error] @transient rdd: RDD[T],
[error]
[error] /home/spark/sbt_spark-1.5.2/core/src/main/scala/org/apache/spark/rdd/UnionRDD.scala:42: no valid targets for annotation on value parentRddPartitionIndex - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[error] @transient parentRddPartitionIndex: Int)
[error]
[error] /home/spark/sbt_spark-1.5.2/core/src/main/scala/org/apache/spark/rdd/PartitionPruningRDD.scala:35: no valid targets for annotation on value partitionFilterFunc - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[error] private[spark] class PruneDependency[T](rdd: RDD[T], @transient partitionFilterFunc: Int => Boolean)
[error]
[error] /home/spark/sbt_spark-1.5.2/core/src/main/scala/org/apache/spark/rdd/PartitionPruningRDD.scala:58: no valid targets for annotation on value prev - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[error] @transient prev: RDD[T],
[error]
[error] /home/spark/sbt_spark-1.5.2/core/src/main/scala/org/apache/spark/rdd/PartitionPruningRDD.scala:59: no valid targets for annotation on value partitionFilterFunc - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[error] @transient partitionFilterFunc: Int => Boolean)
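For context, all five errors are the same Scala 2.11 complaint: a plain @transient on a constructor parameter that is not a field has no valid target, so the annotation is silently dropped (and -Xfatal-warnings promotes the warning to an error). A minimal self-contained analogue of the fix the compiler message itself suggests (class and member names here are mine, not Spark's):

```scala
import scala.annotation.meta.param

// Hypothetical stand-in for the Spark classes in the errors above.
// Writing `@transient data: Seq[Int]` would trigger the
// "no valid targets for annotation" warning on 2.11; the meta-annotation
// form below explicitly targets the constructor parameter, which compiles
// cleanly under -Xfatal-warnings.
class Holder(@(transient @param) data: Seq[Int]) extends Serializable {
  def size: Int = data.size
}

object Demo {
  def main(args: Array[String]): Unit = {
    println(new Holder(Seq(1, 2, 3)).size)
  }
}
```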
> Spark 1.5.2 code does not build on Scala 2.11.7 with SBT assembly
> -----------------------------------------------------------------
>
> Key: SPARK-12219
> URL: https://issues.apache.org/jira/browse/SPARK-12219
> Project: Spark
> Issue Type: Bug
> Components: Build
> Affects Versions: 1.5.2
> Reporter: Rodrigo Boavida
>
> I've tried without success to build Spark on Scala 2.11.7. I'm getting build
> errors using sbt due to the issues discussed in the thread below, from July of
> this year.
> https://mail-archives.apache.org/mod_mbox/spark-dev/201507.mbox/%3CCA+3qhFSJGmZToGmBU1=ivy7kr6eb7k8t6dpz+ibkstihryw...@mail.gmail.com%3E
> Seems some minor fixes are needed to make the Scala 2.11 compiler happy.
> I needed to build with SBT, as suggested in the thread below, to get around an
> apparent Maven shade plugin issue which changed some classes when I switched
> to Akka 2.4.0.
> https://groups.google.com/forum/#!topic/akka-user/iai6whR6-xU
> I've set this bug to Major priority assuming that the Spark community wants
> to keep fully supporting SBT builds, including the Scala 2.11 compatibility.
> Thanks,
> Rod
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)