jdk: 1.8.0_77
scala: 2.10.4
mvn: 3.3.9

I slightly changed the pom.xml:
$ diff pom.xml pom.original 
130c130
<     <hadoop.version>2.6.0-cdh5.7.0-SNAPSHOT</hadoop.version>
---
>     <hadoop.version>2.2.0</hadoop.version>
133c133
<     <hbase.version>1.2.0-cdh5.7.0-SNAPSHOT</hbase.version>
---
>     <hbase.version>0.98.7-hadoop2</hbase.version>
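(Side note, in case dependency resolution also fails: the -cdh SNAPSHOT artifacts normally resolve from Cloudera's repository, which is not in the stock pom. A sketch of the entry, assuming the standard Cloudera repo URL:)

```xml
<!-- Assumed addition under <repositories> in pom.xml; not part of the
     original diff above. -->
<repository>
  <id>cloudera</id>
  <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
```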


command: build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.6.0 -DskipTests clean package

error: 
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-core_2.10 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 21 resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-core_2.10 ---
[INFO] Using zinc server for incremental compilation
[info] Compiling 486 Scala sources and 76 Java sources to /home/jfeng/workspace/spark-1.6.0/core/target/scala-2.10/classes...
[error] /home/jfeng/workspace/spark-1.6.0/core/src/main/scala/org/apache/spark/TestUtils.scala:22: object StandardCharsets is not a member of package java.nio.charset
[error] import java.nio.charset.StandardCharsets
[error]        ^
[error] /home/jfeng/workspace/spark-1.6.0/core/src/main/scala/org/apache/spark/TestUtils.scala:23: object file is not a member of package java.nio
[error] import java.nio.file.Paths
[error]                 ^
[error] /home/jfeng/workspace/spark-1.6.0/core/src/main/scala/org/apache/spark/TestUtils.scala:80: not found: value StandardCharsets
[error]       ByteStreams.copy(new ByteArrayInputStream(v.getBytes(StandardCharsets.UTF_8)), jarStream)
[error]                                                            ^
[error] /home/jfeng/workspace/spark-1.6.0/core/src/main/scala/org/apache/spark/TestUtils.scala:95: not found: value Paths
[error]       val jarEntry = new JarEntry(Paths.get(directoryPrefix.getOrElse(""), file.getName).toString)
[error]                                   ^
[error] /home/jfeng/workspace/spark-1.6.0/core/src/main/scala/org/apache/spark/launcher/LauncherBackend.scala:43: value getLoopbackAddress is not a member of object java.net.InetAddress
[error]       val s = new Socket(InetAddress.getLoopbackAddress(), port.get)
[error] 
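(For what it's worth: all three missing symbols — java.nio.charset.StandardCharsets, java.nio.file, and InetAddress.getLoopbackAddress — were introduced in JDK 7, so errors like these usually mean the compile is actually running on a JDK 6, not the 1.8.0_77 listed above, e.g. via JAVA_HOME or a stale zinc server. A minimal check, with a hypothetical class name, that the JDK doing the compiling really has these APIs:)

```java
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;
import java.nio.file.Paths;

// Hypothetical check class: the three symbols below are exactly the
// JDK 7+ APIs that the scalac errors above cannot resolve. If this
// compiles and runs, the JDK in use is new enough.
public class Jdk7Check {
    public static void main(String[] args) {
        System.out.println(StandardCharsets.UTF_8.name()); // prints UTF-8
        System.out.println(Paths.get("tmp", "x.jar").getFileName());
        System.out.println(InetAddress.getLoopbackAddress().getHostAddress());
    }
}
```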

Thanks in advance.


