[ https://issues.apache.org/jira/browse/SPARK-14247?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15216672#comment-15216672 ]
Sean Owen commented on SPARK-14247:
-----------------------------------
hadoop-core is the old artifact name from the pre-YARN, Hadoop 1 days. CDH still
publishes a build that works with that older set of artifacts -- commonly
called the "MapReduce 1" artifacts. You want to refer to hadoop-common instead,
and use the artifact versions without "-mr1-". That alone may solve it, since
the -mr1- build is actually set up to support an old set of dependencies.
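For example, a build invocation along these lines (a sketch only; the exact CDH
patch version is illustrative and should match your cluster):

    sbt -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.5 -DskipTests assembly

Note there is no "-mr1-" in the version string.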
> Spark does not compile with CDH-5.4.x due to a possible bug in Ivy
> ------------------------------------------------------------------
>
> Key: SPARK-14247
> URL: https://issues.apache.org/jira/browse/SPARK-14247
> Project: Spark
> Issue Type: Bug
> Components: Build
> Affects Versions: 1.6.0
> Reporter: Nan Zhu
> Priority: Minor
>
> I recently tried to compile Spark against CDH 5.4.x with the following command:
> sbt -Phadoop-2.6 -Dhadoop.version=2.6.0-mr1-cdh5.4.5 -DskipTests assembly
> The build fails with the error: [error] impossible to get artifacts
> when data has not been loaded. IvyNode = org.slf4j#slf4j-api;1.6.1
> It seems that CDH depends on slf4j 1.6.x, while Spark upgraded to 1.7.x
> long ago, and during compilation slf4j 1.6.x is unexpectedly evicted
> (see the related discussion in https://github.com/sbt/sbt/issues/1598).
> I currently work around this by downgrading to slf4j 1.6.1.
> What surprises me is that I was upgrading from Spark 1.5.0 to 1.6.x; with
> 1.5.0, I could successfully compile Spark against the same version of CDH.
>
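If pinning slf4j really is necessary, a minimal sketch of the reporter's
workaround in a plain sbt build, using sbt's dependencyOverrides setting
(illustrative only; the actual Spark build manages dependency versions through
its Maven poms):

    dependencyOverrides += "org.slf4j" % "slf4j-api" % "1.6.1"

This asks the resolver to keep slf4j-api 1.6.1 rather than evicting it in
favor of 1.7.x.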
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)