Did you run sbt publish-local at some point, by any chance? Maybe it's 
picking up another build of Spark from somewhere. In that case, delete that 
build from your local .m2 and .ivy2 directories.
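
For example, assuming the default Maven and Ivy locations (adjust the paths 
if you've changed them), something like:

    rm -rf ~/.m2/repository/org/apache/spark
    rm -rf ~/.ivy2/local/org.apache.spark ~/.ivy2/cache/org.apache.spark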

Matei

On Jan 22, 2014, at 12:14 PM, Manoj Samel <[email protected]> wrote:

> Did this on a fresh install; still the same error.
> 
> 
> On Wed, Jan 22, 2014 at 10:46 AM, Matei Zaharia <[email protected]> wrote:
> Try running sbt clean before rebuilding.
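> 
> For example, from the Spark source root (Spark 0.8.x ships an in-tree 
> launcher):
> 
>     sbt/sbt clean
> 
> and then rerun make-distribution.sh.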
> 
> Matei
> 
> On Jan 22, 2014, at 10:22 AM, Manoj Samel <[email protected]> wrote:
> 
>> See the thread below. Reposted as a compilation-error thread.
>> 
>> ---------- Forwarded message ----------
>> From: Manoj Samel <[email protected]>
>> Date: Wed, Jan 22, 2014 at 10:20 AM
>> Subject: Re: make-distribution.sh error org.apache.hadoop#hadoop-client;2.0.0: not found
>> To: [email protected]
>> 
>> 
>> I think I found the way to get past the error "unresolved dependency: 
>> org.apache.hadoop#hadoop-client;2.0.0: not found".
>> 
>> On my machine, the command "hadoop version" gives "2.0.0-cdh4.5.0", not 
>> "2.0.0".
>> 
>> After I changed SPARK_HADOOP_VERSION to 2.0.0-cdh4.5.0, the build proceeds 
>> past that point.
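>> 
>> Concretely, the invocation was along these lines, run from the Spark source 
>> root, with the version string taken from "hadoop version":
>> 
>>     SPARK_HADOOP_VERSION=2.0.0-cdh4.5.0 ./make-distribution.sh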
>> 
>> However, now it fails with multiple compilation errors in 
>> /.../streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala
>> 
>> [error] /data/spark/spark-0.8.1-incubating/streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:605: type mismatch;
>> [error]  found   : org.apache.spark.streaming.DStream[(K, (V, com.google.common.base.Optional[W]))]
>> [error]  required: org.apache.spark.streaming.api.java.JavaPairDStream[K,(V, com.google.common.base.Optional[W])]
>> [error]  Note: implicit method fromPairDStream is not applicable here because it comes after the application point and it lacks an explicit result type
>> [error]     joinResult.mapValues{case (v, w) => (v, JavaUtils.optionToOptional(w))}
>> [error]                         ^
>> [error] /data/spark/spark-0.8.1-incubating/streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:617: type mismatch;
>> [error]  found   : org.apache.spark.streaming.DStream[(K, (com.google.common.base.Optional[V], W))]
>> [error]  required: org.apache.spark.streaming.api.java.JavaPairDStream[K,(com.google.common.base.Optional[V], W)]
>> [error]  Note: implicit method fromPairDStream is not applicable here because it comes after the application point and it lacks an explicit result type
>> [error]     joinResult.mapValues{case (v, w) => (JavaUtils.optionToOptional(v), w)}
>> [error]                         ^
>> [error] /data/spark/spark-0.8.1-incubating/streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:632: type mismatch;
>> [error]  found   : org.apache.spark.streaming.DStream[(K, (com.google.common.base.Optional[V], W))]
>> [error]  required: org.apache.spark.streaming.api.java.JavaPairDStream[K,(com.google.common.base.Optional[V], W)]
>> [error]  Note: implicit method fromPairDStream is not applicable here because it comes after the application point and it lacks an explicit result type
>> [error]     joinResult.mapValues{case (v, w) => (JavaUtils.optionToOptional(v), w)}
>> [error]                         ^
>> [error] /data/spark/spark-0.8.1-incubating/streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:647: type mismatch;
>> [error]  found   : org.apache.spark.streaming.DStream[(K, (com.google.common.base.Optional[V], W))]
>> [error]  required: org.apache.spark.streaming.api.java.JavaPairDStream[K,(com.google.common.base.Optional[V], W)]
>> [error]  Note: implicit method fromPairDStream is not applicable here because it comes after the application point and it lacks an explicit result type
>> [error]     joinResult.mapValues{case (v, w) => (JavaUtils.optionToOptional(v), w)}
>> [error]                         ^
>> [error] 43 errors found
>> [error] (streaming/compile:compile) Compilation failed
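>> 
>> As an aside, the "comes after the application point" note refers to a Scala 
>> rule: an implicit method defined later in the same file is only applicable 
>> at an earlier use site if it has an explicit result type. A minimal sketch 
>> of the pattern (hypothetical code, not Spark's actual source):
>> 
>>     object Sketch {
>>       val early: String = 42  // ok only because conv declares ": String"
>>       implicit def conv(n: Int): String = n.toString
>>       // Without the explicit ": String" on conv, the line above would fail
>>       // with exactly this "comes after the application point" message.
>>     }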
>> 
>> 
>> 
>> 
>> On Wed, Jan 22, 2014 at 10:09 AM, Manoj Samel <[email protected]> wrote:
>> Hi,
>> 
>> On my cluster I have CDH 4.5 (Hadoop 2.0.0) installed. I installed Spark 
>> 0.8.1 and tried to run make-distribution.sh after setting 
>> SPARK_HADOOP_VERSION=2.0.0.
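>> 
>> That is, roughly this, from the Spark source root:
>> 
>>     SPARK_HADOOP_VERSION=2.0.0 ./make-distribution.sh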
>> 
>> I get:
>> 
>> [warn]   http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client/2.0.0/hadoop-client-2.0.0.pom
>> [info] Resolving org.apache.derby#derby;10.4.2.0 ...
>> [warn]       ::::::::::::::::::::::::::::::::::::::::::::::
>> [warn]       ::          UNRESOLVED DEPENDENCIES         ::
>> [warn]       ::::::::::::::::::::::::::::::::::::::::::::::
>> [warn]       :: org.apache.hadoop#hadoop-client;2.0.0: not found
>> [warn]       ::::::::::::::::::::::::::::::::::::::::::::::
>> sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-client;2.0.0: not found
>>      at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:214)
>>      at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:122)
>>      at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:121)
>>      at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:117)
>>      at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:117)
>> 
>> Any thoughts?
>> 
>> 
> 
> 
