Hi, thanks for your reply! sbt/sbt clean does not help.
I did the following in the incubator-spark directory and still get the same error as before:

1) sbt/sbt clean
2) SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
3) sbt/sbt compile publish-local

Shing

On Sunday, January 12, 2014 12:32 AM, Patrick Wendell <[email protected]> wrote:

Can you try running "sbt/sbt clean". Sometimes things can get randomly
corrupted and cause stuff like this.

On Sat, Jan 11, 2014 at 12:49 PM, Shing Hing Man <[email protected]> wrote:
> Hi,
> I have checked out the development version of Spark from
> git://github.com/apache/incubator-spark.git
>
> I am trying to compile it with Scala 2.10.3.
>
> The following command completed successfully:
>
> matmsh@gauss:~/Downloads/spark/github/incubator-spark>
> SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
>
> But
>
> matmsh@gauss:~/Downloads/spark/github/incubator-spark> sbt compile
> publish-local
>
> gives the following error:
>
> [info] Compiling 1 Scala source to
> /home/matmsh/Downloads/spark/github/incubator-spark/repl/target/scala-2.10/classes...
> [info] Compiling 8 Scala sources to
> /home/matmsh/Downloads/spark/github/incubator-spark/streaming/target/scala-2.10/classes...
> [error]
> /home/matmsh/Downloads/spark/github/incubator-spark/streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:52:
> type mismatch;
> [error] found   : org.apache.spark.streaming.DStream[(K, V)]
> [error] required: org.apache.spark.streaming.api.java.JavaPairDStream[K,V]
> [error] Note: implicit method fromPairDStream is not applicable here because
> it comes after the application point and it lacks an explicit result type
> [error]     dstream.filter((x => f(x).booleanValue()))
>
> Is there any way to resolve the above issue?
>
> Thanks in advance for your assistance!
>
> Shing
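For context, the compiler note in the log refers to Scala's forward-reference rule: an implicit conversion used textually before its definition is only applicable if it declares an explicit result type. A minimal standalone sketch (hypothetical names, not Spark code) that compiles precisely because the implicit has an explicit result type:

```scala
object ImplicitOrderDemo {
  class Wrapper(val n: Int) { def doubled: Int = n * 2 }

  // This call site appears *before* the implicit's definition below.
  // It compiles only because `toWrapper` declares the explicit result
  // type `: Wrapper`; drop that annotation and scalac reports the same
  // "implicit method ... comes after the application point and it lacks
  // an explicit result type" error seen in JavaPairDStream.scala.
  def demo: Int = (21: Int).doubled

  implicit def toWrapper(n: Int): Wrapper = new Wrapper(n)

  def main(args: Array[String]): Unit =
    println(demo)
}
```

So one thing worth checking is whether the checked-out source's implicit (fromPairDStream) has lost its explicit result type, or whether a stale compiled class is being picked up from a previous build.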
