IMHO, a clean rebuild might help. You can run `sbt/sbt clean` from the
command line. Alternatively, on Linux, you can remove the per-module
target directories directly (I prefer doing this):

    find . -name "target" -type d -exec rm -r {} +
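To illustrate what that `find` invocation does, here is a small
self-contained sketch. The directory names are made up for the demo, and
`-prune` is my addition - it stops find from descending into a directory
it is about to delete:

```shell
# Set up a throwaway tree with two fake module "target" directories
mkdir -p demo/core/target demo/streaming/target demo/core/src
touch demo/core/src/Main.scala

# Remove every "target" directory under demo/; -prune keeps find from
# recursing into directories that are being removed
find demo -name "target" -type d -prune -exec rm -r {} +

# Source files outside target/ are untouched
ls demo/core/src
```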



On Thu, Nov 28, 2013 at 8:48 PM, Nathan Kronenfeld <
[email protected]> wrote:

> Hi, folks.
>
> I'm trying to build the a spark distribution with the latest code.
>
> I started out this morning with:
>
> ./make-distribution.sh
>
>
> and that worked fine. But then I realized I'd forgotten to set the hadoop
> version I needed, so I redid it with
>
> ./make-distribution.sh --hadoop 2.0.0-cdh4.4.0
>
>
> That failed with a whole bunch of error messages (43 to be exact) in
> streaming on the lines of:
>
> ...streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:51:
> type mismatch
> found: org.apache.spark.streaming.DStream[(K, V)]
> expected: org.apache.spark.streaming.api.java.JavaPairDStream[K, V]
> Note: implicit method fromPairDStream is not applicable here because it
> comes after the application point and it lacks an explicit return type.
> dstream.filter(x => f(x).booleanValue())
>
>
> (42 more like that in different places).  So I went back and tried
>
> ./make-distribution.sh
>
>
> again - now it failed with the same errors, though it had just worked a
> moment before. I cleaned up the dist directory - same thing. I logged out
> and back in to reset my environment - same thing.
>
> So though it built fine once, now it refuses to build again.
>
> Does anyone have a clue what is going on here?
>
> Any help very much appreciated,
>                 -Nathan
>
>
> --
> Nathan Kronenfeld
> Senior Visualization Developer
> Oculus Info Inc
> 2 Berkeley Street, Suite 600,
> Toronto, Ontario M5A 4J5
> Phone:  +1-416-203-3003 x 238
> Email:  [email protected]
>


