From: Sean Owen [mailto:so...@cloudera.com]
Sent: Saturday, December 20, 2014 8:12 AM
To: Haopu Wang
Cc: user@spark.apache.org; Raghavendra Pandey
Subject: RE: Can Spark 1.1.0 save checkpoint to HDFS 2.5.1?
That's exactl…
On Fri, Dec 19, 2014 at 4:05 PM, Haopu Wang wrote:
> My application doesn’t depend on hadoop-client directly.
>
> It only depends on spark-core_2.10, which depends on hadoop-client 1.0.4.
> This can be checked in the Maven repository at
> http://mvnrepository.com/artifact/org.apache.spark/spark-core_2…
> …issue? Thanks for any suggestions.
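[Editor's note: the mismatch Haopu describes — spark-core_2.10's published pom pulling in hadoop-client 1.0.4 against an HDFS 2.x cluster — is typically resolved in the build. A minimal sketch, assuming a Maven build and a cluster running Hadoop 2.5.1, is to declare the matching hadoop-client directly, so Maven's nearest-wins mediation prefers it over the transitive 1.0.4:]

```xml
<!-- Sketch (assumed Maven build): pin hadoop-client to the cluster's version.
     A direct dependency takes precedence over spark-core's transitive 1.0.4. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.5.1</version>
</dependency>
```

[The effective version can then be confirmed with `mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-client`.]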
From: Raghavendra Pandey [mailto:raghavendra.pan...@gmail.com]
Sent: Saturday, December 20, 2014 12:08 AM
To: Sean Owen; Haopu Wang
Cc: user@spark.apache.org
Subject: Re: Can Spark 1.1.0 save checkpoint to HDFS 2.5.1?
It seems there is hadoop 1 somewhere in the path.
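[Editor's note: Pandey's hunch — Hadoop 1 somewhere in the path — can be tested directly. The sketch below is a hypothetical helper, not part of the thread; it asks the JVM which jar actually supplies the Hadoop IPC client class. On a Hadoop 1.x classpath it resolves to a hadoop-core-1.x jar; on Hadoop 2.x, to hadoop-common-2.x.]

```java
// Hypothetical probe: report which jar provides org.apache.hadoop.ipc.Client,
// the IPC client behind the "Server IPC version 9 cannot communicate with
// client version 4" error. Run it with the same classpath as the Spark app.
public class WhichHadoop {
    static String hadoopLocation() {
        java.net.URL loc = WhichHadoop.class.getClassLoader()
                .getResource("org/apache/hadoop/ipc/Client.class");
        return loc == null ? "Hadoop is not on the classpath" : loc.toString();
    }

    public static void main(String[] args) {
        System.out.println(hadoopLocation());
    }
}
```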
On Fri, Dec 19, 2014, 21:24 Sean Owen wrote:
> Yes, but your error indicates that your application is actually using
> Hadoop 1.x of some kind. Check your dependencies, especially
> hadoop-client.
On Fri, Dec 19, 2014 at 2:11 PM, Haopu Wang wrote:
I’m using Spark 1.1.0 built for HDFS 2.4.
My application enables checkpointing (to HDFS 2.5.1) and it builds, but when I
run it, I get the error below:
Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC
version 9 cannot communicate with client version 4
at org.apa…