Did you try standalone mode? You may not see serialization issues in
local threaded mode.
Serialization errors are unlikely to be caused by the MapR Hadoop version.
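
For example, a quick check is to point the same job at your standalone
master instead of local mode (a minimal sketch; the master URL and the
maprfs path are placeholders):

  val conf = new org.apache.spark.SparkConf()
    .setAppName("serialization-check")
    .setMaster("spark://<master-host>:7077") // local mode runs tasks in-process, so closures are never shipped
  val sc = new org.apache.spark.SparkContext(conf)
  sc.textFile("maprfs:///some/path").count() // forces tasks to be serialized out to executors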
Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Mon, May 26, 2014 at 3:18 PM, nelson <nelson.verd...@ysance.com> wrote:

> Hi all,
>
> I am trying to run Spark on a MapR cluster. I successfully ran several
> custom applications on a previous non-MapR Hadoop cluster, but I can't get
> them working on the MapR one. To be more specific, I am not able to read
> from or write to MFS without running into a serialization error from Java.
> Note that everything works fine when I run the app in local mode, which
> makes me think of a dependency error.
>
> The test application is built using sbt with the following dependency:
>  - org.apache.spark spark-core 0.9.1
>
> In my test_app/lib directory I have:
>  - hadoop-core-1.0.3-mapr-3.0.2.jar
>  - json-20140107.jar
>  - maprfs-1.0.3-mapr-3.0.2.jar
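>
> For reference, my build.sbt is essentially this (a sketch; sbt treats
> the jars in lib/ as unmanaged dependencies and picks them up
> automatically):
>
>   // build.sbt
>   libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"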
>
> Finally, I add those jars with conf.setJars so that they are distributed
> across the cluster.
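>
> Roughly (a sketch of what I do; the paths are illustrative):
>
>   val conf = new org.apache.spark.SparkConf()
>     .setAppName("test-app")
>     .setJars(Seq("lib/hadoop-core-1.0.3-mapr-3.0.2.jar",
>                  "lib/json-20140107.jar",
>                  "lib/maprfs-1.0.3-mapr-3.0.2.jar"))
>   val sc = new org.apache.spark.SparkContext(conf)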
>
> Am I compiling with the wrong dependencies? Should I get a "MapR version"
> of spark-core?
>
> Regards, Nelson
