sorry typo, I meant *without* the addJar
On 14 December 2015 at 11:13, Jakob Odersky wrote:
> Sorry, I'm late. I tried again and again; now I use Spark 1.4.0 and Hadoop
> 2.4.1, but I still find something strange, like this:
>
> http://apache-spark-user-list.1001560.n3.nabble.com/worker-java-lang-ClassNotFoundException-ttt-test-anonfun-1-td25696.html
> (If I use "textFile", it can't run.)
Btw, Spark 1.5 comes with support for Hadoop 2.2 by default.
On 11 December 2015 at 03:08, Bonsen wrote:
> Thank you! I found the problem: my package is "test", but I wrote "package
> org.apache.spark.examples", and IDEA had imported
> spark-examples-1.5.2-hadoop2.6.0.jar, so I can run it, and
It looks like you have an issue with your classpath. I think it is because
you add a jar containing Spark twice: first, you have a dependency on Spark
somewhere in your build tool (this lets you compile and run your
application); second, you re-add Spark here:
> sc.addJar("/home/hadoop/spark-a
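The usual way to avoid pulling Spark onto the classpath twice is to declare it as a "provided" dependency in the build and drop the addJar call entirely, since the cluster already supplies Spark at runtime. A minimal build.sbt sketch (project name and Scala version are assumptions; the Spark version is taken from the thread):

```scala
// build.sbt (sketch): Spark marked "provided" so it is supplied by the
// cluster at runtime and never bundled with, or re-added to, the
// application jar via sc.addJar.
name := "my-spark-app"      // hypothetical project name

scalaVersion := "2.10.5"    // assumed; match the Scala build of your Spark

libraryDependencies +=
  "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"
```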
Could you provide some more context? What is rawData?
On 10 December 2015 at 06:38, Bonsen wrote:
> I do like this: "val secondData = rawData.flatMap(_.split("\t").take(3))"
>
> and I find:
> 15/12/10 22:36:55 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0,
> 219.216.65.129): java.lang.ClassCastException: java.lang.String cannot be
> cast to java.lang.Integer
>         at scala.runtime.BoxesRunTime.u
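For what it's worth, split("\t").take(3) produces Strings, so a String-to-Integer ClassCastException like the one above usually means a later step treats those fields as Ints without converting them. A minimal plain-Scala sketch (no Spark; the input line is made up for illustration) of the difference:

```scala
// Sketch: split("\t") yields Array[String]; arithmetic on the fields
// needs an explicit conversion such as toInt, otherwise treating them
// as Integers fails at runtime.
object SplitExample {
  def main(args: Array[String]): Unit = {
    val line = "1\t2\t3\textra" // hypothetical tab-separated record

    // What the original flatMap body produces per line:
    // the first three fields, still as Strings.
    val fields: Array[String] = line.split("\t").take(3)
    println(fields.mkString(",")) // 1,2,3

    // Converting explicitly avoids the String -> Integer cast error.
    val numbers: Array[Int] = fields.map(_.toInt)
    println(numbers.sum) // 6
  }
}
```

Note also that flatMap flattens the per-line arrays, so secondData is a collection of individual fields rather than one triple per line; if you wanted one Array per line, map would be the fit instead.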