Hi,

  I have a simple Spark test program, shown below. The strange thing is that
it runs fine in spark-shell, but fails at runtime under spark-submit with

java.lang.NoSuchMethodError:

pointing at this line as the problem:

val maps2 = maps.collect.toMap

But why does it compile without any problem, and work fine in
spark-shell (==> maps2: scala.collection.immutable.Map[Int,String] =
Map(269953 -> once, 97 -> a, 451002 -> upon, 117481 -> was, 226916 ->
there, 414413 -> time, 146327 -> king)), yet fail in spark-submit? Thanks!

import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.mllib.feature.HashingTF
import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.rdd.RDD


val docs = sc.parallelize(Array(Array("once", "upon", "a", "time"),
  Array("there", "was", "a", "king")))

val hashingTF = new HashingTF()

// map each term to a (hash index, term) pair
val maps = docs.flatMap { terms => terms.map(ele => (hashingTF.indexOf(ele), ele)) }

// collect returns Array[(Int, String)]; toMap builds an immutable Map from it
val maps2 = maps.collect.toMap
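For reference, the standalone version I package for spark-submit looks roughly like the sketch below (the object name and the explicit SparkConf/SparkContext setup are my own scaffolding, since spark-submit does not provide a pre-built sc the way spark-shell does):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.feature.HashingTF

// Minimal standalone sketch of the same logic, for use with spark-submit.
// The app name "HashingTFTest" is a placeholder; the master is expected to
// be supplied on the spark-submit command line.
object HashingTFTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HashingTFTest")
    val sc = new SparkContext(conf)

    val docs = sc.parallelize(Array(Array("once", "upon", "a", "time"),
      Array("there", "was", "a", "king")))

    val hashingTF = new HashingTF()

    // map each term to a (hash index, term) pair
    val maps = docs.flatMap(terms => terms.map(ele => (hashingTF.indexOf(ele), ele)))

    // collect returns Array[(Int, String)]; toMap builds an immutable Map
    val maps2 = maps.collect.toMap
    println(maps2)

    sc.stop()
  }
}
```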


Cheers,

Dan
