Hi Akhil,
  Yes, in build.sbt I had wrongly set scalaVersion to 2.11.6, the Scala
version installed on the cluster, rather than the version Spark was built
against. Fixed now. Thanks!
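
  For the record, here's a minimal sketch of a build.sbt that avoids the
mismatch (the version numbers are illustrative; the point is that
scalaVersion must match the Scala version the Spark distribution was built
against, not the scala installed on the cluster):

name := "TEST2"

version := "1.0"

// Must match the Scala version Spark was compiled with, e.g. 2.10.x
// for the prebuilt Spark 1.4.x downloads -- not the system scala.
scalaVersion := "2.10.4"

// "provided" because spark-submit supplies the Spark jars at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1" % "provided"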

  Cheers,
  Dan
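
P.S. A quick way to check which Scala version a Spark build is compiled
against is to ask its own REPL:

scala> util.Properties.versionString

Run that in spark-shell on the cluster; the scalaVersion in build.sbt
should match what it prints.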


2015-07-27 2:29 GMT-05:00 Akhil Das <ak...@sigmoidanalytics.com>:

> What's in your build.sbt? It seems you may be mixing up the Scala
> versions.
>
> Thanks
> Best Regards
>
> On Fri, Jul 24, 2015 at 2:15 AM, Dan Dong <dongda...@gmail.com> wrote:
>
>> Hi,
>>   When I ran the following simple Spark program with spark-submit:
>> import org.apache.spark.SparkConf
>> import org.apache.spark.SparkContext
>>
>> object TEST2 {
>>   def main(args: Array[String]) {
>>     val conf = new SparkConf().setAppName("TEST")
>>     val sc = new SparkContext(conf)
>>
>>     val list = List(("aa", 1), ("bb", 2), ("cc", 3))
>>     val maps = list.toMap
>>   }
>> }
>>
>> I got a java.lang.NoSuchMethodError on the line "val maps =
>> list.toMap". But in spark-shell, or in plain scala, there is no
>> problem:
>>
>> scala> val list=List(("aa",1),("bb",2),("cc",3))
>> list: List[(String, Int)] = List((aa,1), (bb,2), (cc,3))
>>
>> scala> val maps=list.toMap
>> maps: scala.collection.immutable.Map[String,Int] = Map(aa -> 1, bb -> 2,
>> cc -> 3)
>>
>> So what am I missing in spark-submit for the "toMap" method to work? I
>> compile the program with "sbt package" without any problem. Thanks!
>>
>> Cheers,
>> Dan
>>
>>
>
