Thank you, and I found the problem: my package is test, but I had written "package org.apache.spark.examples", and IDEA had imported spark-examples-1.5.2-hadoop2.6.0.jar, so the code ran against that jar and caused a lot of problems....
______________________________________________________________________
Now I have changed the package like this:


package test

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object test {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("mytest").setMaster("spark://Master:7077")
    val sc = new SparkContext(conf)

    sc.addJar("/home/hadoop/spark-assembly-1.5.2-hadoop2.6.0.jar") // This does not work!

    val rawData = sc.textFile("/home/hadoop/123.csv")
    val secondData = rawData.flatMap(_.split(","))   // split each line on commas
    println(secondData.first)                        // line 32 in my file
    sc.stop()
  }
}

It produces this error:
15/12/11 18:41:06 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 219.216.65.129): java.lang.ClassNotFoundException: test.test$$anonfun$1
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
    ...

________
//  219.216.65.129 is my worker machine.
//  I can connect to my worker machine.
//  Spark starts successfully.
//  addFile doesn't work either; the temporary file also disappears.
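
From what I have read, the jar that needs to reach the workers is my own application jar (the one that contains test.test and its anonymous function classes), not the spark-assembly jar, which only contains Spark itself. Is the idea something like the sketch below? (/home/hadoop/mytest.jar is a made-up path for the jar my project would be packaged into.)

package test

import org.apache.spark.{SparkConf, SparkContext}

object test {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("mytest")
      .setMaster("spark://Master:7077")
      // made-up path: the jar built from THIS project, so the workers
      // can load test.test and its closures (test.test$$anonfun$1)
      .setJars(Seq("/home/hadoop/mytest.jar"))

    val sc = new SparkContext(conf)
    val rawData = sc.textFile("/home/hadoop/123.csv")
    println(rawData.flatMap(_.split(",")).first())
    sc.stop()
  }
}

Or would submitting with spark-submit --class test.test --master spark://Master:7077 /home/hadoop/mytest.jar do the same thing?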








At 2015-12-10 22:32:21, "Himanshu Mehra [via Apache Spark User List]" 
<ml-node+s1001560n25667...@n3.nabble.com> wrote:
You are trying to print an array, and it will print the array's object ID if the input is the same as what you have shown here. Try flatMap() instead of map() and check whether the problem is the same.

   --Himanshu
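
My reading of that suggestion, as a small sketch (the sample line "1,2,3" is only for illustration):

val lines  = sc.parallelize(Seq("1,2,3"))
val mapped = lines.map(_.split(","))      // RDD[Array[String]]; printing an element shows the array's object ID
val flat   = lines.flatMap(_.split(","))  // RDD[String] with elements "1", "2", "3"
println(flat.first())                     // prints "1"

With map I get one Array[String] per line and printing it shows something like [Ljava.lang.String;@..., while flatMap flattens the arrays into individual String elements.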



