Hi, all.
I wrote two programs: A.scala and B.scala.
A.scala writes trained model to HDFS with:
_wcount_rdd.saveAsObjectFile(save_path)
I checked with the command hadoop fs -ls $save_path and found that the
directory exists:
[root@gd39 spark-0.8.0-incubating] # hadoop fs -ls
/spark_test_data/bayes_model/test_model/_wcount
Found 6 items
-rw-r--r-- 1 root supergroup 0 2013-11-20 09:51
/spark_test_data/bayes_model/test_model/_wcount/_SUCCESS
-rw-r--r-- 1 root supergroup 1086436 2013-11-20 09:51
/spark_test_data/bayes_model/test_model/_wcount/part-00000
-rw-r--r-- 1 root supergroup 1074172 2013-11-20 09:51
/spark_test_data/bayes_model/test_model/_wcount/part-00001
-rw-r--r-- 1 root supergroup 1079387 2013-11-20 09:51
/spark_test_data/bayes_model/test_model/_wcount/part-00002
-rw-r--r-- 1 root supergroup 1078491 2013-11-20 09:51
/spark_test_data/bayes_model/test_model/_wcount/part-00003
-rw-r--r-- 1 root supergroup 1077765 2013-11-20 09:51
/spark_test_data/bayes_model/test_model/_wcount/part-00004
B.scala reads data from $save_path with:
val _wcount_rdd = sc.objectFile(model_path, 5)
The problem is: even though the listing above shows the directory exists, running B.scala throws the following exception:
Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException:
Input path does not exist:
hdfs://gd39:9000/spark_test_data/bayes_model/test_model/_wcount
at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
at org.apache.hadoop.mapred.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:40)
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:70)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:199)
at org.apache.spark.rdd.FlatMappedRDD.getPartitions(FlatMappedRDD.scala:29)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:199)
at org.apache.spark.rdd.RDD.take(RDD.scala:766)
at org.apache.spark.rdd.RDD.first(RDD.scala:780)
at BayesClassify$.main(BayesClassify.scala:95)
at BayesClassify.main(BayesClassify.scala)
Could anyone give me some advice?
Sincerely,
Yang, Qiang