Can you check permissions etc.? I am able to run r.saveAsTextFile("file:///home/cloudera/tmp/out1") successfully on my machine.
On Fri, Jan 9, 2015 at 10:25 AM, NingjunWang <ningjun.w...@lexisnexis.com> wrote:
> I try to save an RDD as a text file to the local file system (Linux), but it does not work.
>
> Launch spark-shell and run the following:
>
>     val r = sc.parallelize(Array("a", "b", "c"))
>     r.saveAsTextFile("file:///home/cloudera/tmp/out1")
>
> IOException: Mkdirs failed to create
> file:/home/cloudera/tmp/out1/_temporary/0/_temporary/attempt_201501082027_0003_m_000000_47
> (exists=false, cwd=file:/var/run/spark/work/app-20150108201046-0021/0)
>     at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:442)
>     at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:428)
>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:908)
>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:801)
>     at org.apache.hadoop.mapred.TextOutputFormat.getRecordWriter(TextOutputFormat.java:123)
>     at org.apache.spark.SparkHadoopWriter.open(SparkHadoopWriter.scala:90)
>     at org.apache.spark.rdd.PairRDDFunctions$$anonfun$13.apply(PairRDDFunctions.scala:1056)
>     at org.apache.spark.rdd.PairRDDFunctions$$anonfun$13.apply(PairRDDFunctions.scala:1047)
>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
>     at org.apache.spark.scheduler.Task.run(Task.scala:56)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:745)
>
> I also tried with four slashes but still get the same error:
>
>     r.saveAsTextFile("file:////home/cloudera/tmp/out1")
>
> Please advise.
>
> Ningjun
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Failed-to-save-RDD-as-text-file-to-local-file-system-tp21050.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
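One thing worth noting: saveAsTextFile runs on the executors, so with a `file://` URI each worker process must be able to create the output directory on its own local disk (note the cwd in the stack trace is the executor's work directory, /var/run/spark/work/...). On a cluster this often fails exactly like this even when the path is writable on the driver machine. If the data is small, a common workaround is to collect it to the driver and write it there. Below is a minimal sketch of that idea; the output path /tmp/out1.txt is a made-up example, and a plain Seq stands in for the collected RDD so the snippet runs outside Spark (in spark-shell you would use `val lines = r.collect()` instead):

```scala
import java.nio.file.{Files, Paths}
import java.nio.charset.StandardCharsets

// In spark-shell this would be: val lines = r.collect()
val lines = Seq("a", "b", "c")

// Hypothetical output path on the driver's local file system
val out = Paths.get("/tmp/out1.txt")

// Write one element per line, mirroring saveAsTextFile's text layout
Files.write(out, lines.mkString("\n").getBytes(StandardCharsets.UTF_8))
```

This only makes sense for data that fits in driver memory; for large output, writing to a shared file system (HDFS, NFS mounted identically on all workers) avoids the per-executor Mkdirs problem entirely.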