[ https://issues.apache.org/jira/browse/SPARK-15849?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15332034#comment-15332034 ]
Sandeep commented on SPARK-15849:
---------------------------------

Thanks Thomas for that comment. I have verified both things with the direct committer:
1. The inconsistency issue no longer occurs.
2. I see a 2x speedup too.

Looking forward to the fix landing directly in Hadoop, so that the knob doesn't have to be set explicitly.

> FileNotFoundException on _temporary while doing saveAsTable to S3
> -----------------------------------------------------------------
>
> Key: SPARK-15849
> URL: https://issues.apache.org/jira/browse/SPARK-15849
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.1
> Environment: AWS EC2 with Spark on YARN and S3 storage
> Reporter: Sandeep
>
> When submitting Spark jobs to a YARN cluster, I occasionally see these error messages while doing saveAsTable. I have tried this with spark.speculation=false and get the same error. These errors are similar to SPARK-2984, but my jobs are writing to S3 (s3n):
>
> Caused by: java.io.FileNotFoundException: File s3n://xxxxxxx/_temporary/0/task_201606080516_0004_m_000079 does not exist.
>     at org.apache.hadoop.fs.s3native.NativeS3FileSystem.listStatus(NativeS3FileSystem.java:506)
>     at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:360)
>     at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.commitJob(FileOutputCommitter.java:310)
>     at org.apache.parquet.hadoop.ParquetOutputCommitter.commitJob(ParquetOutputCommitter.java:46)
>     at org.apache.spark.sql.execution.datasources.BaseWriterContainer.commitJob(WriterContainer.scala:230)
>     at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelation.scala:151)
>     ... 42 more
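For readers hitting the same error: the "knob" discussed above refers to switching away from the default FileOutputCommitter, whose commit phase relies on renaming files out of `_temporary`, which is slow and unreliable on S3. A minimal configuration sketch, assuming Spark 1.6-era property names (the direct Parquet committer class was version-specific and was later removed, so verify the class path against your Spark build):

```
# spark-defaults.conf (or pass via --conf to spark-submit)

# Use the v2 commit algorithm, which commits task output directly to the
# destination instead of merging _temporary directories at job commit.
spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version  2

# For Parquet writes on Spark 1.6, a direct committer can bypass
# _temporary entirely (class path is an assumption; check your version).
spark.sql.parquet.output.committer.class  org.apache.spark.sql.parquet.DirectParquetOutputCommitter
```

Note that direct committers are not safe with speculative execution or task retries, since partial output from a failed attempt is not cleaned up; keep `spark.speculation=false` when using them.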