[
https://issues.apache.org/jira/browse/SPARK-15849?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15324993#comment-15324993
]
Sean Owen commented on SPARK-15849:
-----------------------------------
If I'm right, I don't know of a solution other than not writing directly to S3.
That's often quite viable: do the work on a separate distributed file system
like HDFS and treat S3 as an import/export layer.
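Roughly, that pattern looks like the sketch below (paths, bucket, and table
names are hypothetical, and it assumes a DataFrame df and SparkContext sc from
an existing session); the point is that the rename-based commit happens on
HDFS, and the copy to S3 is a plain one-shot export afterwards:

    // Sketch of the workaround above (paths, bucket, and table name are
    // hypothetical; df and sc are assumed to come from a Spark shell session).
    import org.apache.hadoop.fs.{FileUtil, Path}

    // 1. Commit the table on HDFS, where the rename-based FileOutputCommitter
    //    is reliable.
    df.write
      .format("parquet")
      .option("path", "hdfs:///warehouse/my_table")   // hypothetical location
      .saveAsTable("my_table")

    // 2. Export the already-committed files to S3 as a separate step
    //    (hadoop distcp or s3-dist-cp would do the same at larger scale).
    val conf = sc.hadoopConfiguration
    val src = new Path("hdfs:///warehouse/my_table")
    val dst = new Path("s3n://my-bucket/export/my_table")  // hypothetical bucket
    FileUtil.copy(src.getFileSystem(conf), src, dst.getFileSystem(conf), dst,
      false /* deleteSource */, conf)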
> FileNotFoundException on _temporary while doing saveAsTable to S3
> -----------------------------------------------------------------
>
> Key: SPARK-15849
> URL: https://issues.apache.org/jira/browse/SPARK-15849
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.1
> Environment: AWS EC2 with spark on yarn and s3 storage
> Reporter: Sandeep
>
> When submitting Spark jobs to a YARN cluster, I occasionally see these error
> messages while doing saveAsTable. I have tried this with
> spark.speculation=false and get the same error. These errors are similar to
> SPARK-2984, but my jobs are writing to S3 (s3n):
> Caused by: java.io.FileNotFoundException: File s3n://xxxxxxx/_temporary/0/task_201606080516_0004_m_000079 does not exist.
>   at org.apache.hadoop.fs.s3native.NativeS3FileSystem.listStatus(NativeS3FileSystem.java:506)
>   at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:360)
>   at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.commitJob(FileOutputCommitter.java:310)
>   at org.apache.parquet.hadoop.ParquetOutputCommitter.commitJob(ParquetOutputCommitter.java:46)
>   at org.apache.spark.sql.execution.datasources.BaseWriterContainer.commitJob(WriterContainer.scala:230)
>   at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelation.scala:151)
>   ... 42 more
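For context, the quoted scenario boils down to roughly the following kind of
write (DataFrame, table, and bucket names are hypothetical); the stack trace
shows the failure surfacing when the default FileOutputCommitter merges the
job's _temporary output at commit time:

    // Hypothetical sketch of the reported pattern on Spark 1.6.x: saveAsTable
    // to a datasource table whose location is on s3n://.
    val df = sqlContext.read.parquet("hdfs:///input/events")  // hypothetical input
    df.write
      .format("parquet")
      .option("path", "s3n://my-bucket/warehouse/events")     // hypothetical bucket
      .saveAsTable("events")
    // At job commit, FileOutputCommitter.commitJob lists and merges the
    // s3n://.../_temporary/0/task_* directories; the FileNotFoundException
    // above comes from that listStatus call.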