ajithme edited a comment on issue #24142: [SPARK-27194][core] Job failures when task attempts do not clean up spark-staging parquet files
URL: https://github.com/apache/spark/pull/24142#issuecomment-474991798
 
 
   @vanzin thanks for the clarification, I was just trying to make sure that deleting really is the 'bad' idea here.
   
   Okay, I see the point. Currently, in `org.apache.spark.internal.io.HadoopMapReduceCommitProtocol#commitJob`, when ``dynamicPartitionOverwrite`` is true, we do this:
   ```
   fs.rename(new Path(stagingDir, part), finalPartPath)
   ```
   This is where we pick up both the old and the new task files, causing duplicates. Would it be better if we could eliminate the stale files left by speculated task attempts from this operation?
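   
   Just to sketch what I mean (rough illustration only, not what the protocol does today; `renameCommittedFilesOnly` and `committedFiles` are made-up names): instead of renaming the whole staging partition directory in one shot, commitJob could move only the files known to come from committed task attempts, assuming the set of committed file names were available (e.g. collected from the task commit messages).
   
   ```
   import org.apache.hadoop.fs.{FileSystem, Path}
   
   // Hypothetical helper, not part of HadoopMapReduceCommitProtocol:
   // move only files recorded by committed task attempts from the staging
   // partition dir into the final partition dir, skipping leftovers from
   // speculated attempts. `committedFiles` is an assumed input.
   def renameCommittedFilesOnly(
       fs: FileSystem,
       stagingPartPath: Path,
       finalPartPath: Path,
       committedFiles: Set[String]): Unit = {
     if (!fs.exists(finalPartPath)) {
       fs.mkdirs(finalPartPath)
     }
     fs.listStatus(stagingPartPath).foreach { status =>
       val name = status.getPath.getName
       if (committedFiles.contains(name)) {
         // Move only files that belong to a committed task attempt.
         fs.rename(status.getPath, new Path(finalPartPath, name))
       }
       // Anything else in the staging partition is treated as a leftover
       // from a speculated/failed attempt and gets dropped when the staging
       // dir is deleted at the end of commitJob.
     }
   }
   ```
   
   The obvious trade-off is that per-file renames mean more filesystem calls than a single directory rename, which could hurt on object stores.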
