GitHub user szhem commented on the issue:

    https://github.com/apache/spark/pull/19294
  
    @mridulm Regarding FileFormatWriter, I've implemented some basic tests which show that:
    
    1. [FileFormatWriter fails](https://github.com/apache/spark/blob/3f958a99921d149fb9fdf7ba7e78957afdad1405/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/FileFormatWriter.scala#L118) even before `setupJob` on the committer is called [if the path is null](https://github.com/apache/spark/pull/19294/files#diff-bc98a3d91cf4f95f4f473146400044aaR40) (a small sketch of this failure mode follows the list)
    
           FileOutputFormat.setOutputPath(job, new Path(outputSpec.outputPath))
    
    2. [FileFormatWriter succeeds](https://github.com/apache/spark/pull/19294/files#diff-bc98a3d91cf4f95f4f473146400044aaR70) in case of default partitioning [when customPath is not defined](https://github.com/apache/spark/blob/3f958a99921d149fb9fdf7ba7e78957afdad1405/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/FileFormatWriter.scala#L501) (the second branch of the `if` statement)
    
            val currentPath = if (customPath.isDefined) {
              committer.newTaskTempFileAbsPath(taskAttemptContext, customPath.get, ext)
            } else {
              committer.newTaskTempFile(taskAttemptContext, partDir, ext)
            }
    
    3. [FileFormatWriter succeeds](https://github.com/apache/spark/pull/19294/files#diff-bc98a3d91cf4f95f4f473146400044aaR107) in case of custom partitioning [when customPath is defined](https://github.com/apache/spark/blob/3f958a99921d149fb9fdf7ba7e78957afdad1405/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/FileFormatWriter.scala#L499) (the first branch of the same `if` statement; a sketch of how a test can observe which branch is taken also follows the list)
    
            val currentPath = if (customPath.isDefined) {
              committer.newTaskTempFileAbsPath(taskAttemptContext, customPath.get, ext)
            } else {
              committer.newTaskTempFile(taskAttemptContext, partDir, ext)
            }
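
    For point 1 the failure happens before the committer is involved at all: constructing the Hadoop `Path` from a null string already throws. A minimal sketch of that behaviour (not the PR's test code; the exact exception type is from memory, so treat it as an assumption):

        import org.apache.hadoop.fs.Path
        import scala.util.control.NonFatal

        // A null output path fails at Path construction itself, i.e. before
        // FileOutputFormat.setOutputPath and before committer.setupJob can run.
        try {
          new Path(null: String)
        } catch {
          case NonFatal(e) => println(s"fails before any committer call: $e")
        }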
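
    For points 2 and 3, what a test really needs to observe is which of the two branches gets taken. A rough sketch of one way to do that (this is not the code in this PR; the class name, the 2-argument constructor and the `spark.sql.sources.commitProtocolClass` plumbing are assumptions for illustration):

        import java.util.concurrent.atomic.AtomicInteger

        import org.apache.hadoop.mapreduce.TaskAttemptContext
        import org.apache.spark.internal.io.HadoopMapReduceCommitProtocol

        // JVM-wide counters, so a local-mode test can still read them even
        // though the committer instance itself is serialized into the tasks.
        object RecordingCommitProtocol {
          val relativeTempFiles = new AtomicInteger(0)
          val absoluteTempFiles = new AtomicInteger(0)
        }

        class RecordingCommitProtocol(jobId: String, path: String)
          extends HadoopMapReduceCommitProtocol(jobId, path) {

          // Default partitioning (point 2) should only ever come through here.
          override def newTaskTempFile(
              taskContext: TaskAttemptContext, dir: Option[String], ext: String): String = {
            RecordingCommitProtocol.relativeTempFiles.incrementAndGet()
            super.newTaskTempFile(taskContext, dir, ext)
          }

          // Custom partition locations (point 3) should only ever come through here.
          override def newTaskTempFileAbsPath(
              taskContext: TaskAttemptContext, absoluteDir: String, ext: String): String = {
            RecordingCommitProtocol.absoluteTempFiles.incrementAndGet()
            super.newTaskTempFileAbsPath(taskContext, absoluteDir, ext)
          }
        }

    A committer like this can be plugged into a local-mode test via `spark.sql.sources.commitProtocolClass` (if I recall the config key correctly), and asserting on the two counters after the write distinguishes the branches.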
    
    Is there anything else I can help with to make sure nothing else was affected?
