Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12856#discussion_r62279016
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/PartitioningAwareFileCatalog.scala ---
    @@ -61,7 +61,31 @@ abstract class PartitioningAwareFileCatalog(
         }
       }
     
    -  override def allFiles(): Seq[FileStatus] = leafFiles.values.toSeq
    +  override def allFiles(): Seq[FileStatus] = {
    +    if (partitionSpec().partitionColumns.isEmpty) {
    +      // For each of the input paths, get the list of files inside them
    +      paths.flatMap { path =>
    +        // Make the path qualified (consistent with listLeafFiles and listLeafFilesInParallel).
    +        val fs = path.getFileSystem(hadoopConf)
    +        val qualifiedPath = path.makeQualified(fs.getUri, fs.getWorkingDirectory)
    +
    +        // There are three cases possible with each path
    +        // 1. The path is a directory and has children files in it. Then it must be present in
    +        //    leafDirToChildrenFiles as those children files will have been found as leaf files.
    +        //    Find its children files from leafDirToChildrenFiles and include them.
    +        // 2. The path is a file, then it will be present in leafFiles. Include this path.
    +        // 3. The path is a directory, but has no children files. Do not include this path.
    +
    +        leafDirToChildrenFiles.get(qualifiedPath)
    +          .orElse {
    +            leafFiles.get(path).map(Array(_))
    --- End diff --
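
    For readers following the thread, here is a minimal, self-contained sketch
    of the three-case lookup in the hunk above, using plain Scala maps in place
    of the catalog's leafDirToChildrenFiles and leafFiles indexes. The names,
    types, and sample paths are illustrative stand-ins, not Spark's actual API:

        object AllFilesSketch {
          // Stand-in for leafDirToChildrenFiles: directory path -> files directly under it.
          val leafDirToChildrenFiles: Map[String, Array[String]] =
            Map("/data/dir1" -> Array("/data/dir1/a.txt", "/data/dir1/b.txt"))

          // Stand-in for leafFiles: file path -> the file itself.
          val leafFiles: Map[String, String] =
            Map("/data/file1.txt" -> "/data/file1.txt")

          def allFiles(paths: Seq[String]): Seq[String] = paths.flatMap { path =>
            leafDirToChildrenFiles.get(path)              // case 1: directory with children
              .orElse(leafFiles.get(path).map(Array(_)))  // case 2: the path is itself a file
              .getOrElse(Array.empty[String])             // case 3: empty directory, contribute nothing
          }

          def main(args: Array[String]): Unit = {
            // "/data/empty" matches neither index, so it is silently skipped (case 3).
            println(allFiles(Seq("/data/dir1", "/data/file1.txt", "/data/empty")))
            // -> List(/data/dir1/a.txt, /data/dir1/b.txt, /data/file1.txt)
          }
        }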
    
    Added a new FileCatalogSuite
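
    As an illustration only (the real suite exercises Spark's catalog classes
    directly, which this sketch does not), a hypothetical ScalaTest check for
    the three cases, written against the AllFilesSketch stand-in above, might
    look like:

        import org.scalatest.funsuite.AnyFunSuite

        class AllFilesSketchSuite extends AnyFunSuite {
          test("allFiles lists dir children, includes plain files, skips empty dirs") {
            val result = AllFilesSketch.allFiles(
              Seq("/data/dir1", "/data/file1.txt", "/data/empty"))
            assert(result === Seq(
              "/data/dir1/a.txt", "/data/dir1/b.txt", "/data/file1.txt"))
          }
        }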

