olaky commented on code in PR #39408:
URL: https://github.com/apache/spark/pull/39408#discussion_r1066672989


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/PartitioningAwareFileIndex.scala:
##########
@@ -76,28 +76,35 @@ abstract class PartitioningAwareFileIndex(
     // be applied to files.
     val fileMetadataFilterOpt = dataFilters.filter { f =>
       f.references.nonEmpty && f.references.forall {
-        case FileSourceMetadataAttribute(_) => true
+        case FileSourceConstantMetadataAttribute(_) => true
         case _ => false
       }
     }.reduceOption(expressions.And)
 
-    // - create a bound references for filters: put the metadata struct at 0 position for each file
-    // - retrieve the final metadata struct (could be pruned) from filters
+    // - Retrieve all required metadata attributes and put them into a sequence
+    // - Bind all file constant metadata attribute references to their respective index
+    val requiredMetadataColumnNames: mutable.Buffer[String] = mutable.Buffer.empty
     val boundedFilterMetadataStructOpt = fileMetadataFilterOpt.map { fileMetadataFilter =>
-      val metadataStruct = fileMetadataFilter.references.head.dataType
-      val boundedFilter = Predicate.createInterpreted(fileMetadataFilter.transform {
-        case _: AttributeReference => BoundReference(0, metadataStruct, nullable = true)
+      val metadataStruct = fileMetadataFilter.references.head

Review Comment:
   Ouch, this was indeed a wrong reference. The data type from the attribute reference should be used.
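   
   For illustration, a rough sketch of what the binding could look like once each constant metadata attribute is bound to its own index and its own data type. Names such as `fileMetadataFilterOpt` and `requiredMetadataColumnNames` are taken from the diff above; the exact control flow here is an assumption, not the final implementation:
   
   ```scala
   import scala.collection.mutable
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, BoundReference, Predicate}
   
   // Collect metadata column names in the order they are first referenced, and
   // bind each attribute reference to its index in that sequence.
   val requiredMetadataColumnNames: mutable.Buffer[String] = mutable.Buffer.empty
   val boundedFilterOpt = fileMetadataFilterOpt.map { fileMetadataFilter =>
     Predicate.createInterpreted(fileMetadataFilter.transform {
       case attr: AttributeReference =>
         val existingIndex = requiredMetadataColumnNames.indexOf(attr.name)
         val index = if (existingIndex >= 0) {
           existingIndex
         } else {
           requiredMetadataColumnNames += attr.name
           requiredMetadataColumnNames.length - 1
         }
         // Take the data type from the attribute itself rather than from
         // whichever reference happens to come first in the filter.
         BoundReference(index, attr.dataType, nullable = true)
     })
   }
   ```
   
   Binding via `attr.dataType` avoids the bug flagged above, where the type of `fileMetadataFilter.references.head` was applied to every bound reference.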


