Yaohua628 commented on a change in pull request #34575:
URL: https://github.com/apache/spark/pull/34575#discussion_r753875099



##########
File path: 
sql/core/src/main/scala/org/apache/spark/sql/execution/DataSourceScanExec.scala
##########
@@ -194,10 +195,22 @@ case class FileSourceScanExec(
     disableBucketedScan: Boolean = false)
   extends DataSourceScanExec {
 
+  lazy val outputMetadataStruct: Option[AttributeReference] =
+    output.collectFirst { case MetadataAttribute(attr) => attr }

Review comment:
      "4 flat columns" was the original design. Per review suggestions, it was 
changed from 4 flat columns to 4 fields under a single struct:
   - wrapping them in `_metadata` reduces the chance of name conflicts with 
user columns
   - all available metadata can be queried at once by selecting `_metadata`
   - easier to maintain and extend (?)
   
   There is one clear disadvantage to the struct-type approach: if users only 
select `_metadata.file_size`, we still need to populate all fields (which 
hurts performance). But that could be improved/fixed in follow-up PRs.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
