rdblue commented on a change in pull request #25651: [SPARK-28948][SQL] Support 
passing all Table metadata in TableProvider
URL: https://github.com/apache/spark/pull/25651#discussion_r324388449
 
 

 ##########
 File path: 
external/avro/src/main/scala/org/apache/spark/sql/v2/avro/AvroDataSourceV2.scala
 ##########
 @@ -35,7 +36,10 @@ class AvroDataSourceV2 extends FileDataSourceV2 {
     AvroTable(tableName, sparkSession, options, paths, None, 
fallbackFileFormat)
   }
 
-  override def getTable(options: CaseInsensitiveStringMap, schema: 
StructType): Table = {
+  override def getTable(
+      options: CaseInsensitiveStringMap,
+      schema: StructType,
+      partitions: Array[Transform]): Table = {
 
 Review comment:
   This interface should pass the table properties. There is no need to pass 
read or write options at this point, unless they can't be separated from table 
properties (as in the `DataFrameReader` case). The read options and write 
options should be passed to the logical plan -- this is added in #25681: 
https://github.com/apache/spark/pull/25681/files#diff-94fbd986b04087223f53697d4b6cab24R275
   
   I propose passing table properties as a string map (`java.util.Map[String, String]`) through this 
interface. When the properties come from the metastore, this is fine. When 
they come from `DataFrameReader.option` (or the write equivalent), 
the original case-sensitive map should be passed. The read options 
should additionally be passed to the correct plan node so that the physical 
plan can push them into the scan or the write.
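   To illustrate, here is a minimal, self-contained sketch of the proposed shape. The type names below (`TableProvider`, `Table`, `StructType`, `Transform`, `DemoProvider`) are simplified stand-ins for Spark's DSv2 classes, not the actual API; the point is that table properties travel as a plain `java.util` map so original casing is preserved, while read/write options are kept out of this call:

```scala
import java.util.{Map => JMap}

// Stand-ins for Spark's DSv2 types (hypothetical, for illustration only).
case class StructType(fieldNames: Seq[String])
trait Transform
trait Table { def name: String }

trait TableProvider {
  // Table properties arrive as a plain java.util map, preserving the
  // original casing whether they came from the metastore or from
  // DataFrameReader.option. Read/write options are NOT passed here;
  // they would go to the logical plan node instead, so the physical
  // plan can push them into the scan or the write.
  def getTable(
      schema: StructType,
      partitions: Array[Transform],
      properties: JMap[String, String]): Table
}

// A toy provider showing a property being read with its original casing.
object DemoProvider extends TableProvider {
  override def getTable(
      schema: StructType,
      partitions: Array[Transform],
      properties: JMap[String, String]): Table =
    new Table {
      val name: String = properties.getOrDefault("name", "unnamed")
    }
}
```

A caller holding metastore properties would simply hand the map through unchanged, e.g. `DemoProvider.getTable(schema, Array.empty[Transform], props)`; no case-insensitive wrapping happens at this layer.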

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
