cloud-fan commented on a change in pull request #29535:
URL: https://github.com/apache/spark/pull/29535#discussion_r476194909



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala
##########
@@ -822,6 +821,8 @@ class DataFrameReader private[sql](sparkSession: SparkSession) extends Logging {
    */
   def table(tableName: String): DataFrame = {
     assertNoSpecifiedSchema("table")
+    for ((k, v) <- this.extraOptions)
+      sparkSession.conf.set(k, v)

Review comment:
       We need to clearly define the behavior here.
   
   I think the options don't make sense if we read a temp view, a view, or a v1 table in the session catalog. V1 tables are created with `CREATE TABLE ... USING ... OPTIONS ...`, so the options are already stored with the table definition, and it's confusing to overwrite them per scan.
   
   We do need to keep the scan options when reading v2 tables, but I don't think the session conf is the right channel: setting scan options there leaks them to the whole session. I think we should put the options in `UnresolvedRelation` and apply them when we resolve `UnresolvedRelation` to a v2 relation.
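   To make the proposal concrete, here is a toy sketch of the idea. These are hypothetical stand-in case classes, not the real Spark `UnresolvedRelation` or analyzer rules; the point is only that per-scan options travel on the unresolved plan node and are applied solely when it resolves to a v2 relation, and are dropped for v1 tables and views:

```scala
// Toy plan nodes (hypothetical, modeled loosely on Catalyst's shape).
sealed trait Plan
case class UnresolvedRelation(name: String, options: Map[String, String]) extends Plan
case class V1Relation(name: String) extends Plan
case class V2Relation(name: String, options: Map[String, String]) extends Plan

object Resolver {
  // Toy catalog rule: names under the "v2." namespace resolve to v2 tables.
  def resolve(plan: Plan): Plan = plan match {
    case UnresolvedRelation(name, opts) if name.startsWith("v2.") =>
      V2Relation(name, opts)   // scan options survive only for v2 relations
    case UnresolvedRelation(name, _) =>
      V1Relation(name)         // options ignored for v1 tables / views
    case other => other
  }
}
```

   The key design point is that the options never touch any session-wide state: they live on the plan node itself, so two concurrent scans of the same table with different options cannot interfere.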




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


