ueshin commented on code in PR #40498:
URL: https://github.com/apache/spark/pull/40498#discussion_r1142758041


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:
##########
@@ -183,7 +183,7 @@ class DataFrameReader private[sql] (sparkSession: SparkSession) extends Logging
       dataSourceBuilder.setFormat(source)
       userSpecifiedSchema.foreach(schema => dataSourceBuilder.setSchema(schema.toDDL))
       extraOptions.foreach { case (k, v) =>
-        dataSourceBuilder.putOptions(k, v)
+        builder.getReadBuilder.putOptions(k, v)

Review Comment:
   Should the `table` method be modified as well?
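
For context, here is a minimal sketch of the kind of change this comment points at, assuming the Connect client's `table` method builds its `Read` relation through the same `newDataFrame` / `Relation.Builder` pattern as `load` above, and that options sit on the `Read` message as this diff proposes. The method shape and builder calls are illustrative, not the actual PR change:

```scala
// Illustrative sketch only: mirror the data-source path by also copying
// extraOptions onto the Read builder for named-table reads.
def table(tableName: String): DataFrame = {
  sparkSession.newDataFrame { builder =>
    builder.getReadBuilder.getNamedTableBuilder
      .setUnparsedIdentifier(tableName)
    extraOptions.foreach { case (k, v) =>
      // assumes `options` is a field on the Read message, as in the proto
      // change further down in this review
      builder.getReadBuilder.putOptions(k, v)
    }
  }
}
```

If the map ends up on `NamedTable` instead (see the next comment), the same loop would target `builder.getReadBuilder.getNamedTableBuilder.putOptions(k, v)`.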



##########
connector/connect/common/src/main/protobuf/spark/connect/relations.proto:
##########
@@ -148,6 +143,13 @@ message Read {
     // This is only supported by the JDBC data source.
     repeated string predicates = 5;
   }
+
+  // Options for data sources and named tables.
+  //
+  // When used for data sources, the context of this map varies based on the
+  // data source format. These options could be empty for a valid data source format.
+  // The map key is case insensitive.
+  map<string, string> options = 3;

Review Comment:
   I guess just adding the field to `NamedTable` is simpler and doesn't break anything?
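
For reference, a sketch of the alternative described here: leave `DataSource.options` where it is and add a separate map to `NamedTable`. The field number and comment wording are assumptions, not the actual change:

```proto
message NamedTable {
  // (Required) Unparsed identifier for the table.
  string unparsed_identifier = 1;

  // Options for the named table. The map key is case insensitive.
  map<string, string> options = 2;
}
```

Adding a new field to an existing message is wire-compatible, which is presumably what "doesn't break anything" refers to here.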



##########
python/pyspark/sql/connect/plan.py:
##########
@@ -293,7 +293,7 @@ def plan(self, session: "SparkConnectClient") -> proto.Relation:
             plan.read.data_source.schema = self._schema
         if self._options is not None and len(self._options) > 0:
             for k, v in self._options.items():
-                plan.read.data_source.options[k] = v
+                plan.read.options[k] = v

Review Comment:
   Should `Read` be modified as well?
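
For context, a sketch of what that could look like on the named-table `Read` plan in the same file, assuming it is handed an `options` dict the same way `DataSource` is, and following the field placement this diff uses (`options` on the `read` message). Attribute names are assumptions:

```python
# Illustrative sketch only: propagate options for named-table reads the same
# way the DataSource plan above does.
def plan(self, session: "SparkConnectClient") -> proto.Relation:
    plan = proto.Relation()
    plan.read.named_table.unparsed_identifier = self.table_name
    if self._options:  # assumes Read.__init__ now accepts and stores options
        for k, v in self._options.items():
            plan.read.options[k] = v
    return plan
```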


