Yikf commented on code in PR #36941:
URL: https://github.com/apache/spark/pull/36941#discussion_r904480670


##########
sql/core/src/test/scala/org/apache/spark/sql/DataFrameWriterV2Suite.scala:
##########
@@ -531,6 +534,23 @@ class DataFrameWriterV2Suite extends QueryTest with SharedSparkSession with Befo
     assert(table.properties === (Map("provider" -> "foo") ++ defaultOwnership).asJava)
   }
 
+  test("SPARK-39543 writeOption should be passed to storage properties when fallback to v1") {
+    val provider = classOf[InMemoryV1Provider].getName
+
+    withSQLConf((SQLConf.USE_V1_SOURCE_LIST.key, provider)) {

Review Comment:
   Yeah, other tests trigger the v1 fallback without setting `USE_V1_SOURCE_LIST`. AFAIK:
   - Those tests aim to exercise the read/write path: `InMemoryV1Provider` is actually a v2 format, and the v1 fallback is triggered at the `newScanBuilder` / `newWriteBuilder` layer.
   - The test in this PR needs to fall back to v1 when the table is created, so we have to set `USE_V1_SOURCE_LIST` (see: [isV2Provider](https://github.com/apache/spark/blob/3d68ad8003a16229bd79f86cb31f618167814a7f/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala#L604))
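   For context, the set-run-restore pattern that `withSQLConf` relies on can be sketched like this. This is a hypothetical, Spark-free miniature (the `Conf` object and `withConf` helper are stand-ins invented here, not Spark's actual `SQLConf`/`SQLTestUtils` code), just to show why conf changes made by the test do not leak into other tests:

```scala
import scala.collection.mutable

// Hypothetical stand-in for Spark's SQLConf: a mutable key/value store.
object Conf {
  private val settings = mutable.Map.empty[String, String]
  def get(key: String): Option[String] = settings.get(key)
  def set(key: String, value: String): Unit = settings(key) = value
  def unset(key: String): Unit = settings.remove(key)
}

// Minimal sketch of the withSQLConf idiom: apply the pairs, run the body,
// then restore the previous values even if the body throws.
def withConf[T](pairs: (String, String)*)(body: => T): T = {
  val previous = pairs.map { case (k, _) => k -> Conf.get(k) }
  pairs.foreach { case (k, v) => Conf.set(k, v) }
  try body
  finally previous.foreach {
    case (k, Some(v)) => Conf.set(k, v)
    case (k, None)    => Conf.unset(k)
  }
}
```

   In the PR's test, the same idiom scopes `USE_V1_SOURCE_LIST` to the body, so the forced v1 fallback at table creation cannot affect the rest of the suite.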



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

