dongjoon-hyun commented on code in PR #40358:
URL: https://github.com/apache/spark/pull/40358#discussion_r1135946160
##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -175,6 +176,26 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper {
}
}
+ test("write without table or path") {
+ // Writing with the noop format should succeed without error
+ spark.range(10).write.format("noop").mode("append").save()
+ }
+
+ test("write jdbc") {
Review Comment:
To @zhenlineo , I reproduced the error locally on `branch-3.4` with the
command below, while the same command works on `master`.
```
$ build/sbt -Phive -Phadoop-3 assembly/package "protobuf/test" "connect-common/test" "connect/test" "connect-client-jvm/test"
...
[info] ClientE2ETestSuite:
[info] - spark result schema (319 milliseconds)
[info] - spark result array (350 milliseconds)
[info] - eager execution of sql (18 seconds, 3 milliseconds)
[info] - simple dataset (1 second, 194 milliseconds)
[info] - SPARK-42665: Ignore simple udf test until the udf is fully
implemented. !!! IGNORED !!!
[info] - read and write (1 second, 32 milliseconds)
[info] - read path collision (32 milliseconds)
[info] - write table (5 seconds, 349 milliseconds)
[info] - write without table or path (170 milliseconds)
[info] - write jdbc *** FAILED *** (325 milliseconds)
[info] io.grpc.StatusRuntimeException: INTERNAL: No suitable driver
```
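For context, the gRPC `INTERNAL: No suitable driver` status wraps a server-side `SQLException` from `java.sql.DriverManager`: when no JDBC driver on the server's classpath accepts the connection URL, `getConnection` fails with exactly this message. A minimal standalone sketch of that underlying JDBC behavior (the `jdbc:h2:mem:test` URL is just an illustrative example; it fails the same way for any URL whose driver is absent):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoSuitableDriverDemo {
    public static void main(String[] args) {
        try {
            // No H2 (or other matching) JDBC driver is on the classpath,
            // so DriverManager cannot resolve a driver for this URL.
            DriverManager.getConnection("jdbc:h2:mem:test");
        } catch (SQLException e) {
            // Prints: No suitable driver found for jdbc:h2:mem:test
            System.out.println(e.getMessage());
        }
    }
}
```

This suggests the test failure is an environment/classpath issue on `branch-3.4` (the driver jar not reaching the Connect server's classpath) rather than a logic bug in the test itself.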
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
For additional commands, e-mail: [email protected]