dongjoon-hyun commented on code in PR #40358:
URL: https://github.com/apache/spark/pull/40358#discussion_r1136030130
##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -175,6 +176,26 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper {
}
}
+ test("write without table or path") {
+ // Writing with the noop format should not raise an error
+ spark.range(10).write.format("noop").mode("append").save()
+ }
+
+ test("write jdbc") {
Review Comment:
On my side,
https://github.com/apache/spark/pull/40358#discussion_r1135938056 also failed:
```
$ build/sbt -Phive -Pconnect package
$ build/sbt "connect-client-jvm/test"
...
[info] ClientE2ETestSuite:
[info] - spark result schema (290 milliseconds)
[info] - spark result array (290 milliseconds)
[info] - eager execution of sql (15 seconds, 819 milliseconds)
[info] - simple dataset (1 second, 28 milliseconds)
[info] - SPARK-42665: Ignore simple udf test until the udf is fully implemented. !!! IGNORED !!!
[info] - read and write (929 milliseconds)
[info] - read path collision (31 milliseconds)
[info] - write table (4 seconds, 540 milliseconds)
[info] - write without table or path (348 milliseconds)
[info] - write jdbc *** FAILED *** (365 milliseconds)
[info] io.grpc.StatusRuntimeException: INTERNAL: No suitable driver
...
```
In this case, the usual suspect is the Java version. Both the GitHub Actions CI and my local environment use Java 8.
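The `No suitable driver` error means `java.sql.DriverManager` could not find a JDBC driver for the test's connection URL on the classpath. As a hedged sketch only (the driver, URL, and table name below are illustrative, not taken from this PR), a JDBC round-trip against an in-memory H2 database would look like:

```scala
// Sketch: assumes the H2 driver jar (com.h2database:h2) is on the classpath
// of both the connect client and the server. Without it, DriverManager's
// lookup fails with "No suitable driver".
val url = "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1"

spark.range(10).write
  .format("jdbc")
  .option("url", url)
  .option("dbtable", "test_table")
  // Passing the driver class explicitly sidesteps the DriverManager lookup.
  .option("driver", "org.h2.Driver")
  .save()

val df = spark.read
  .format("jdbc")
  .option("url", url)
  .option("dbtable", "test_table")
  .option("driver", "org.h2.Driver")
  .load()
assert(df.count() == 10)
```

If the driver jar is present but the error persists, the server side of the connect session may have a different classpath than the client, which is worth checking before blaming the Java version.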
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]