Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21173#discussion_r185640810

    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCWriteSuite.scala ---
    @@ -515,4 +515,15 @@ class JDBCWriteSuite extends SharedSQLContext with BeforeAndAfter {
         }.getMessage
         assert(e.contains("NULL not allowed for column \"NAME\""))
       }
    +
    +  test("SPARK-23856 Spark jdbc setQueryTimeout option") {
    +    val errMsg = intercept[SparkException] {
    +      spark.range(10000000L).selectExpr("id AS k", "id AS v").coalesce(1).write
    +        .mode(SaveMode.Overwrite)
    +        .option("queryTimeout", 1)
    +        .option("batchsize", Int.MaxValue)
    +        .jdbc(url1, "TEST.TIMEOUTTEST", properties)
    +    }.getMessage
    +    assert(errMsg.contains("Statement was canceled or the session timed out"))
    +  }
    --- End diff --

    Yeah, that depends on how the JDBC drivers implement the API `setQueryTimeout`. We need to document it in `sql-programming-guide.md`; otherwise, users might open JIRAs to complain about it.
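    For reference, the snippet below is a minimal sketch of what a query timeout roughly maps to at the plain `java.sql` level, not the actual Spark write path. It assumes an H2 in-memory URL with the H2 driver on the classpath (both hypothetical here); whether the one-second timeout actually cancels the statement is entirely up to the driver's `setQueryTimeout` support, which is the driver-dependence mentioned above.

    import java.sql.DriverManager

    object QueryTimeoutSketch {
      def main(args: Array[String]): Unit = {
        // Hypothetical in-memory H2 database; substitute the URL of the target database.
        val conn = DriverManager.getConnection("jdbc:h2:mem:timeouttest")
        try {
          val ddl = conn.createStatement()
          ddl.executeUpdate("CREATE TABLE TIMEOUTTEST (k BIGINT, v BIGINT)")
          ddl.close()

          val stmt = conn.prepareStatement("INSERT INTO TIMEOUTTEST VALUES (?, ?)")
          // Ask the driver to cancel the statement if it runs longer than 1 second.
          // Drivers that ignore setQueryTimeout simply let the statement run to
          // completion, which is why the observed behavior (and the test above)
          // varies by driver.
          stmt.setQueryTimeout(1)
          stmt.setLong(1, 0L)
          stmt.setLong(2, 0L)
          stmt.executeUpdate()
          stmt.close()
        } finally {
          conn.close()
        }
      }
    }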