urosstan-db commented on code in PR #52127:
URL: https://github.com/apache/spark/pull/52127#discussion_r2309650509


##########
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/SharedJDBCIntegrationTests.scala:
##########
@@ -17,18 +17,49 @@
 
 package org.apache.spark.sql.jdbc
 
+import java.sql.Connection
+
 import org.apache.spark.SparkException
-import org.apache.spark.sql.QueryTest
+import org.apache.spark.sql.{QueryTest, Row}
 
 trait SharedJDBCIntegrationTests extends QueryTest {
   protected def jdbcUrl: String
 
+  /**
+   * Creates a table with the same name that can be used to test common functionality
+   * across the shared JDBC integration suites.
+   * @param conn the JDBC connection on which to create the table
+   */
+  def createSharedTable(conn: Connection): Unit = {

Review Comment:
   @alekjarmov Can you structure it in another way? We have test cases where the JDBC connector does not use the Docker integration suites but requires a real instance, so at least logically we can split them now. I suggest the following structure:
   ```
   abstract class SharedJDBCIntegrationSuite extends DockerJDBCIntegrationSuite
   
   class PostgresSuite extends SharedJDBCIntegrationSuite
   ```
   
   That way you avoid the double extension, and at least the tests are not in `DockerJDBCIntegrationSuite`, so we can decouple them from Docker more easily later.
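   The layering suggested above can be sketched as follows. This is a minimal, self-contained illustration of the pattern only: the stub traits, the `testQueryPushdown` method, and the Postgres URL are hypothetical stand-ins, not Spark's actual test classes or API.

   ```scala
   // Stand-in for Spark's QueryTest base (assumption for this sketch).
   trait QueryTest

   // Stand-in for the Docker-backed base suite; only the Docker layer
   // knows how to produce a jdbcUrl.
   abstract class DockerJDBCIntegrationSuite extends QueryTest {
     def jdbcUrl: String
   }

   // Shared JDBC tests are declared once, one level above the concrete suites,
   // so they are not defined inside DockerJDBCIntegrationSuite itself.
   abstract class SharedJDBCIntegrationSuite extends DockerJDBCIntegrationSuite {
     // Hypothetical shared test; real suites would register ScalaTest cases here.
     def testQueryPushdown(): String = s"running shared tests against $jdbcUrl"
   }

   // Each dialect suite only supplies its connection details.
   class PostgresSuite extends SharedJDBCIntegrationSuite {
     override def jdbcUrl: String = "jdbc:postgresql://localhost:5432/test"
   }
   ```

   Because the shared tests depend only on the abstract `jdbcUrl`, replacing `DockerJDBCIntegrationSuite` in the extends clause later (for suites that hit a real, non-Docker instance) would not require touching the shared test bodies.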



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

