viirya commented on a change in pull request #34164:
URL: https://github.com/apache/spark/pull/34164#discussion_r725151846



##########
File path: 
external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/V2JDBCTest.scala
##########
@@ -180,5 +180,14 @@ private[v2] trait V2JDBCTest extends SharedSparkSession with DockerIntegrationFu
       testCreateTableWithProperty(s"$catalogName.new_table")
     }
   }
+
+  def testIndex(tbl: String): Unit = {}
+
+  test("SPARK-36913: Test INDEX") {
+    withTable(s"$catalogName.new_table") {
+      sql(s"CREATE TABLE $catalogName.new_table(col1 INT, col2 INT, col3 INT, col4 INT, col5 INT)")
+      testIndex(s"$catalogName.new_table")

Review comment:
       I am not sure you want to put all the index test code (create, list, delete...) together in one test method. It sounds better to have individual test methods for the different APIs.
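
       For example, a minimal sketch of the suggested split (the per-API hook names below, such as `testCreateIndex` and `testDropIndex`, are illustrative and not taken from this PR):

       ```scala
       // Sketch only: assumes the trait keeps empty per-API hooks that
       // concrete suites (e.g. a MySQL integration suite) override.
       def testCreateIndex(tbl: String): Unit = {}
       def testDropIndex(tbl: String): Unit = {}

       test("SPARK-36913: CREATE INDEX") {
         withTable(s"$catalogName.new_table") {
           sql(s"CREATE TABLE $catalogName.new_table(col1 INT, col2 INT, col3 INT, col4 INT, col5 INT)")
           testCreateIndex(s"$catalogName.new_table")
         }
       }

       test("SPARK-36913: DROP INDEX") {
         withTable(s"$catalogName.new_table") {
           sql(s"CREATE TABLE $catalogName.new_table(col1 INT, col2 INT, col3 INT, col4 INT, col5 INT)")
           testDropIndex(s"$catalogName.new_table")
         }
       }
       ```

       This way a failure in one index API does not mask the others.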




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
