viirya commented on a change in pull request #34164:
URL: https://github.com/apache/spark/pull/34164#discussion_r720782370
##########
File path: external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/MySQLIntegrationSuite.scala
##########
@@ -115,4 +119,25 @@ class MySQLIntegrationSuite extends DockerJDBCIntegrationSuite with V2JDBCTest {
val expectedSchema = new StructType().add("ID", IntegerType, true, defaultMetadata)
assert(t.schema === expectedSchema)
}
+
+ override def testIndex(tbl: String): Unit = {
+ val loaded = Catalogs.load("mysql", conf)
+ val jdbcTable = loaded.asInstanceOf[TableCatalog]
+ .loadTable(Identifier.of(Array.empty[String], "new_table"))
+ assert(jdbcTable.asInstanceOf[SupportsIndex].indexExists("i1") == false)
Review comment:
Actually, all occurrences of `jdbcTable` below are used as
`SupportsIndex`, so you can do:
```scala
val jdbcTable = loaded.asInstanceOf[TableCatalog]
.loadTable(Identifier.of(Array.empty[String], "new_table"))
.asInstanceOf[SupportsIndex]
```
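With the cast folded into the assignment, the assertions below can call the `SupportsIndex` API directly; a minimal sketch based on the first check in this hunk:
```scala
// jdbcTable is already typed as SupportsIndex, so no per-call cast is needed.
assert(jdbcTable.indexExists("i1") == false)
```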
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
For additional commands, e-mail: [email protected]