zhenlineo commented on code in PR #40762:
URL: https://github.com/apache/spark/pull/40762#discussion_r1171740416
##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -102,6 +85,56 @@ object SparkConnectServerUtils {
process
}
+  /**
+   * As one shared Spark will be started for all E2E tests, for tests that need some special
+   * configs, we add them here.
+   */
+  private def testConfigs: Seq[String] = {
+    // Use InMemoryTableCatalog for V2 writer tests
+    val writerV2Configs = {
+      val catalystTestJar = findJar( // To find InMemoryTableCatalog for V2 writer tests
+        "sql/catalyst",
+        "spark-catalyst",
+        "spark-catalyst",
+        test = true).getCanonicalPath
+      Seq(
+        "--jars",
+        catalystTestJar,
+        "--conf",
+        "spark.sql.catalog.testcat=org.apache.spark.sql.connector.catalog.InMemoryTableCatalog")
+    }
+
+    // Run tests using hive
+    val hiveTestConfigs = {
+      val catalogImplementation = if (IntegrationTestUtils.isSparkHiveJarAvailable) {
+        "hive"
+      } else {
+        // scalastyle:off println
+        println(
+          "Will start Spark Connect server with `spark.sql.catalogImplementation=in-memory`, " +
+            "some tests that rely on Hive will be ignored. If you don't want to skip them:\n" +
+            "1. Test with maven: run `build/mvn install -DskipTests -Phive` before testing\n" +
+            "2. Test with sbt: run test with `-Phive` profile")
+        // scalastyle:on println
+        "in-memory"
+      }
+      Seq("--conf", s"spark.sql.catalogImplementation=$catalogImplementation")
+    }
+
+    // For UDF maven E2E tests, the server needs the client code to find the UDFs defined in tests.
+    val udfTestConfigs = tryFindJar(
Review Comment:
I tried the following code:
```
class UserDefinedFunctionE2ETestSuite extends RemoteSparkSession {
  override def beforeAll(): Unit = {
    super.beforeAll()
    IntegrationTestUtils.recursiveListFiles(new File(
      sparkHome,
      "connector/connect/client/jvm/target/scala-2.12/test-classes/org/apache/spark/sql"))
      .filter(f => f.getName.startsWith("UserDefinedFunctionE2ETestSuite"))
      .foreach(f => spark.client.addArtifact(f.getCanonicalPath))
  }
}
```
The classes are not picked up by the server with this code.
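For comparison, the jar-based route that the `udfTestConfigs` hunk above takes can be sketched like this. This is a minimal sketch only, not the PR's exact code: the module path and jar names passed to `tryFindJar` are placeholders, and I'm assuming `tryFindJar` mirrors the `findJar` helper shown above but returns an `Option[File]` so the configs can degrade to empty when the test jar hasn't been built:
```
// Sketch: locate the whole client test jar and ship it to the server via
// --jars, so the server can resolve UDF classes defined in client test code.
// Module path and jar names below are placeholders, not the PR's actual args.
private def udfTestConfigsSketch: Seq[String] = {
  IntegrationTestUtils
    .tryFindJar(
      "connector/connect/client/jvm", // placeholder module path
      "spark-connect-client-jvm",     // placeholder sbt jar name
      "spark-connect-client-jvm",     // placeholder maven jar name
      test = true)
    .map(jar => Seq("--jars", jar.getCanonicalPath))
    .getOrElse(Seq.empty)
}
```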