HyukjinKwon commented on a change in pull request #24752: [SPARK-27893][SQL][PYTHON] Create an integrated test base for Python, Scalar Pandas, Scala UDF by sql files
URL: https://github.com/apache/spark/pull/24752#discussion_r289624749
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/SQLQueryTestSuite.scala
##########
@@ -442,3 +519,172 @@ class SQLQueryTestSuite extends QueryTest with SharedSQLContext {
}
}
}
+
+
+/**
+ * This object integrates various UDF test cases so that Scala UDF, Python UDF and
+ * Scalar Pandas UDF can be tested in SBT & Maven tests.
+ *
+ * Each available UDF takes one column as input, casts it to string, and returns a
+ * string-typed column as output.
+ *
+ * To register Scala UDF in SQL:
+ * {{{
+ * IntegratedUDFTestUtils.registerTestUDF(new TestScalaUDF, spark)
+ * }}}
+ *
+ * To register Python UDF in SQL:
+ * {{{
+ * IntegratedUDFTestUtils.registerTestUDF(new TestPythonUDF, spark)
+ * }}}
+ *
+ * To register Scalar Pandas UDF in SQL:
+ * {{{
+ * IntegratedUDFTestUtils.registerTestUDF(new TestScalarPandasUDF, spark)
+ * }}}
+ *
+ * To use it in Scala API and SQL:
+ * {{{
+ * sql("SELECT udf(1)")
+ * spark.range(1).select(expr("udf(1)"))
+ * }}}
+ *
+ * They are currently registered under the name 'udf' in the function registry.
+ */
+object IntegratedUDFTestUtils extends SQLHelper with Logging {
+ import scala.sys.process._
+
+ lazy val pythonExec: String = {
+ val pythonExec = sys.env.getOrElse("PYSPARK_PYTHON", "python3.6")
Review comment:
`python3.6` is being used in Jenkins ... I think it's better to match it with the PySpark test side.
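
For illustration, a minimal sketch of what matching the PySpark test side could look like; the object name and the exact env-var fallback chain (`PYSPARK_DRIVER_PYTHON` before `PYSPARK_PYTHON`) are assumptions for this example, not the change proposed in the PR:

```scala
// Hedged sketch only: resolve the Python executable from the environment
// variables the PySpark scripts respect, before falling back to a
// hard-coded default. The precedence below is an assumption, not the
// committed implementation.
object PythonExecForTests {
  lazy val pythonExec: String =
    sys.env.get("PYSPARK_DRIVER_PYTHON")
      .orElse(sys.env.get("PYSPARK_PYTHON"))
      .getOrElse("python3.6")

  def main(args: Array[String]): Unit =
    println(s"Using Python executable: $pythonExec")
}
```

Either way, the point is to keep the default interpreter consistent with whatever the PySpark test scripts use rather than hard-coding a Jenkins-specific version.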