cloud-fan commented on a change in pull request #27556: [SPARK-25990][SQL] ScriptTransformation should handle different data types correctly
URL: https://github.com/apache/spark/pull/27556#discussion_r378937849
 
 

 ##########
 File path: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/ScriptTransformationSuite.scala
 ##########
 @@ -186,6 +191,43 @@ class ScriptTransformationSuite extends SparkPlanTest with TestHiveSingleton wit
       rowsDf.select("name").collect())
     assert(uncaughtExceptionHandler.exception.isEmpty)
   }
+
+  test("SPARK-25990: TRANSFORM should handle different data types correctly") {
+    assume(TestUtils.testCommandAvailable("python"))
+    val scriptFilePath = getTestResourcePath("test_script.py")
+
+    withTempView("v") {
+      val df = Seq(
+        (1, "1", 1.0, BigDecimal(1.0), new Timestamp(1)),
+        (2, "2", 2.0, BigDecimal(2.0), new Timestamp(2)),
+        (3, "3", 3.0, BigDecimal(3.0), new Timestamp(3))
 +      ).toDF("a", "b", "c", "d", "e") // Note column d's data type is Decimal(38, 18)
+      df.createTempView("v")
+
+      val query = sql(
+        s"""
+          |SELECT
+          |TRANSFORM(a, b, c, d, e)
+          |USING 'python $scriptFilePath' AS (a, b, c, d, e)
+          |FROM v
+        """.stripMargin)
+
 +      // In Hive 1.2, Decimal conversion is not handled well. For example, in this case,
 +      // it converts a decimal value's type from Decimal(38, 18) to Decimal(1, 0). So we need
 
 Review comment:
   This is a little confusing: if Hive 1.2 uses `Decimal(1, 0)` as the type, then `cast("decimal(1, 0)")` should be a no-op?
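
  A minimal sketch of the behavior under discussion (not from the PR; it assumes a Hive-enabled SparkSession and uses `cat` as a stand-in transform script). Without explicit output types in the `AS` clause, TRANSFORM columns come back as strings, so whatever cast is applied afterwards determines the decimal precision the expected values are compared with:

  ```scala
  // Hypothetical spark-shell check, assuming Hive support and a Unix `cat` binary.
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder()
    .appName("transform-decimal-check")
    .enableHiveSupport()
    .getOrCreate()
  import spark.implicits._

  // BigDecimal maps to decimal(38, 18) in the input schema.
  val df = Seq(BigDecimal(1.0), BigDecimal(2.0)).toDF("d")
  df.createOrReplaceTempView("v")

  // TRANSFORM output columns declared without types are read back as strings.
  val out = spark.sql("SELECT TRANSFORM(d) USING 'cat' AS (d) FROM v")
  out.printSchema()

  // Casting back would be a no-op only if the column were already decimal(1, 0),
  // which is exactly the question about the Hive 1.2 behavior above.
  out.select($"d".cast("decimal(1, 0)").as("d")).printSchema()
  ```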
