Repository: spark
Updated Branches:
  refs/heads/master c7a229d65 -> f08f6f431


[SPARK-23935][SQL][FOLLOWUP] mapEntry throws org.codehaus.commons.compiler.CompileException

## What changes were proposed in this pull request?

This PR fixes an exception thrown while compiling the code generated for `MapEntries`. The error occurs because the current code uses the `key` type, instead of the `value` type, to store a value when both the `key` and `value` types are primitive.

```
     val mid0 = Literal.create(Map(1 -> 1.1, 2 -> 2.2), MapType(IntegerType, DoubleType))
     checkEvaluation(MapEntries(mid0), Seq(r(1, 1.1), r(2, 2.2)))
```
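The mechanism behind the error can be sketched outside Spark's codegen. The stand-in `primitiveTypeName` below is a simplified, hypothetical version of `CodeGenerator.primitiveTypeName` (real signatures and types differ); it shows how deriving the setter name from the key type yields the mismatched `setInt(int, double)` call reported in the log:

```scala
// Hedged sketch, not Spark's actual codegen: illustrates why the setter name
// must come from the value type, not the key type.
object SetterNameSketch {
  // Simplified stand-in for CodeGenerator.primitiveTypeName
  def primitiveTypeName(javaType: String): String = javaType match {
    case "int"    => "Int"
    case "double" => "Double"
    case other    => other.capitalize
  }

  def main(args: Array[String]): Unit = {
    val keyType = "int"      // key type of MapType(IntegerType, DoubleType)
    val valueType = "double" // value type

    // Before the fix: setter name derived from the key type, so the generated
    // call is setInt(1, <double>), which Janino cannot resolve.
    val buggy = s"unsafeRow.set${primitiveTypeName(keyType)}(1, value);"

    // After the fix: setter name derived from the value type.
    val fixed = s"unsafeRow.set${primitiveTypeName(valueType)}(1, value);"

    println(buggy) // unsafeRow.setInt(1, value);
    println(fixed) // unsafeRow.setDouble(1, value);
  }
}
```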

```
[info]   Code generation of map_entries(keys: [1,2], values: [1.1,2.2]) failed:
[info]   java.util.concurrent.ExecutionException: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 80, Column 20: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 80, Column 20: No applicable constructor/method found for actual parameters "int, double"; candidates are: "public void org.apache.spark.sql.catalyst.expressions.UnsafeRow.setInt(int, int)", "public void org.apache.spark.sql.catalyst.InternalRow.setInt(int, int)"
[info]          at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:306)
[info]          at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:293)
[info]          at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
[info]          at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:135)
[info]          at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2410)
[info]          at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2380)
[info]          at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
[info]          at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257)
[info]          at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
[info]          at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4004)
[info]          at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
[info]          at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1290)
...
```

## How was this patch tested?

Added a new test to `CollectionExpressionsSuite`.

Closes #22033 from kiszk/SPARK-23935-followup.

Authored-by: Kazuaki Ishizaki <ishiz...@jp.ibm.com>
Signed-off-by: Takuya UESHIN <ues...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f08f6f43
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f08f6f43
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f08f6f43

Branch: refs/heads/master
Commit: f08f6f4314b16fb09c479f6537f99bda77e4c256
Parents: c7a229d
Author: Kazuaki Ishizaki <ishiz...@jp.ibm.com>
Authored: Wed Aug 8 14:38:55 2018 +0900
Committer: Takuya UESHIN <ues...@databricks.com>
Committed: Wed Aug 8 14:38:55 2018 +0900

----------------------------------------------------------------------
 .../spark/sql/catalyst/expressions/collectionOperations.scala      | 2 +-
 .../sql/catalyst/expressions/CollectionExpressionsSuite.scala      | 2 ++
 2 files changed, 3 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/f08f6f43/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
index ab06a5a..b37fdc6 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
@@ -426,7 +426,7 @@ case class MapEntries(child: Expression) extends UnaryExpression with ExpectsInp
     val structSize = UnsafeRow.calculateBitSetWidthInBytes(2) + wordSize * 2
     val structSizeAsLong = structSize + "L"
     val keyTypeName = CodeGenerator.primitiveTypeName(childDataType.keyType)
-    val valueTypeName = CodeGenerator.primitiveTypeName(childDataType.keyType)
+    val valueTypeName = CodeGenerator.primitiveTypeName(childDataType.valueType)
 
     val valueAssignment = s"$unsafeRow.set$valueTypeName(1, ${getValue(values)});"
     val valueAssignmentChecked = if (childDataType.valueContainsNull) {

http://git-wip-us.apache.org/repos/asf/spark/blob/f08f6f43/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CollectionExpressionsSuite.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CollectionExpressionsSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CollectionExpressionsSuite.scala
index 40487b3..7b345aa 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CollectionExpressionsSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CollectionExpressionsSuite.scala
@@ -90,10 +90,12 @@ class CollectionExpressionsSuite extends SparkFunSuite with ExpressionEvalHelper
     val mi0 = Literal.create(Map(1 -> 1, 2 -> null, 3 -> 2), MapType(IntegerType, IntegerType))
     val mi1 = Literal.create(Map[Int, Int](), MapType(IntegerType, IntegerType))
     val mi2 = Literal.create(null, MapType(IntegerType, IntegerType))
+    val mid0 = Literal.create(Map(1 -> 1.1, 2 -> 2.2), MapType(IntegerType, DoubleType))
 
     checkEvaluation(MapEntries(mi0), Seq(r(1, 1), r(2, null), r(3, 2)))
     checkEvaluation(MapEntries(mi1), Seq.empty)
     checkEvaluation(MapEntries(mi2), null)
+    checkEvaluation(MapEntries(mid0), Seq(r(1, 1.1), r(2, 2.2)))
 
     // Non-primitive-type keys/values
     val ms0 = Literal.create(Map("a" -> "c", "b" -> null), MapType(StringType, StringType))

