Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/9762#discussion_r45284891
--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/SQLQuerySuite.scala ---
@@ -1479,4 +1481,18 @@ class SQLQuerySuite extends QueryTest with SQLTestUtils with TestHiveSingleton {
         |FROM (SELECT '{"f1": "value1", "f2": 12}' json, 'hello' as str) test
       """.stripMargin), Row("value1", "12", 3.14, "hello"))
}
+
+  test("SPARK-11633: HiveContext throws TreeNode Exception : Failed to Copy Node") {
+    val rdd1 = sparkContext.parallelize(Seq(Individual(1, 3), Individual(2, 1)))
+    val df = hiveContext.createDataFrame(rdd1)
+    df.registerTempTable("foo")
--- End diff --
Hi @marmbrus,
I noticed that your way is more concise, but it does not create a LogicalRDD; what it creates instead is a LocalRelation. That is why I wrote the test case this way.
Let me post both analyzed logical plans for your convenience.
```
== Analyzed Logical Plan ==
F1: int
Project [F1#0]
 Join Inner, Some((F2#2 = F2#9))
  Subquery a
   Subquery foo2
    Project [f1#0,F2#1 AS F2#2]
     Subquery foo
      LogicalRDD [F1#0,F2#1], MapPartitionsRDD[1] at apply at Transformer.scala:22
  Subquery b
   Subquery foo3
    Project [F1#12,F2#13 AS F2#9]
     Subquery foo
      LogicalRDD [F1#12,F2#13], MapPartitionsRDD[1] at apply at Transformer.scala:22
```
```
== Analyzed Logical Plan ==
F1: int
Project [F1#2]
 Join Inner, Some((F2#4 = F2#11))
  Subquery a
   Subquery foo2
    Project [f1#2,F2#3 AS F2#4]
     Subquery foo
      Project [_1#0 AS F1#2,_2#1 AS F2#3]
       LocalRelation [_1#0,_2#1], [[1,3],[2,1]]
  Subquery b
   Subquery foo3
    Project [F1#14,F2#15 AS F2#11]
     Subquery foo
      Project [_1#0 AS F1#14,_2#1 AS F2#15]
       LocalRelation [_1#0,_2#1], [[1,3],[2,1]]
```
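For readers following along, here is a minimal sketch of the two construction paths that produce the different leaf nodes above. It assumes a running `TestHive`-style `hiveContext`/`sparkContext` and the `Individual(F1: Int, F2: Int)` case class used by the test; the value names are illustrative, not the PR's actual code.

```scala
// Hypothetical sketch, not the PR's code. Assumes hiveContext and
// sparkContext are in scope (e.g. inside a TestHiveSingleton suite).
case class Individual(F1: Int, F2: Int)

// Path 1: building the DataFrame from an RDD makes the analyzer plan
// bottom out in a LogicalRDD leaf, which is what reproduces SPARK-11633.
val rdd1 = sparkContext.parallelize(Seq(Individual(1, 3), Individual(2, 1)))
val dfFromRdd = hiveContext.createDataFrame(rdd1)

// Path 2: building it directly from a local Seq makes the plan bottom out
// in a LocalRelation leaf instead, so it does not exercise the bug.
val dfFromSeq = hiveContext.createDataFrame(Seq(Individual(1, 3), Individual(2, 1)))
```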