scarlin-cloudera commented on code in PR #4442:
URL: https://github.com/apache/hive/pull/4442#discussion_r1286174831
##########
ql/src/java/org/apache/hadoop/hive/ql/parse/relnodegen/LateralViewPlan.java:
##########
@@ -114,7 +115,7 @@ public LateralViewPlan(ASTNode lateralView, RelOptCluster cluster, RelNode input
this.lateralViewRel = HiveTableFunctionScan.create(cluster,
TraitsUtil.getDefaultTraitSet(cluster), ImmutableList.of(inputRel),
udtfCall,
- null, retType, null);
+ null, retType, createColumnMappings(inputRel));
Review Comment:
I'm not sure I understand the question here, but let me try to explain how I
made this work; I think it fits with what is meant by column mappings.
The idea of the TableFunctionScan is to create multiple rows from a single
incoming row. The columns in the new rows can be broken down into two
categories:
1) Columns with "exploded" or "created" data. For instance, if the input
row had a column of type array, each new row might contain one element from
that array.
2) Columns taken directly from the inputRel. This is denormalized data,
effectively "joined" to the data created in the columns described in 1).
The column mappings here only cover the columns described in 2). Thus, they
will always map to inputRel columns (from 2) and never to udtfCall columns
(from 1).
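The two categories above can be sketched in code. This is a minimal, self-contained illustration: `ColumnMapping` here is a stand-in for Calcite's `RelColumnMapping`, the helper's name echoes the PR's `createColumnMappings`, and the output-column ordering (pass-through inputRel columns first, UDTF-generated columns after) is an assumption for the example, not taken from the actual Hive implementation.

```java
import java.util.ArrayList;
import java.util.List;

public class ColumnMappingSketch {

  /** Illustrative stand-in for Calcite's RelColumnMapping: ties one output
   *  column of the TableFunctionScan back to a column of an input rel. */
  public static final class ColumnMapping {
    public final int outputColumn;  // position in the scan's output row
    public final int inputRel;      // which input RelNode (always 0 here)
    public final int inputColumn;   // position within that input rel

    public ColumnMapping(int outputColumn, int inputRel, int inputColumn) {
      this.outputColumn = outputColumn;
      this.inputRel = inputRel;
      this.inputColumn = inputColumn;
    }
  }

  /**
   * Only the pass-through columns (category 2) get a mapping; the
   * UDTF-created columns (category 1) have no source column in inputRel,
   * so they produce no entries at all.
   */
  public static List<ColumnMapping> createColumnMappings(int numInputCols,
      int numUdtfCols) {
    List<ColumnMapping> mappings = new ArrayList<>();
    for (int i = 0; i < numInputCols; i++) {
      // output column i simply copies input column i of rel 0
      mappings.add(new ColumnMapping(i, 0, i));
    }
    // intentionally no entries for the numUdtfCols generated columns
    return mappings;
  }

  public static void main(String[] args) {
    // e.g. input has 2 columns and explode() adds 1 generated column:
    List<ColumnMapping> m = createColumnMappings(2, 1);
    System.out.println(m.size());  // only the 2 pass-through columns map
  }
}
```

The point of the sketch is that the mapping set is built purely from inputRel's columns, which is why it never references udtfCall.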
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]