This is an automated email from the ASF dual-hosted git repository.
ngangam pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/hive.git
The following commit(s) were added to refs/heads/master by this push:
new dc625fd HIVE-25963: Addendum: ported unit test fixes (Sourabh Goyal via Naveen Gangam)
dc625fd is described below
commit dc625fd3c2d1a5bd7ef046ec6d2f549992d37857
Author: Naveen Gangam <[email protected]>
AuthorDate: Tue Mar 15 18:36:48 2022 -0400
HIVE-25963: Addendum: ported unit test fixes (Sourabh Goyal via Naveen Gangam)
Ported back commits from https://github.com/apache/hive/pull/3040/commits that are missing in the master branch
---
.../llap/enforce_constraint_notnull.q.out | 118 +++------------------
1 file changed, 15 insertions(+), 103 deletions(-)
diff --git a/ql/src/test/results/clientpositive/llap/enforce_constraint_notnull.q.out b/ql/src/test/results/clientpositive/llap/enforce_constraint_notnull.q.out
index ab50360..ada8b1c 100644
--- a/ql/src/test/results/clientpositive/llap/enforce_constraint_notnull.q.out
+++ b/ql/src/test/results/clientpositive/llap/enforce_constraint_notnull.q.out
@@ -6290,9 +6290,6 @@ STAGE PLANS:
Stage: Stage-1
Tez
#### A masked pattern was here ####
- Edges:
- Reducer 2 <- Map 1 (CUSTOM_SIMPLE_EDGE)
-#### A masked pattern was here ####
Vertices:
Map 1
Map Operator Tree:
@@ -6311,53 +6308,16 @@ STAGE PLANS:
expressions: col1 (type: int)
outputColumnNames: _col0
Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: COMPLETE
- Filter Operator
- predicate: enforce_constraint(_col0 is not null) (type: boolean)
+ File Output Operator
+ compressed: false
Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: COMPLETE
- File Output Operator
- compressed: false
- Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: COMPLETE
- table:
- input format: org.apache.hadoop.mapred.TextInputFormat
- output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
- serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
- name: default.tttemp
- Select Operator
- expressions: _col0 (type: int)
- outputColumnNames: i
- Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: COMPLETE
- Group By Operator
- aggregations: min(i), max(i), count(1), count(i), compute_bit_vector_hll(i)
- minReductionHashAggr: 0.4
- mode: hash
- outputColumnNames: _col0, _col1, _col2, _col3, _col4
- Statistics: Num rows: 1 Data size: 168 Basic stats: COMPLETE Column stats: COMPLETE
- Reduce Output Operator
- null sort order:
- sort order:
- Statistics: Num rows: 1 Data size: 168 Basic stats: COMPLETE Column stats: COMPLETE
- value expressions: _col0 (type: int), _col1 (type: int), _col2 (type: bigint), _col3 (type: bigint), _col4 (type: binary)
+ table:
+ input format: org.apache.hadoop.mapred.TextInputFormat
+ output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
+ serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
+ name: default.tttemp
Execution mode: llap
LLAP IO: no inputs
- Reducer 2
- Execution mode: vectorized, llap
- Reduce Operator Tree:
- Group By Operator
- aggregations: min(VALUE._col0), max(VALUE._col1), count(VALUE._col2), count(VALUE._col3), compute_bit_vector_hll(VALUE._col4)
- mode: mergepartial
- outputColumnNames: _col0, _col1, _col2, _col3, _col4
- Statistics: Num rows: 1 Data size: 168 Basic stats: COMPLETE Column stats: COMPLETE
- Select Operator
- expressions: 'LONG' (type: string), UDFToLong(_col0) (type: bigint), UDFToLong(_col1) (type: bigint), (_col2 - _col3) (type: bigint), COALESCE(ndv_compute_bit_vector(_col4),0) (type: bigint), _col4 (type: binary)
- outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5
- Statistics: Num rows: 1 Data size: 264 Basic stats: COMPLETE Column stats: COMPLETE
- File Output Operator
- compressed: false
- Statistics: Num rows: 1 Data size: 264 Basic stats: COMPLETE Column stats: COMPLETE
- table:
- input format: org.apache.hadoop.mapred.SequenceFileInputFormat
- output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
- serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
Stage: Stage-2
Dependency Collection
@@ -6375,10 +6335,6 @@ STAGE PLANS:
Stage: Stage-3
Stats Work
Basic Stats Work:
- Column Stats Desc:
- Columns: i
- Column Types: int
- Table: default.tttemp
PREHOOK: query: explain insert into tttemp select cast(key as int) from src
PREHOOK: type: QUERY
@@ -6398,9 +6354,6 @@ STAGE PLANS:
Stage: Stage-1
Tez
#### A masked pattern was here ####
- Edges:
- Reducer 2 <- Map 1 (CUSTOM_SIMPLE_EDGE)
-#### A masked pattern was here ####
Vertices:
Map 1
Map Operator Tree:
@@ -6411,53 +6364,16 @@ STAGE PLANS:
expressions: UDFToInteger(key) (type: int)
outputColumnNames: _col0
Statistics: Num rows: 500 Data size: 2000 Basic stats: COMPLETE Column stats: COMPLETE
- Filter Operator
- predicate: enforce_constraint(_col0 is not null) (type: boolean)
- Statistics: Num rows: 250 Data size: 1000 Basic stats: COMPLETE Column stats: COMPLETE
- File Output Operator
- compressed: false
- Statistics: Num rows: 250 Data size: 1000 Basic stats: COMPLETE Column stats: COMPLETE
- table:
- input format: org.apache.hadoop.mapred.TextInputFormat
- output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
- serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
- name: default.tttemp
- Select Operator
- expressions: _col0 (type: int)
- outputColumnNames: i
- Statistics: Num rows: 250 Data size: 1000 Basic stats: COMPLETE Column stats: COMPLETE
- Group By Operator
- aggregations: min(i), max(i), count(1), count(i), compute_bit_vector_hll(i)
- minReductionHashAggr: 0.99
- mode: hash
- outputColumnNames: _col0, _col1, _col2, _col3, _col4
- Statistics: Num rows: 1 Data size: 168 Basic stats: COMPLETE Column stats: COMPLETE
- Reduce Output Operator
- null sort order:
- sort order:
- Statistics: Num rows: 1 Data size: 168 Basic stats: COMPLETE Column stats: COMPLETE
- value expressions: _col0 (type: int), _col1 (type: int), _col2 (type: bigint), _col3 (type: bigint), _col4 (type: binary)
+ File Output Operator
+ compressed: false
+ Statistics: Num rows: 500 Data size: 2000 Basic stats: COMPLETE Column stats: COMPLETE
+ table:
+ input format: org.apache.hadoop.mapred.TextInputFormat
+ output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
+ serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
+ name: default.tttemp
Execution mode: vectorized, llap
LLAP IO: all inputs
- Reducer 2
- Execution mode: vectorized, llap
- Reduce Operator Tree:
- Group By Operator
- aggregations: min(VALUE._col0), max(VALUE._col1), count(VALUE._col2), count(VALUE._col3), compute_bit_vector_hll(VALUE._col4)
- mode: mergepartial
- outputColumnNames: _col0, _col1, _col2, _col3, _col4
- Statistics: Num rows: 1 Data size: 168 Basic stats: COMPLETE Column stats: COMPLETE
- Select Operator
- expressions: 'LONG' (type: string), UDFToLong(_col0) (type: bigint), UDFToLong(_col1) (type: bigint), (_col2 - _col3) (type: bigint), COALESCE(ndv_compute_bit_vector(_col4),0) (type: bigint), _col4 (type: binary)
- outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5
- Statistics: Num rows: 1 Data size: 264 Basic stats: COMPLETE Column stats: COMPLETE
- File Output Operator
- compressed: false
- Statistics: Num rows: 1 Data size: 264 Basic stats: COMPLETE Column stats: COMPLETE
- table:
- input format: org.apache.hadoop.mapred.SequenceFileInputFormat
- output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
- serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
Stage: Stage-2
Dependency Collection
@@ -6475,10 +6391,6 @@ STAGE PLANS:
Stage: Stage-3
Stats Work
Basic Stats Work:
- Column Stats Desc:
- Columns: i
- Column Types: int
- Table: default.tttemp
PREHOOK: query: drop table tttemp
PREHOOK: type: DROPTABLE