[jira] [Commented] (HIVE-19463) TezTask - getting groups may fail (PartialGroupNameException in some tests)

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19463?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478566#comment-16478566
 ] 

Hive QA commented on HIVE-19463:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923846/HIVE-19463.04.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 3 failed/errored test(s), 14407 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[groupby8] (batchId=77)
org.apache.hive.jdbc.TestActivePassiveHA.testActivePassiveHA (batchId=240)
org.apache.hive.jdbc.TestJdbcWithMiniHS2.testHttpRetryOnServerIdleTimeout 
(batchId=243)
{noformat}

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11019/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11019/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11019/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 3 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923846 - PreCommit-HIVE-Build

> TezTask - getting groups may fail (PartialGroupNameException in some tests)
> ---
>
> Key: HIVE-19463
> URL: https://issues.apache.org/jira/browse/HIVE-19463
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19463.01.patch, HIVE-19463.02.patch, 
> HIVE-19463.03.patch, HIVE-19463.03.patch, HIVE-19463.04.patch, 
> HIVE-19463.patch
>
>
> {noformat}
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping$PartialGroupNameException:
>  The user name 'hive_test_user' is not found. id: hive_test_user: no such user
> id: hive_test_user: no such user
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.resolvePartialGroupNames(ShellBasedUnixGroupsMapping.java:294)
>  ~[hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:207)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:97)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:384)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:319) 
> [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:269) 
> [hadoop-common-3.1.0.jar:?]
>   at 
> com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542)
>  [guava-19.0.jar:?]
>   at 
> com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323) 
> [guava-19.0.jar:?]
>   at 
> com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286)
>  [guava-19.0.jar:?]
>   at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201) 
> [guava-19.0.jar:?]
>   at com.google.common.cache.LocalCache.get(LocalCache.java:3953) 
> [guava-19.0.jar:?]
>   at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957) 
> [guava-19.0.jar:?]
>   at 
> com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875)
>  [guava-19.0.jar:?]
>   at org.apache.hadoop.security.Groups.getGroups(Groups.java:227) 
> [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1540)
>  [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:163) 
> [hive-exec-3.1.0-SNAPSHOT.jar:3.1.0-SNAPSHOT]
> {noformat}
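
The failure above comes from Hadoop's shell-based group mapping shelling out to 
`id` for a local user that does not exist on the test machine. Below is a 
minimal, hypothetical sketch of one way to sidestep the shell lookup in tests, 
using Hadoop's static group-mapping override property 
(hadoop.user.group.static.mapping.overrides); the group name is made up, and 
this is not necessarily the approach taken in the attached patches.

{code:java}
import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.Groups;

public class StaticGroupMappingSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Map the test user to a made-up group so no `id` shell call is needed.
    // Property format: "user1=group1,group2;user2=;user3=group2"
    conf.set("hadoop.user.group.static.mapping.overrides",
        "hive_test_user=hive_test_group;");
    Groups groups = Groups.getUserToGroupsMappingService(conf);
    // Resolved from the static map; ShellBasedUnixGroupsMapping is never hit.
    List<String> resolved = groups.getGroups("hive_test_user");
    System.out.println(resolved);
  }
}
{code}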



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19485) dump directory for non native tables should not be created

2018-05-16 Thread anishek (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19485?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

anishek updated HIVE-19485:
---
Attachment: (was: HIVE-19485.2.patch)

> dump directory for non native tables should not be created
> --
>
> Key: HIVE-19485
> URL: https://issues.apache.org/jira/browse/HIVE-19485
> Project: Hive
>  Issue Type: Bug
>Affects Versions: 3.1.0
>Reporter: anishek
>Assignee: anishek
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.1.0
>
> Attachments: HIVE-19485.0.patch, HIVE-19485.1.patch, 
> HIVE-19485.2.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19485) dump directory for non native tables should not be created

2018-05-16 Thread anishek (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19485?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

anishek updated HIVE-19485:
---
Attachment: HIVE-19485.2.patch

> dump directory for non native tables should not be created
> --
>
> Key: HIVE-19485
> URL: https://issues.apache.org/jira/browse/HIVE-19485
> Project: Hive
>  Issue Type: Bug
>Affects Versions: 3.1.0
>Reporter: anishek
>Assignee: anishek
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.1.0
>
> Attachments: HIVE-19485.0.patch, HIVE-19485.1.patch, 
> HIVE-19485.2.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19485) dump directory for non native tables should not be created

2018-05-16 Thread anishek (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19485?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

anishek updated HIVE-19485:
---
Attachment: HIVE-19485.2.patch

> dump directory for non native tables should not be created
> --
>
> Key: HIVE-19485
> URL: https://issues.apache.org/jira/browse/HIVE-19485
> Project: Hive
>  Issue Type: Bug
>Affects Versions: 3.1.0
>Reporter: anishek
>Assignee: anishek
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.1.0
>
> Attachments: HIVE-19485.0.patch, HIVE-19485.1.patch, 
> HIVE-19485.2.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19463) TezTask - getting groups may fail (PartialGroupNameException in some tests)

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19463?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478526#comment-16478526
 ] 

Hive QA commented on HIVE-19463:


| (/) *{color:green}+1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  7m 
17s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m  
2s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
38s{color} | {color:green} master passed {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  3m 
47s{color} | {color:blue} ql in master has 2320 extant Findbugs warnings. 
{color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  0m 
54s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  1m 
20s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m  
0s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  1m  
0s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
36s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 1s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  4m  
8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  0m 
55s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
12s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 22m 20s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  findbugs  checkstyle  compile  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-11019/dev-support/hive-personality.sh
 |
| git revision | master / b329afa |
| Default Java | 1.8.0_111 |
| findbugs | v3.0.0 |
| modules | C: ql U: ql |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11019/yetus.txt |
| Powered by | Apache Yetus http://yetus.apache.org |


This message was automatically generated.



> TezTask - getting groups may fail (PartialGroupNameException in some tests)
> ---
>
> Key: HIVE-19463
> URL: https://issues.apache.org/jira/browse/HIVE-19463
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19463.01.patch, HIVE-19463.02.patch, 
> HIVE-19463.03.patch, HIVE-19463.03.patch, HIVE-19463.04.patch, 
> HIVE-19463.patch
>
>
> {noformat}
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping$PartialGroupNameException:
>  The user name 'hive_test_user' is not found. id: hive_test_user: no such user
> id: hive_test_user: no such user
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.resolvePartialGroupNames(ShellBasedUnixGroupsMapping.java:294)
>  ~[hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:207)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:97)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:384)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:319) 
> [hadoop-common-3.1.0.jar:?]
>   at 
> 

[jira] [Updated] (HIVE-19570) Multiple inserts using "Group by" and "Distinct" generates incorrect results

2018-05-16 Thread Riju Trivedi (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19570?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Riju Trivedi updated HIVE-19570:

Summary: Multiple inserts using "Group by" and "Distinct"  generates 
incorrect results  (was: Multiple inserts using "Group by" generates incorrect 
results)

> Multiple inserts using "Group by" and "Distinct"  generates incorrect results
> -
>
> Key: HIVE-19570
> URL: https://issues.apache.org/jira/browse/HIVE-19570
> Project: Hive
>  Issue Type: Bug
>  Components: Logical Optimizer, Query Processor
>Affects Versions: 1.2.0, 3.0.0
>Reporter: Riju Trivedi
>Priority: Critical
>
> Repro steps:
> {code}
> drop database if exists ax1 cascade;
> create database ax1;
> use ax1;
> CREATE TABLE 
>   tmp1 ( 
>   v1 string , v2 string , v3 string ) 
> ROW FORMAT DELIMITED 
> FIELDS TERMINATED BY '\t' 
> LINES TERMINATED BY '\n' 
> ;
> INSERT INTO tmp1
> VALUES 
> ('a', 'b', 'c1') 
> , ('a', 'b', 'c2') 
> , ('d', 'e', 'f') 
> , ('g', 'h', 'i') 
> ;
> CREATE TABLE 
> tmp_grouped_by_one_col  ( v1 string , cnt__v2 int , cnt__v3 int ) 
> ROW FORMAT DELIMITED 
> FIELDS TERMINATED BY '\t' 
> LINES TERMINATED BY '\n' 
> ;
> CREATE TABLE 
> tmp_grouped_by_two_col ( v1 string , v2 string , cnt__v3 int ) 
> ROW FORMAT DELIMITED 
> FIELDS TERMINATED BY '\t' 
> LINES TERMINATED BY '\n' 
> ;
> CREATE TABLE 
> tmp_grouped_by_all_col ( v1 string , v2 string , v3 string ) 
> ROW FORMAT DELIMITED 
> FIELDS TERMINATED BY '\t' 
> LINES TERMINATED BY '\n' 
> ;
> FROM tmp1
> INSERT INTO tmp_grouped_by_one_col 
> SELECT v1, count(distinct v2), count(distinct v3) 
> GROUP BY v1
> INSERT INTO tmp_grouped_by_all_col 
> SELECT v1, v2, v3
> GROUP BY v1, v2, v3
> ;
> select 'tmp_grouped_by_one_col',count(*) from tmp_grouped_by_one_col
> union all
> select 'tmp_grouped_by_two_col',count(*) from tmp_grouped_by_two_col
> union all
> select 'tmp_grouped_by_all_col',count(*) from tmp_grouped_by_all_col;
> select * from tmp_grouped_by_all_col;
> {code}
> The tmp_grouped_by_all_col table should have 4 records, but 7 records are 
> loaded into the table.
> {code}
> +-----------------------------+-----------------------------+-----------------------------+
> | tmp_grouped_by_all_col.v1   | tmp_grouped_by_all_col.v2   | tmp_grouped_by_all_col.v3   |
> +-----------------------------+-----------------------------+-----------------------------+
> | a                           | b                           | b                           |
> | a                           | c1                          | c1                          |
> | a                           | c2                          | c2                          |
> | d                           | e                           | e                           |
> | d                           | f                           | f                           |
> | g                           | h                           | h                           |
> | g                           | i                           | i                           |
> +-----------------------------+-----------------------------+-----------------------------+
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19579) remove HBase transitive dependency that drags in some snapshot

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478513#comment-16478513
 ] 

Hive QA commented on HIVE-19579:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923842/HIVE-19579.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 3 failed/errored test(s), 14407 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[union_stats]
 (batchId=159)
org.apache.hadoop.hive.metastore.TestHiveMetaStoreAlterColumnPar.org.apache.hadoop.hive.metastore.TestHiveMetaStoreAlterColumnPar
 (batchId=229)
org.apache.hive.hcatalog.pig.TestSequenceFileHCatStorer.testWriteDate2 
(batchId=196)
{noformat}

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11018/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11018/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11018/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 3 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923842 - PreCommit-HIVE-Build

> remove HBase transitive dependency that drags in some snapshot
> --
>
> Key: HIVE-19579
> URL: https://issues.apache.org/jira/browse/HIVE-19579
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19579.patch
>
>
> This:
> {noformat}
> [INFO] org.apache.hive:hive-llap-server:jar:3.1.0-SNAPSHOT
> [INFO] \- org.apache.hbase:hbase-server:jar:2.0.0-alpha4:compile
> [INFO]\- org.glassfish.web:javax.servlet.jsp:jar:2.3.2:compile
> [INFO]   \- org.glassfish:javax.el:jar:3.0.1-b10:compile (version 
> selected from constraint [3.0.0,))
> {noformat}
> occasionally results in bizarre build failures like this:
> {noformat}
> Failed to read artifact descriptor for 
> org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact 
> org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots 
> (https://maven.java.net/content/repositories/snapshots): Failed to transfer 
> file: 
> https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom.
>  Return code is: 402 , ReasonPhrase:Payment Required.
> {noformat}
> cc [~enis] [~stack] if you guys want to take a look at HBase side



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Attachment: HIVE-19308.5.patch

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch, HIVE-19308.4.patch, HIVE-19308.5.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.
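
For context, the stock Arrow Java stream reader already follows this 
wrap-an-InputStream pattern; a minimal sketch is below. It is not the reader 
class added by this patch, only an illustration of consuming Arrow record 
batches from a socket stream (the ArrowStreamReader package has moved between 
Arrow releases, so the import may differ for the Arrow version Hive bundles).

{code:java}
import java.io.InputStream;
import org.apache.arrow.memory.RootAllocator;
import org.apache.arrow.vector.VectorSchemaRoot;
import org.apache.arrow.vector.ipc.ArrowStreamReader;

public final class ArrowStreamSketch {
  // Reads Arrow record batches from a stream (e.g. a socket InputStream)
  // and reports the row count of each batch.
  public static void readAll(InputStream in) throws Exception {
    try (RootAllocator allocator = new RootAllocator(Long.MAX_VALUE);
         ArrowStreamReader reader = new ArrowStreamReader(in, allocator)) {
      VectorSchemaRoot root = reader.getVectorSchemaRoot();
      while (reader.loadNextBatch()) {
        System.out.println("rows in batch: " + root.getRowCount());
      }
    }
  }
}
{code}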



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Status: Patch Available  (was: Open)

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch, HIVE-19308.4.patch, HIVE-19308.5.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Status: Open  (was: Patch Available)

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch, HIVE-19308.4.patch, HIVE-19308.5.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19579) remove HBase transitive dependency that drags in some snapshot

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478503#comment-16478503
 ] 

Hive QA commented on HIVE-19579:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
32s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  7m 
42s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  8m 
17s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  7m 
45s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m  
7s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  9m 
59s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  8m 
42s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  8m 
42s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} whitespace {color} | {color:red}  0m  
0s{color} | {color:red} The patch has 1 line(s) that end in whitespace. Use git 
apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply 
{color} |
| {color:green}+1{color} | {color:green} xml {color} | {color:green}  0m 
11s{color} | {color:green} The patch has no ill-formed XML file. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  8m 
18s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
11s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 52m 32s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  xml  compile  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-11018/dev-support/hive-personality.sh
 |
| git revision | master / b329afa |
| Default Java | 1.8.0_111 |
| whitespace | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11018/yetus/whitespace-eol.txt
 |
| modules | C: . hbase-handler itests/hcatalog-unit itests/hive-minikdc 
itests/hive-unit itests/hive-unit-hadoop2 itests/qtest itests/qtest-accumulo 
itests/qtest-spark itests/util llap-server U: . |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11018/yetus.txt |
| Powered by | Apache Yetus http://yetus.apache.org |


This message was automatically generated.



> remove HBase transitive dependency that drags in some snapshot
> --
>
> Key: HIVE-19579
> URL: https://issues.apache.org/jira/browse/HIVE-19579
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19579.patch
>
>
> This:
> {noformat}
> [INFO] org.apache.hive:hive-llap-server:jar:3.1.0-SNAPSHOT
> [INFO] \- org.apache.hbase:hbase-server:jar:2.0.0-alpha4:compile
> [INFO]\- org.glassfish.web:javax.servlet.jsp:jar:2.3.2:compile
> [INFO]   \- org.glassfish:javax.el:jar:3.0.1-b10:compile (version 
> selected from constraint [3.0.0,))
> {noformat}
> occasionally results in bizarre build failures like this:
> {noformat}
> Failed to read artifact descriptor for 
> org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact 
> org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots 
> (https://maven.java.net/content/repositories/snapshots): Failed to transfer 
> file: 
> https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom.
>  Return code is: 402 , ReasonPhrase:Payment Required.
> {noformat}
> cc [~enis] [~stack] if you guys want to take a look at HBase side



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-18748) Rename table impacts the ACID behavior as table names are not updated in meta-tables.

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18748?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-18748:
--
Attachment: HIVE-18748.01-branch-3.patch

> Rename table impacts the ACID behavior as table names are not updated in 
> meta-tables.
> -
>
> Key: HIVE-18748
> URL: https://issues.apache.org/jira/browse/HIVE-18748
> Project: Hive
>  Issue Type: Sub-task
>  Components: HiveServer2, Transactions
>Affects Versions: 3.0.0
>Reporter: Sankar Hariappan
>Assignee: Eugene Koifman
>Priority: Critical
>  Labels: ACID, DDL
> Attachments: HIVE-18748.01-branch-3.patch, HIVE-18748.02.patch, 
> HIVE-18748.03.patch, HIVE-18748.04.patch, HIVE-18748.05.patch
>
>
> The ACID implementation uses metatables such as TXN_COMPONENTS, 
> COMPLETED_TXN_COMPONENTS, COMPACTION_QUEUE, COMPLETED_COMPACTION_QUEUE, etc., 
> to manage ACID operations.
> The per-table write ID implementation (HIVE-18192) introduces a couple of 
> metatables, such as NEXT_WRITE_ID and TXN_TO_WRITE_ID, to manage write IDs 
> allocated per table.
> Now, when we rename a table, it is necessary to update the corresponding 
> table names in these metatables as well. Otherwise, ACID table operations 
> won't work properly.
> Since this change is significant and has other side effects, we propose to 
> disable renaming of ACID tables until a fix is figured out.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-18748) Rename table impacts the ACID behavior as table names are not updated in meta-tables.

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18748?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-18748:
--
Summary: Rename table impacts the ACID behavior as table names are not 
updated in meta-tables.  (was: Rename table impacts the ACID behaviour as table 
names are not updated in meta-tables.)

> Rename table impacts the ACID behavior as table names are not updated in 
> meta-tables.
> -
>
> Key: HIVE-18748
> URL: https://issues.apache.org/jira/browse/HIVE-18748
> Project: Hive
>  Issue Type: Sub-task
>  Components: HiveServer2, Transactions
>Affects Versions: 3.0.0
>Reporter: Sankar Hariappan
>Assignee: Eugene Koifman
>Priority: Critical
>  Labels: ACID, DDL
> Attachments: HIVE-18748.02.patch, HIVE-18748.03.patch, 
> HIVE-18748.04.patch, HIVE-18748.05.patch
>
>
> The ACID implementation uses metatables such as TXN_COMPONENTS, 
> COMPLETED_TXN_COMPONENTS, COMPACTION_QUEUE, COMPLETED_COMPACTION_QUEUE, etc., 
> to manage ACID operations.
> The per-table write ID implementation (HIVE-18192) introduces a couple of 
> metatables, such as NEXT_WRITE_ID and TXN_TO_WRITE_ID, to manage write IDs 
> allocated per table.
> Now, when we rename a table, it is necessary to update the corresponding 
> table names in these metatables as well. Otherwise, ACID table operations 
> won't work properly.
> Since this change is significant and has other side effects, we propose to 
> disable renaming of ACID tables until a fix is figured out.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19579) remove HBase transitive dependency that drags in some snapshot

2018-05-16 Thread Alan Gates (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478458#comment-16478458
 ] 

Alan Gates commented on HIVE-19579:
---

+1

> remove HBase transitive dependency that drags in some snapshot
> --
>
> Key: HIVE-19579
> URL: https://issues.apache.org/jira/browse/HIVE-19579
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19579.patch
>
>
> This:
> {noformat}
> [INFO] org.apache.hive:hive-llap-server:jar:3.1.0-SNAPSHOT
> [INFO] \- org.apache.hbase:hbase-server:jar:2.0.0-alpha4:compile
> [INFO]\- org.glassfish.web:javax.servlet.jsp:jar:2.3.2:compile
> [INFO]   \- org.glassfish:javax.el:jar:3.0.1-b10:compile (version 
> selected from constraint [3.0.0,))
> {noformat}
> occasionally results in bizarre build failures like this:
> {noformat}
> Failed to read artifact descriptor for 
> org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact 
> org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots 
> (https://maven.java.net/content/repositories/snapshots): Failed to transfer 
> file: 
> https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom.
>  Return code is: 402 , ReasonPhrase:Payment Required.
> {noformat}
> cc [~enis] [~stack] if you guys want to take a look at HBase side



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-18866) Semijoin: Implement a Long -> Hash64 vector fast-path

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-18866:

Status: Patch Available  (was: Open)

[~prasanth_j] can you review?

> Semijoin: Implement a Long -> Hash64 vector fast-path
> -
>
> Key: HIVE-18866
> URL: https://issues.apache.org/jira/browse/HIVE-18866
> Project: Hive
>  Issue Type: Improvement
>  Components: Vectorization
>Reporter: Gopal V
>Assignee: Sergey Shelukhin
>Priority: Major
>  Labels: performance
> Attachments: 0001-hash64-WIP.patch, HIVE-18866.patch, 
> perf-hash64-long.png
>
>
> A significant amount of CPU is wasted with JMM restrictions on byte[] arrays.
> To transform from one Long -> another Long, this goes into a byte[] array, 
> which shows up as a hotspot.
> !perf-hash64-long.png!
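
To make the fast-path idea concrete, here is a minimal sketch of mixing a long 
directly into a 64-bit hash without first serializing it into a byte[]. The mix 
below is the well-known MurmurHash3 fmix64 finalizer, used purely as an 
illustration; it is not Hive's Murmur3 hash64 and does not produce the same 
values as hashing the 8-byte encoding of the long.

{code:java}
public final class LongHash64Sketch {
  private LongHash64Sketch() {}

  // MurmurHash3 fmix64 finalizer: a 64-bit mix applied to the long itself,
  // avoiding the byte[] round trip called out in the description above.
  public static long hash64(long k) {
    k ^= k >>> 33;
    k *= 0xff51afd7ed558ccdL;
    k ^= k >>> 33;
    k *= 0xc4ceb9fe1a85ec53L;
    k ^= k >>> 33;
    return k;
  }

  public static void main(String[] args) {
    System.out.println(Long.toHexString(hash64(42L)));
  }
}
{code}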



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18866) Semijoin: Implement a Long -> Hash64 vector fast-path

2018-05-16 Thread Sergey Shelukhin (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18866?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478452#comment-16478452
 ] 

Sergey Shelukhin commented on HIVE-18866:
-

Added a test and fixed the short impl that was wrong for some reason (even 
after fixing 32 to 16). 

> Semijoin: Implement a Long -> Hash64 vector fast-path
> -
>
> Key: HIVE-18866
> URL: https://issues.apache.org/jira/browse/HIVE-18866
> Project: Hive
>  Issue Type: Improvement
>  Components: Vectorization
>Reporter: Gopal V
>Assignee: Sergey Shelukhin
>Priority: Major
>  Labels: performance
> Attachments: 0001-hash64-WIP.patch, HIVE-18866.patch, 
> perf-hash64-long.png
>
>
> A significant amount of CPU is wasted with JMM restrictions on byte[] arrays.
> To transform from one Long -> another Long, this goes into a byte[] array, 
> which shows up as a hotspot.
> !perf-hash64-long.png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18866) Semijoin: Implement a Long -> Hash64 vector fast-path

2018-05-16 Thread Sergey Shelukhin (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18866?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478454#comment-16478454
 ] 

Sergey Shelukhin commented on HIVE-18866:
-

Assigned to me for now for the convenience of attaching files and pressing 
buttons; most of the patch is still [~gopalv]'s.

> Semijoin: Implement a Long -> Hash64 vector fast-path
> -
>
> Key: HIVE-18866
> URL: https://issues.apache.org/jira/browse/HIVE-18866
> Project: Hive
>  Issue Type: Improvement
>  Components: Vectorization
>Reporter: Gopal V
>Assignee: Sergey Shelukhin
>Priority: Major
>  Labels: performance
> Attachments: 0001-hash64-WIP.patch, HIVE-18866.patch, 
> perf-hash64-long.png
>
>
> A significant amount of CPU is wasted with JMM restrictions on byte[] arrays.
> To transform from one Long -> another Long, this goes into a byte[] array, 
> which shows up as a hotspot.
> !perf-hash64-long.png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (HIVE-18866) Semijoin: Implement a Long -> Hash64 vector fast-path

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin reassigned HIVE-18866:
---

Assignee: Sergey Shelukhin

> Semijoin: Implement a Long -> Hash64 vector fast-path
> -
>
> Key: HIVE-18866
> URL: https://issues.apache.org/jira/browse/HIVE-18866
> Project: Hive
>  Issue Type: Improvement
>  Components: Vectorization
>Reporter: Gopal V
>Assignee: Sergey Shelukhin
>Priority: Major
>  Labels: performance
> Attachments: 0001-hash64-WIP.patch, HIVE-18866.patch, 
> perf-hash64-long.png
>
>
> A significant amount of CPU is wasted with JMM restrictions on byte[] arrays.
> To transform from one Long -> another Long, this goes into a byte[] array, 
> which shows up as a hotspot.
> !perf-hash64-long.png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-18866) Semijoin: Implement a Long -> Hash64 vector fast-path

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-18866:

Attachment: HIVE-18866.patch

> Semijoin: Implement a Long -> Hash64 vector fast-path
> -
>
> Key: HIVE-18866
> URL: https://issues.apache.org/jira/browse/HIVE-18866
> Project: Hive
>  Issue Type: Improvement
>  Components: Vectorization
>Reporter: Gopal V
>Assignee: Sergey Shelukhin
>Priority: Major
>  Labels: performance
> Attachments: 0001-hash64-WIP.patch, HIVE-18866.patch, 
> perf-hash64-long.png
>
>
> A significant amount of CPU is wasted with JMM restrictions on byte[] arrays.
> To transform from one Long -> another Long, this goes into a byte[] array, 
> which shows up as a hotspot.
> !perf-hash64-long.png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (HIVE-19417) Modify metastore to have/access persistent tables for stats

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19417?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman reassigned HIVE-19417:
-

Assignee: Eugene Koifman  (was: Steve Yeom)

> Modify metastore to have/access persistent tables for stats
> ---
>
> Key: HIVE-19417
> URL: https://issues.apache.org/jira/browse/HIVE-19417
> Project: Hive
>  Issue Type: Sub-task
>  Components: Transactions
>Affects Versions: 3.0.0
>Reporter: Steve Yeom
>Assignee: Eugene Koifman
>Priority: Major
> Attachments: HIVE-19417.01.patch, HIVE-19417.02.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19258) add originals support to MM tables (and make the conversion a metadata only operation)

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19258?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19258:

Attachment: HIVE-19258.09.patch

> add originals support to MM tables (and make the conversion a metadata only 
> operation)
> --
>
> Key: HIVE-19258
> URL: https://issues.apache.org/jira/browse/HIVE-19258
> Project: Hive
>  Issue Type: Bug
>  Components: Transactions
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19258.01.patch, HIVE-19258.02.patch, 
> HIVE-19258.03.patch, HIVE-19258.04.patch, HIVE-19258.05.patch, 
> HIVE-19258.06.patch, HIVE-19258.07.patch, HIVE-19258.08.patch, 
> HIVE-19258.08.patch, HIVE-19258.09.patch, HIVE-19258.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19258) add originals support to MM tables (and make the conversion a metadata only operation)

2018-05-16 Thread Sergey Shelukhin (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19258?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478448#comment-16478448
 ] 

Sergey Shelukhin commented on HIVE-19258:
-

Heh, the only time HiveQA actually worked and I accidentally attached the 
previous patch.

> add originals support to MM tables (and make the conversion a metadata only 
> operation)
> --
>
> Key: HIVE-19258
> URL: https://issues.apache.org/jira/browse/HIVE-19258
> Project: Hive
>  Issue Type: Bug
>  Components: Transactions
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19258.01.patch, HIVE-19258.02.patch, 
> HIVE-19258.03.patch, HIVE-19258.04.patch, HIVE-19258.05.patch, 
> HIVE-19258.06.patch, HIVE-19258.07.patch, HIVE-19258.08.patch, 
> HIVE-19258.08.patch, HIVE-19258.09.patch, HIVE-19258.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19463) TezTask - getting groups may fail (PartialGroupNameException in some tests)

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19463?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19463:

Attachment: HIVE-19463.04.patch

> TezTask - getting groups may fail (PartialGroupNameException in some tests)
> ---
>
> Key: HIVE-19463
> URL: https://issues.apache.org/jira/browse/HIVE-19463
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19463.01.patch, HIVE-19463.02.patch, 
> HIVE-19463.03.patch, HIVE-19463.03.patch, HIVE-19463.04.patch, 
> HIVE-19463.patch
>
>
> {noformat}
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping$PartialGroupNameException:
>  The user name 'hive_test_user' is not found. id: hive_test_user: no such user
> id: hive_test_user: no such user
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.resolvePartialGroupNames(ShellBasedUnixGroupsMapping.java:294)
>  ~[hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:207)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:97)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:384)
>  [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:319) 
> [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:269) 
> [hadoop-common-3.1.0.jar:?]
>   at 
> com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542)
>  [guava-19.0.jar:?]
>   at 
> com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323) 
> [guava-19.0.jar:?]
>   at 
> com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286)
>  [guava-19.0.jar:?]
>   at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201) 
> [guava-19.0.jar:?]
>   at com.google.common.cache.LocalCache.get(LocalCache.java:3953) 
> [guava-19.0.jar:?]
>   at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957) 
> [guava-19.0.jar:?]
>   at 
> com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875)
>  [guava-19.0.jar:?]
>   at org.apache.hadoop.security.Groups.getGroups(Groups.java:227) 
> [hadoop-common-3.1.0.jar:?]
>   at 
> org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1540)
>  [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:163) 
> [hive-exec-3.1.0-SNAPSHOT.jar:3.1.0-SNAPSHOT]
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19516) TestNegative merge_negative_5 and mm_concatenate are causing timeouts

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19516?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19516:

Attachment: HIVE-19418.02.patch

> TestNegative merge_negative_5 and mm_concatenate are causing timeouts
> -
>
> Key: HIVE-19516
> URL: https://issues.apache.org/jira/browse/HIVE-19516
> Project: Hive
>  Issue Type: Bug
>  Components: Testing Infrastructure
>Reporter: Vineet Garg
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19516.01.patch, HIVE-19516.02.patch, 
> HIVE-19516.patch
>
>
> I haven't tried to reproduce this in isolation but it is reproducible if you 
> run in batch on local system 
> {noformat}
> mvn -B test  -Dtest.groups= -Dtest=TestNegativeCliDriver 
> 

[jira] [Updated] (HIVE-19418) add background stats updater similar to compactor

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19418?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19418:

Attachment: HIVE-19418.02.patch

> add background stats updater similar to compactor
> -
>
> Key: HIVE-19418
> URL: https://issues.apache.org/jira/browse/HIVE-19418
> Project: Hive
>  Issue Type: Bug
>  Components: Transactions
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19418.01.patch, HIVE-19418.02.patch, 
> HIVE-19418.patch
>
>
> There's a JIRA HIVE-19416 to add snapshot version to stats for MM/ACID tables 
> to make them usable in a transaction without breaking ACID (for metadata-only 
> optimization). However, stats for ACID tables can still become unusable if 
> e.g. two parallel inserts run - neither sees the data written by the other, 
> so after both finish, the snapshots on either set of stats won't match the 
> current snapshot and the stats will be unusable.
> Additionally, for ACID and non-ACID tables alike, a lot of the stats, with 
> some exceptions like numRows, cannot be aggregated (i.e. you cannot combine 
> ndvs from two inserts), and for ACID even less can be aggregated (you cannot 
> derive min/max if some rows are deleted but you don't scan the rest of the 
> dataset).
> Therefore we will add background logic to metastore (similar to, and 
> partially inside, the ACID compactor) to update stats.
> It will have 3 modes of operation.
> 1) Off.
> 2) Update only the stats that exist but are out of date (generating stats can 
> be expensive, so if the user is only analyzing a subset of tables it should 
> be able to only update that subset). We can simply look at existing stats and 
> only analyze for the relevant partitions and columns.
> 3) On: 2 + create stats for all tables and columns missing stats.
> There will also be a table parameter to skip stats update. 
> In phase 1, the process will operate outside of compactor, and run analyze 
> command on the table. The analyze command will automatically save the stats 
> with ACID snapshot information if needed, based on HIVE-19416, so we don't 
> need to do any special state management and this will work for all table 
> types. However it's also more expensive.
> In phase 2, we can explore adding stats collection during MM compaction that 
> uses a temp table. If we don't have open writers during major compaction (so 
> we overwrite all of the data), the temp table stats can simply be copied over 
> to the main table with correct snapshot information, saving us a table scan.
> In phase 3, we can add custom stats collection logic to full ACID compactor 
> that is not query based, the same way as we'd do for (2). Alternatively we 
> can wait for ACID compactor to become query based and just reuse (2).



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19516) TestNegative merge_negative_5 and mm_concatenate are causing timeouts

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19516?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19516:

Attachment: HIVE-19516.02.patch

> TestNegative merge_negative_5 and mm_concatenate are causing timeouts
> -
>
> Key: HIVE-19516
> URL: https://issues.apache.org/jira/browse/HIVE-19516
> Project: Hive
>  Issue Type: Bug
>  Components: Testing Infrastructure
>Reporter: Vineet Garg
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19516.01.patch, HIVE-19516.02.patch, 
> HIVE-19516.patch
>
>
> I haven't tried to reproduce this in isolation but it is reproducible if you 
> run in batch on local system 
> {noformat}
> mvn -B test  -Dtest.groups= -Dtest=TestNegativeCliDriver 
> 

[jira] [Updated] (HIVE-19516) TestNegative merge_negative_5 and mm_concatenate are causing timeouts

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19516?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19516:

Attachment: (was: HIVE-19418.02.patch)

> TestNegative merge_negative_5 and mm_concatenate are causing timeouts
> -
>
> Key: HIVE-19516
> URL: https://issues.apache.org/jira/browse/HIVE-19516
> Project: Hive
>  Issue Type: Bug
>  Components: Testing Infrastructure
>Reporter: Vineet Garg
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19516.01.patch, HIVE-19516.02.patch, 
> HIVE-19516.patch
>
>
> I haven't tried to reproduce this in isolation but it is reproducible if you 
> run in batch on local system 
> {noformat}
> mvn -B test  -Dtest.groups= -Dtest=TestNegativeCliDriver 
> 

[jira] [Updated] (HIVE-19579) remove HBase transitive dependency that drags in some snapshot

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19579:

Assignee: Sergey Shelukhin
  Status: Patch Available  (was: Open)

[~alangates] can you take a look? thnx

> remove HBase transitive dependency that drags in some snapshot
> --
>
> Key: HIVE-19579
> URL: https://issues.apache.org/jira/browse/HIVE-19579
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19579.patch
>
>
> This:
> {noformat}
> [INFO] org.apache.hive:hive-llap-server:jar:3.1.0-SNAPSHOT
> [INFO] \- org.apache.hbase:hbase-server:jar:2.0.0-alpha4:compile
> [INFO]\- org.glassfish.web:javax.servlet.jsp:jar:2.3.2:compile
> [INFO]   \- org.glassfish:javax.el:jar:3.0.1-b10:compile (version 
> selected from constraint [3.0.0,))
> {noformat}
> occasionally results in bizarre build failures like this:
> {noformat}
> Failed to read artifact descriptor for 
> org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact 
> org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots 
> (https://maven.java.net/content/repositories/snapshots): Failed to transfer 
> file: 
> https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom.
>  Return code is: 402 , ReasonPhrase:Payment Required.
> {noformat}
> cc [~enis] [~stack] if you guys want to take a look at HBase side



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19579) remove HBase transitive dependency that drags in some snapshot

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19579:

Description: 
This:
{noformat}
[INFO] org.apache.hive:hive-llap-server:jar:3.1.0-SNAPSHOT
[INFO] \- org.apache.hbase:hbase-server:jar:2.0.0-alpha4:compile
[INFO]\- org.glassfish.web:javax.servlet.jsp:jar:2.3.2:compile
[INFO]   \- org.glassfish:javax.el:jar:3.0.1-b10:compile (version selected 
from constraint [3.0.0,))
{noformat}
occasionally results in bizarre build failures like this:
{noformat}
Failed to read artifact descriptor for 
org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact 
org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots 
(https://maven.java.net/content/repositories/snapshots): Failed to transfer 
file: 
https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom.
 Return code is: 402 , ReasonPhrase:Payment Required.
{noformat}

cc [~enis] [~stack] if you guys want to take a look at HBase side

> remove HBase transitive dependency that drags in some snapshot
> --
>
> Key: HIVE-19579
> URL: https://issues.apache.org/jira/browse/HIVE-19579
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19579.patch
>
>
> This:
> {noformat}
> [INFO] org.apache.hive:hive-llap-server:jar:3.1.0-SNAPSHOT
> [INFO] \- org.apache.hbase:hbase-server:jar:2.0.0-alpha4:compile
> [INFO]\- org.glassfish.web:javax.servlet.jsp:jar:2.3.2:compile
> [INFO]   \- org.glassfish:javax.el:jar:3.0.1-b10:compile (version 
> selected from constraint [3.0.0,))
> {noformat}
> occasionally results in bizarre build failures like this:
> {noformat}
> Failed to read artifact descriptor for 
> org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact 
> org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots 
> (https://maven.java.net/content/repositories/snapshots): Failed to transfer 
> file: 
> https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom.
>  Return code is: 402 , ReasonPhrase:Payment Required.
> {noformat}
> cc [~enis] [~stack] if you guys want to take a look at HBase side



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19579) remove HBase transitive dependency that drags in some snapshot

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19579:

Attachment: HIVE-19579.patch

> remove HBase transitive dependency that drags in some snapshot
> --
>
> Key: HIVE-19579
> URL: https://issues.apache.org/jira/browse/HIVE-19579
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19579.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18748) Rename table impacts the ACID behaviour as table names are not updated in meta-tables.

2018-05-16 Thread Sankar Hariappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18748?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478430#comment-16478430
 ] 

Sankar Hariappan commented on HIVE-18748:
-

+1

05.patch looks good to me.

> Rename table impacts the ACID behaviour as table names are not updated in 
> meta-tables.
> --
>
> Key: HIVE-18748
> URL: https://issues.apache.org/jira/browse/HIVE-18748
> Project: Hive
>  Issue Type: Sub-task
>  Components: HiveServer2, Transactions
>Affects Versions: 3.0.0
>Reporter: Sankar Hariappan
>Assignee: Eugene Koifman
>Priority: Critical
>  Labels: ACID, DDL
> Attachments: HIVE-18748.02.patch, HIVE-18748.03.patch, 
> HIVE-18748.04.patch, HIVE-18748.05.patch
>
>
> The ACID implementation uses metatables such as TXN_COMPONENTS, 
> COMPLETED_TXN_COMPONENTS, COMPACTION_QUEUE, COMPLETED_COMPACTION_QUEUE, etc., 
> to manage ACID operations.
> The per-table write ID implementation (HIVE-18192) introduces a couple of 
> metatables, such as NEXT_WRITE_ID and TXN_TO_WRITE_ID, to manage write IDs 
> allocated per table.
> Now, when we rename a table, it is necessary to update the corresponding 
> table names in these metatables as well. Otherwise, ACID table operations 
> won't work properly.
> Since this change is significant and has other side effects, we propose to 
> disable renaming of ACID tables until a fix is figured out.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19578) HLL merges tempList on every add

2018-05-16 Thread Prasanth Jayachandran (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19578?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478398#comment-16478398
 ] 

Prasanth Jayachandran commented on HIVE-19578:
--

The HLL p value is also changing to 10 in HIVE-18079. With that, the encoding 
switch threshold is only 153, but the temp list's default size is 1024, so 
there is no point in maintaining the temp list. I will probably remove it, 
since it will not help and removing it cuts out additional branches.

> HLL merges tempList on every add
> 
>
> Key: HIVE-19578
> URL: https://issues.apache.org/jira/browse/HIVE-19578
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Prasanth Jayachandran
>Priority: Major
> Attachments: Screen Shot 2018-05-16 at 15.29.12 .png
>
>
>  See comments on HIVE-18866; this has significant perf overhead after the 
> even bigger overhead from hashing is removed.  !Screen Shot 2018-05-16 at 
> 15.29.12 .png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19562) Flaky test: TestMiniSparkOnYarn FileNotFoundException in spark-submit

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478336#comment-16478336
 ] 

Hive QA commented on HIVE-19562:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923824/HIVE-19562.3.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11017/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11017/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11017/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:04:22.204
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11017/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:04:22.208
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:04:23.334
+ rm -rf ../yetus_PreCommit-HIVE-Build-11017
+ mkdir ../yetus_PreCommit-HIVE-Build-11017
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11017
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11017/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
error: a/data/conf/spark/local/hive-site.xml: does not exist in index
error: a/data/conf/spark/standalone/hive-site.xml: does not exist in index
error: a/data/conf/spark/yarn-cluster/hive-site.xml: does not exist in index
Going to apply patch with: git apply -p1
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc5358732397349962615.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc5358732397349962615.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 

[jira] [Commented] (HIVE-18652) Print Spark metrics on console

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18652?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478332#comment-16478332
 ] 

Hive QA commented on HIVE-18652:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923823/HIVE-18652.5.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11016/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11016/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11016/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:01:00.052
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11016/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:01:00.055
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 01:01:01.212
+ rm -rf ../yetus_PreCommit-HIVE-Build-11016
+ mkdir ../yetus_PreCommit-HIVE-Build-11016
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11016
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11016/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkTask.java: does 
not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/Statistic/SparkStatisticsNames.java:
 does not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/SparkMetricsUtils.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/MetricsCollection.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java:
 does not exist in index
error: 
a/spark-client/src/test/java/org/apache/hive/spark/client/TestMetricsCollection.java:
 does not exist in index
Going to apply patch with: git apply -p1
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc783054537821118541.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc783054537821118541.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 

[jira] [Commented] (HIVE-18117) Create TestCliDriver for HDFS EC

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18117?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478319#comment-16478319
 ] 

Hive QA commented on HIVE-18117:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923822/HIVE-18117.7.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11015/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11015/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11015/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:50:57.327
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11015/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:50:57.330
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:50:58.151
+ rm -rf ../yetus_PreCommit-HIVE-Build-11015
+ mkdir ../yetus_PreCommit-HIVE-Build-11015
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11015
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11015/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
/data/hiveptest/working/scratch/build.patch:826: new blank line at EOF.
+
/data/hiveptest/working/scratch/build.patch:882: new blank line at EOF.
+
warning: 2 lines add whitespace errors.
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc5844485372708083998.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc5844485372708083998.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g

[jira] [Updated] (HIVE-19562) Flaky test: TestMiniSparkOnYarn FileNotFoundException in spark-submit

2018-05-16 Thread Sahil Takiar (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19562?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sahil Takiar updated HIVE-19562:

Attachment: HIVE-19562.3.patch

> Flaky test: TestMiniSparkOnYarn FileNotFoundException in spark-submit
> -
>
> Key: HIVE-19562
> URL: https://issues.apache.org/jira/browse/HIVE-19562
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-19562.1.patch, HIVE-19562.3.patch
>
>
> Seeing sporadic failures during test setup. Specifically, when spark-submit 
> runs, this error (or a similar error) gets thrown:
> {code}
> 2018-05-15T10:55:02,112  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient: Exception in thread "main" 
> java.io.FileNotFoundException: File 
> file:/tmp/spark-56e217f7-b8a5-4c63-9a6b-d737a64f2820/__spark_libs__7371510645900072447.zip
>  does not exist
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:867)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:442)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:365)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:316)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.yarn.Client.copyFileToRemote(Client.scala:356)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.yarn.Client.org$apache$spark$deploy$yarn$Client$$distribute$1(Client.scala:478)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:565)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:863)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.yarn.Client.run(Client.scala:1146)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1518)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
> 2018-05-15T10:55:02,113  INFO 
> [RemoteDriver-stderr-redir-27d3dcfb-2a10-4118-9fae-c200d2e095a5 main] 
> client.SparkSubmitSparkClient:  at 
> 

[jira] [Updated] (HIVE-18652) Print Spark metrics on console

2018-05-16 Thread Sahil Takiar (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18652?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sahil Takiar updated HIVE-18652:

Attachment: HIVE-18652.5.patch

> Print Spark metrics on console
> --
>
> Key: HIVE-18652
> URL: https://issues.apache.org/jira/browse/HIVE-18652
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-18652.1.patch, HIVE-18652.2.patch, 
> HIVE-18652.3.patch, HIVE-18652.4.patch, HIVE-18652.5.patch
>
>
> For Hive-on-MR, each MR job launched prints out some stats about the job:
> {code}
> INFO  : 2018-02-07 17:51:11,218 Stage-1 map = 0%,  reduce = 0%
> INFO  : 2018-02-07 17:51:18,396 Stage-1 map = 100%,  reduce = 0%, Cumulative 
> CPU 1.87 sec
> INFO  : 2018-02-07 17:51:25,742 Stage-1 map = 100%,  reduce = 100%, 
> Cumulative CPU 4.34 sec
> INFO  : MapReduce Total cumulative CPU time: 4 seconds 340 msec
> INFO  : Ended Job = job_1517865654989_0004
> INFO  : MapReduce Jobs Launched:
> INFO  : Stage-Stage-1: Map: 1  Reduce: 1   Cumulative CPU: 4.34 sec   HDFS 
> Read: 7353 HDFS Write: 151 SUCCESS
> INFO  : Total MapReduce CPU Time Spent: 4 seconds 340 msec
> {code}
> We should do the same for Hive-on-Spark.
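
As a rough illustration of what "the same for Hive-on-Spark" could look like, the 
sketch below logs per-stage totals from a Spark stage-completion listener. It is only 
a sketch under the assumption that stage-level TaskMetrics are sufficient; it is not 
the HIVE-18652 patch, and the output format here is made up.

{code}
import org.apache.spark.executor.TaskMetrics;
import org.apache.spark.scheduler.SparkListener;
import org.apache.spark.scheduler.SparkListenerStageCompleted;
import org.apache.spark.scheduler.StageInfo;

// Hypothetical sketch: print per-stage totals when a stage completes, roughly
// mirroring the per-stage lines that Hive-on-MR prints on the console.
public class ConsoleStageMetricsListener extends SparkListener {
  @Override
  public void onStageCompleted(SparkListenerStageCompleted stageCompleted) {
    StageInfo info = stageCompleted.stageInfo();
    TaskMetrics metrics = info.taskMetrics();
    if (metrics == null) {
      return; // metrics may not be populated for skipped stages
    }
    double cpuSec = metrics.executorCpuTime() / 1e9; // executorCpuTime is in nanoseconds
    System.out.println(String.format(
        "Stage-%d (%s): Tasks: %d  Cumulative CPU: %.2f sec  Run time: %d ms",
        info.stageId(), info.name(), info.numTasks(),
        cpuSec, metrics.executorRunTime()));
  }
}
{code}

Registration would be along the lines of 
sparkContext.addSparkListener(new ConsoleStageMetricsListener()), assuming access to 
the underlying SparkContext.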



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19416) Create single version transactional table metastore statistics for aggregation queries

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-19416:
--
Component/s: (was: Hive)
 Transactions

> Create single version transactional table metastore statistics for 
> aggregation queries
> --
>
> Key: HIVE-19416
> URL: https://issues.apache.org/jira/browse/HIVE-19416
> Project: Hive
>  Issue Type: Bug
>  Components: Transactions
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
>
> The system should be able to answer aggregation queries such as count on 
> transactional tables using only statistics.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19532) Modify Hive Driver/Executor to support transactional-stats-using COUNT aggregation queries

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19532?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-19532:
--
Component/s: (was: Hive)
 Transactions

> Modify Hive Driver/Executor to support transactional-stats-using COUNT 
> aggregation queries 
> ---
>
> Key: HIVE-19532
> URL: https://issues.apache.org/jira/browse/HIVE-19532
> Project: Hive
>  Issue Type: Sub-task
>  Components: Transactions
>Affects Versions: 3.0.0
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
> Fix For: 3.1.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19417) Modify metastore to have/access persistent tables for stats

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19417?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-19417:
--
Component/s: (was: Hive)
 Transactions

> Modify metastore to have/access persistent tables for stats
> ---
>
> Key: HIVE-19417
> URL: https://issues.apache.org/jira/browse/HIVE-19417
> Project: Hive
>  Issue Type: Sub-task
>  Components: Transactions
>Affects Versions: 3.0.0
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
> Attachments: HIVE-19417.01.patch, HIVE-19417.02.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19470) Modify metastore to have application logic to retrieve/update transactional table stats

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19470?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-19470:
--
Component/s: (was: Hive)
 Transactions

> Modify metastore to have application logic to retrieve/update transactional 
> table stats 
> 
>
> Key: HIVE-19470
> URL: https://issues.apache.org/jira/browse/HIVE-19470
> Project: Hive
>  Issue Type: Sub-task
>  Components: Transactions
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
> Fix For: 3.1.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19533) Modify Hive to support transactional-stats-using aggregation queries with all other than COUNT

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19533?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-19533:
--
Component/s: (was: Hive)
 Transactions

> Modify Hive to support transactional-stats-using aggregation queries with all 
> other than COUNT
> --
>
> Key: HIVE-19533
> URL: https://issues.apache.org/jira/browse/HIVE-19533
> Project: Hive
>  Issue Type: Sub-task
>  Components: Transactions
>Affects Versions: 3.0.0
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
> Fix For: 3.1.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-18117) Create TestCliDriver for HDFS EC

2018-05-16 Thread Andrew Sherman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18117?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Sherman updated HIVE-18117:
--
Attachment: HIVE-18117.7.patch

> Create TestCliDriver for HDFS EC
> 
>
> Key: HIVE-18117
> URL: https://issues.apache.org/jira/browse/HIVE-18117
> Project: Hive
>  Issue Type: Sub-task
>Reporter: Sahil Takiar
>Assignee: Andrew Sherman
>Priority: Major
> Attachments: HIVE-18117.1.patch, HIVE-18117.2.patch, 
> HIVE-18117.3.patch, HIVE-18117.4.patch, HIVE-18117.5.patch, 
> HIVE-18117.6.patch, HIVE-18117.7.patch
>
>
> Should be able to do something similar to what we do for HDFS encryption.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19416) Create single version transactional table metastore statistics for aggregation queries

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-19416:
--
Labels: Transa  (was: )

> Create single version transactional table metastore statistics for 
> aggregation queries
> --
>
> Key: HIVE-19416
> URL: https://issues.apache.org/jira/browse/HIVE-19416
> Project: Hive
>  Issue Type: Bug
>  Components: Hive
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
>
> The system should be able to answer aggregation queries such as count on 
> transactional tables using only statistics.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19416) Create single version transactional table metastore statistics for aggregation queries

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman updated HIVE-19416:
--
Labels:   (was: Transa)

> Create single version transactional table metastore statistics for 
> aggregation queries
> --
>
> Key: HIVE-19416
> URL: https://issues.apache.org/jira/browse/HIVE-19416
> Project: Hive
>  Issue Type: Bug
>  Components: Hive
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
>
> The system should be able to answer aggregation queries such as count on 
> transactional tables using only statistics.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (HIVE-15967) Add test for Add Partition with data to Acid table

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-15967?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman resolved HIVE-15967.
---
Resolution: Fixed

Support for Add Partition was added in HIVE-18814 (with tests).

> Add test for Add Partition with data to Acid table
> --
>
> Key: HIVE-15967
> URL: https://issues.apache.org/jira/browse/HIVE-15967
> Project: Hive
>  Issue Type: New Feature
>  Components: Transactions
>Reporter: Eugene Koifman
>Assignee: Eugene Koifman
>Priority: Major
>
> This should in principle work as long as the partition is properly bucketed 
> and uses ORC; non-acid to acid conversion (in compaction) should just handle it.
> ORC schema evolution should handle any missing columns (and ignore extra 
> ones) with respect to the table schema.
> I doubt there are any checks in place to verify compatibility.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (HIVE-15967) Add test for Add Partition with data to Acid table

2018-05-16 Thread Eugene Koifman (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-15967?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Koifman reassigned HIVE-15967:
-

Assignee: Eugene Koifman  (was: Steve Yeom)

> Add test for Add Partition with data to Acid table
> --
>
> Key: HIVE-15967
> URL: https://issues.apache.org/jira/browse/HIVE-15967
> Project: Hive
>  Issue Type: New Feature
>  Components: Transactions
>Reporter: Eugene Koifman
>Assignee: Eugene Koifman
>Priority: Major
>
> This should in principle work as long as the partition is properly bucketed 
> and uses ORC; non-acid to acid conversion (in compaction) should just handle it.
> ORC schema evolution should handle any missing columns (and ignore extra 
> ones) with respect to the table schema.
> I doubt there are any checks in place to verify compatibility.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478305#comment-16478305
 ] 

Hive QA commented on HIVE-19308:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923814/HIVE-19308.4.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11014/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11014/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11014/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Tests exited with: Exception: Patch URL 
https://issues.apache.org/jira/secure/attachment/12923814/HIVE-19308.4.patch 
was found in seen patch url's cache and a test was probably run already on it. 
Aborting...
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923814 - PreCommit-HIVE-Build

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch, HIVE-19308.4.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.
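
For context only, here is a minimal sketch of consuming Arrow record batches from an 
InputStream with Arrow's Java ArrowStreamReader. The LLAP/socket wiring and the actual 
reader subclass from this patch are not shown, so treat the stream source as an 
assumption; this is not the HIVE-19308 code.

{code}
import java.io.InputStream;
import org.apache.arrow.memory.BufferAllocator;
import org.apache.arrow.memory.RootAllocator;
import org.apache.arrow.vector.VectorSchemaRoot;
import org.apache.arrow.vector.ipc.ArrowStreamReader;

// Hypothetical sketch: read Arrow batches from a stream (e.g. a socket input stream).
public class ArrowBatchConsumer {
  public static void consume(InputStream in) throws Exception {
    try (BufferAllocator allocator = new RootAllocator(Long.MAX_VALUE);
         ArrowStreamReader reader = new ArrowStreamReader(in, allocator)) {
      VectorSchemaRoot root = reader.getVectorSchemaRoot(); // schema is read from the stream
      while (reader.loadNextBatch()) {                      // returns false when the stream ends
        System.out.println("batch rows: " + root.getRowCount());
        // column vectors for the current batch are available via root.getFieldVectors()
      }
    }
  }
}
{code}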



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19531) TransactionalValidationListener is getting catalog name from conf instead of table object.

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19531?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478304#comment-16478304
 ] 

Hive QA commented on HIVE-19531:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923806/HIVE-19531.1again.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11013/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11013/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11013/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:35:39.006
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11013/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:35:39.008
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:35:40.115
+ rm -rf ../yetus_PreCommit-HIVE-Build-11013
+ mkdir ../yetus_PreCommit-HIVE-Build-11013
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11013
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11013/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc7391454093226222777.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc7391454093226222777.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java
 

[jira] [Commented] (HIVE-19576) IHMSHandler.getTable not always fetching the right catalog

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19576?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478300#comment-16478300
 ] 

Hive QA commented on HIVE-19576:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923795/HIVE-19576.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11012/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11012/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11012/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:32:22.735
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11012/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:32:22.738
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:32:23.868
+ rm -rf ../yetus_PreCommit-HIVE-Build-11012
+ mkdir ../yetus_PreCommit-HIVE-Build-11012
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11012
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11012/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc427431022172514758.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc427431022172514758.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java
 does not 

[jira] [Commented] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478295#comment-16478295
 ] 

Hive QA commented on HIVE-19308:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923814/HIVE-19308.4.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11010/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11010/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11010/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:28:23.381
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11010/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:28:23.384
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:28:24.205
+ rm -rf ../yetus_PreCommit-HIVE-Build-11010
+ mkdir ../yetus_PreCommit-HIVE-Build-11010
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11010
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11010/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
/data/hiveptest/working/scratch/build.patch:687: new blank line at EOF.
+
warning: 1 line adds whitespace errors.
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc3557855036676912270.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc3557855036676912270.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file 

[jira] [Commented] (HIVE-18079) Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator bit-size

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478297#comment-16478297
 ] 

Hive QA commented on HIVE-18079:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923792/HIVE-18079-branch-3.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11011/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11011/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11011/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Tests exited with: Exception: Patch URL 
https://issues.apache.org/jira/secure/attachment/12923792/HIVE-18079-branch-3.patch
 was found in seen patch url's cache and a test was probably run already on it. 
Aborting...
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923792 - PreCommit-HIVE-Build

> Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator 
> bit-size
> 
>
> Key: HIVE-18079
> URL: https://issues.apache.org/jira/browse/HIVE-18079
> Project: Hive
>  Issue Type: Improvement
>  Components: Standalone Metastore, Statistics
>Affects Versions: 3.0.0
>Reporter: Gopal V
>Assignee: Gopal V
>Priority: Major
> Attachments: HIVE-18079-branch-3.patch, HIVE-18079.1.patch, 
> HIVE-18079.10.patch, HIVE-18079.11.patch, HIVE-18079.12.patch, 
> HIVE-18079.13.patch, HIVE-18079.14.patch, HIVE-18079.15.patch, 
> HIVE-18079.2.patch, HIVE-18079.4.patch, HIVE-18079.5.patch, 
> HIVE-18079.6.patch, HIVE-18079.7.patch, HIVE-18079.8.patch, HIVE-18079.9.patch
>
>
> HyperLogLog can merge a 14-bit HLL into a 10-bit HLL bitset because of its 
> mathematical hash distribution and construction.
> Allow the squashing of a 14-bit HLL -> 10-bit HLL without needing a second 
> scan over the dataset.
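
For illustration, here is a standalone sketch of the register-folding idea behind such 
a squash, assuming the register index is taken from the high-order p bits of the hash 
and that register values are the usual 1-based leading-zero counts. This is the 
generic HLL math, not the HIVE-18079 patch.

{code}
// Fold a dense 2^14-register HLL down to 2^10 registers without rescanning the data.
// The 4 index bits that get dropped become the leading bits of the new "value" part,
// so the new register value is derived from them whenever they are non-zero.
public class HllDowngrade {
  public static byte[] fold14to10(byte[] registers14) {
    final int pOld = 14, pNew = 10, diff = pOld - pNew;      // diff = 4 dropped index bits
    byte[] registers10 = new byte[1 << pNew];
    for (int oldIdx = 0; oldIdx < registers14.length; oldIdx++) {
      byte oldVal = registers14[oldIdx];
      if (oldVal == 0) {
        continue;                                            // register was never touched
      }
      int newIdx = oldIdx >>> diff;                          // keep the top 10 index bits
      int dropped = oldIdx & ((1 << diff) - 1);              // the 4 bits leaving the index
      int newVal;
      if (dropped != 0) {
        // position of the leftmost 1-bit inside the 4 dropped bits (1..4)
        newVal = diff - (31 - Integer.numberOfLeadingZeros(dropped));
      } else {
        newVal = diff + oldVal;                              // 4 leading zeros, then the old run
      }
      registers10[newIdx] = (byte) Math.max(registers10[newIdx], newVal);
    }
    return registers10;
  }
}
{code}

Each group of 2^4 = 16 old registers collapses into one new register, and the four 
dropped index bits are re-interpreted as the leading bits of the value, which is why 
no second pass over the data is needed.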



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19516) TestNegative merge_negative_5 and mm_concatenate are causing timeouts

2018-05-16 Thread Eugene Koifman (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478293#comment-16478293
 ] 

Eugene Koifman commented on HIVE-19516:
---

Alter table compact just enqueues a compaction request.

The contract is that, for compaction to work, a standalone HMS must be running to 
handle this request.

I don't know how to detect that nothing is reading this request queue.

In UTs, I manually run Worker.run() to process the queue entry. From a .q file, 
we could build some UDF that does the same and call it via "select WorkerUDF 
from dual" or something.

> TestNegative merge_negative_5 and mm_concatenate are causing timeouts
> -
>
> Key: HIVE-19516
> URL: https://issues.apache.org/jira/browse/HIVE-19516
> Project: Hive
>  Issue Type: Bug
>  Components: Testing Infrastructure
>Reporter: Vineet Garg
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19516.01.patch, HIVE-19516.patch
>
>
> I haven't tried to reproduce this in isolation, but it is reproducible if you 
> run it in a batch on a local system:
> {noformat}
> mvn -B test  -Dtest.groups= -Dtest=TestNegativeCliDriver 
> 

[jira] [Commented] (HIVE-18652) Print Spark metrics on console

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18652?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478289#comment-16478289
 ] 

Hive QA commented on HIVE-18652:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923564/HIVE-18652.4.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11008/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11008/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11008/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:25:01.538
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11008/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:25:01.540
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:25:02.370
+ rm -rf ../yetus_PreCommit-HIVE-Build-11008
+ mkdir ../yetus_PreCommit-HIVE-Build-11008
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11008
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11008/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkTask.java: does 
not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/Statistic/SparkStatisticsNames.java:
 does not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/SparkMetricsUtils.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/MetricsCollection.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java:
 does not exist in index
error: 
a/spark-client/src/test/java/org/apache/hive/spark/client/TestMetricsCollection.java:
 does not exist in index
Going to apply patch with: git apply -p1
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc4140948192405521577.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc4140948192405521577.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 

[jira] [Commented] (HIVE-19562) Flaky test: TestMiniSparkOnYarn FileNotFoundException in spark-submit

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478285#comment-16478285
 ] 

Hive QA commented on HIVE-19562:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923562/HIVE-19562.1.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11007/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11007/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11007/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:21:44.450
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11007/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:21:44.453
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:21:44.969
+ rm -rf ../yetus_PreCommit-HIVE-Build-11007
+ mkdir ../yetus_PreCommit-HIVE-Build-11007
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11007
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11007/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
error: a/data/conf/spark/local/hive-site.xml: does not exist in index
error: a/data/conf/spark/standalone/hive-site.xml: does not exist in index
error: a/data/conf/spark/yarn-cluster/hive-site.xml: does not exist in index
Going to apply patch with: git apply -p1
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc5032219194133056838.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc5032219194133056838.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 

[jira] [Commented] (HIVE-18875) Enable SMB Join by default in Tez

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18875?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478282#comment-16478282
 ] 

Hive QA commented on HIVE-18875:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923548/HIVE-18875.3.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11006/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11006/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11006/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Tests exited with: Exception: Patch URL 
https://issues.apache.org/jira/secure/attachment/12923548/HIVE-18875.3.patch 
was found in seen patch url's cache and a test was probably run already on it. 
Aborting...
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923548 - PreCommit-HIVE-Build

> Enable SMB Join by default in Tez
> -
>
> Key: HIVE-18875
> URL: https://issues.apache.org/jira/browse/HIVE-18875
> Project: Hive
>  Issue Type: Task
>Reporter: Deepak Jaiswal
>Assignee: Deepak Jaiswal
>Priority: Major
> Attachments: HIVE-18875.1.patch, HIVE-18875.2.patch, 
> HIVE-18875.3.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-12342) Set default value of hive.optimize.index.filter to true

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-12342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478280#comment-16478280
 ] 

Hive QA commented on HIVE-12342:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923761/HIVE-12342.15.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11005/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11005/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11005/

Messages:
{noformat}
 This message was trimmed, see log for full details 
error: test/results/clientpositive/spark/bucketmapjoin12.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin13.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin2.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin3.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin4.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin5.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin7.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin7.q.out_spark: does not 
exist in index
error: test/results/clientpositive/spark/bucketmapjoin8.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin9.q.out: does not exist 
in index
error: test/results/clientpositive/spark/bucketmapjoin_negative.q.out: does not 
exist in index
error: test/results/clientpositive/spark/bucketmapjoin_negative2.q.out: does 
not exist in index
error: test/results/clientpositive/spark/bucketmapjoin_negative3.q.out: does 
not exist in index
error: test/results/clientpositive/spark/bucketsortoptimize_insert_2.q.out: 
does not exist in index
error: test/results/clientpositive/spark/bucketsortoptimize_insert_4.q.out: 
does not exist in index
error: test/results/clientpositive/spark/bucketsortoptimize_insert_6.q.out: 
does not exist in index
error: test/results/clientpositive/spark/bucketsortoptimize_insert_7.q.out: 
does not exist in index
error: test/results/clientpositive/spark/bucketsortoptimize_insert_8.q.out: 
does not exist in index
error: test/results/clientpositive/spark/cbo_simple_select.q.out: does not 
exist in index
error: test/results/clientpositive/spark/column_access_stats.q.out: does not 
exist in index
error: test/results/clientpositive/spark/constprog_partitioner.q.out: does not 
exist in index
error: test/results/clientpositive/spark/constprog_semijoin.q.out: does not 
exist in index
error: test/results/clientpositive/spark/cross_join.q.out: does not exist in 
index
error: test/results/clientpositive/spark/cross_product_check_1.q.out: does not 
exist in index
error: test/results/clientpositive/spark/cross_product_check_2.q.out: does not 
exist in index
error: test/results/clientpositive/spark/dynamic_rdd_cache.q.out: does not 
exist in index
error: test/results/clientpositive/spark/filter_join_breaktask.q.out: does not 
exist in index
error: test/results/clientpositive/spark/groupby_map_ppr.q.out: does not exist 
in index
error: test/results/clientpositive/spark/groupby_map_ppr_multi_distinct.q.out: 
does not exist in index
error: test/results/clientpositive/spark/groupby_multi_single_reducer2.q.out: 
does not exist in index
error: test/results/clientpositive/spark/groupby_multi_single_reducer3.q.out: 
does not exist in index
error: test/results/clientpositive/spark/groupby_position.q.out: does not exist 
in index
error: test/results/clientpositive/spark/groupby_ppr.q.out: does not exist in 
index
error: test/results/clientpositive/spark/groupby_ppr_multi_distinct.q.out: does 
not exist in index
error: test/results/clientpositive/spark/groupby_resolution.q.out: does not 
exist in index
error: test/results/clientpositive/spark/groupby_sort_1_23.q.out: does not 
exist in index
error: test/results/clientpositive/spark/groupby_sort_skew_1_23.q.out: does not 
exist in index
error: test/results/clientpositive/spark/having.q.out: does not exist in index
error: test/results/clientpositive/spark/identity_project_remove_skip.q.out: 
does not exist in index
error: test/results/clientpositive/spark/infer_bucket_sort_map_operators.q.out: 
does not exist in index
error: test/results/clientpositive/spark/infer_bucket_sort_num_buckets.q.out: 
does not exist in index
error: test/results/clientpositive/spark/innerjoin.q.out: does not exist in 
index
error: test/results/clientpositive/spark/insert1.q.out: does not exist in index
error: test/results/clientpositive/spark/insert_into2.q.out: does not exist in 
index
error: test/results/clientpositive/spark/join0.q.out: does not exist in index
error: test/results/clientpositive/spark/join1.q.out: does not exist in 

[jira] [Commented] (HIVE-19421) Upgrade version of Jetty to 9.3.20.v20170531

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478278#comment-16478278
 ] 

Hive QA commented on HIVE-19421:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923760/HIVE-19421.4.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11004/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11004/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11004/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:16:23.761
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11004/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:16:23.764
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:16:24.967
+ rm -rf ../yetus_PreCommit-HIVE-Build-11004
+ mkdir ../yetus_PreCommit-HIVE-Build-11004
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11004
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11004/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc2136069879211912567.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc2136069879211912567.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java
 does 

[jira] [Commented] (HIVE-19463) TezTask - getting groups may fail (PartialGroupNameException in some tests)

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19463?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478271#comment-16478271
 ] 

Hive QA commented on HIVE-19463:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923758/HIVE-19463.03.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11003/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11003/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11003/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:12:57.612
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11003/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:12:57.615
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:12:58.427
+ rm -rf ../yetus_PreCommit-HIVE-Build-11003
+ mkdir ../yetus_PreCommit-HIVE-Build-11003
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11003
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11003/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc5167326056217493302.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc5167326056217493302.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java
 does 

[jira] [Commented] (HIVE-19418) add background stats updater similar to compactor

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19418?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478268#comment-16478268
 ] 

Hive QA commented on HIVE-19418:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923755/HIVE-19418.01.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11002/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11002/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11002/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:09:36.953
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11002/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:09:36.956
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git clean -f -d
Removing ${project.basedir}/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:09:38.075
+ rm -rf ../yetus_PreCommit-HIVE-Build-11002
+ mkdir ../yetus_PreCommit-HIVE-Build-11002
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11002
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11002/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
/data/hiveptest/working/scratch/build.patch:471: trailing whitespace.
 
/data/hiveptest/working/scratch/build.patch:1481: trailing whitespace.
 
warning: 2 lines add whitespace errors.
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc8726333768938589461.exe, --version]
protoc-jar: executing: [/tmp/protoc8726333768938589461.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
libprotoc 2.5.0
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 

[jira] [Commented] (HIVE-19516) TestNegative merge_negative_5 and mm_concatenate are causing timeouts

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478263#comment-16478263
 ] 

Hive QA commented on HIVE-19516:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923756/HIVE-19516.01.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11001/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11001/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11001/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:06:14.404
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11001/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:06:14.407
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   cb6dee1..b329afa  master -> origin/master
+ git reset --hard HEAD
HEAD is now at cb6dee1 HIVE-19317 : Handle schema evolution from int like types 
to decimal (Janaki Lahorani, reviewed by Vihang Karajgaonkar)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 1 commit, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at b329afa HIVE-19572: Add option to mask stats and data size in q 
files (Jesus Camacho Rodriguez, reviewed by Prasanth Jayachandran) (addendum)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-17 00:06:15.775
+ rm -rf ../yetus_PreCommit-HIVE-Build-11001
+ mkdir ../yetus_PreCommit-HIVE-Build-11001
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11001
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11001/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc741313763993382035.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc741313763993382035.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output 

[jira] [Commented] (HIVE-19258) add originals support to MM tables (and make the conversion a metadata only operation)

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19258?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478259#comment-16478259
 ] 

Hive QA commented on HIVE-19258:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923757/HIVE-19258.08.patch

{color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 4 failed/errored test(s), 14411 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[orc_merge7] 
(batchId=175)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[orc_merge7]
 (batchId=185)
org.apache.hadoop.hive.ql.exec.tez.TestWorkloadManager.testApplyPlanQpChanges 
(batchId=293)
org.apache.hive.service.server.TestInformationSchemaWithPrivilege.test 
(batchId=238)
{noformat}

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11000/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11000/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11000/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 4 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923757 - PreCommit-HIVE-Build

> add originals support to MM tables (and make the conversion a metadata only 
> operation)
> --
>
> Key: HIVE-19258
> URL: https://issues.apache.org/jira/browse/HIVE-19258
> Project: Hive
>  Issue Type: Bug
>  Components: Transactions
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Major
> Attachments: HIVE-19258.01.patch, HIVE-19258.02.patch, 
> HIVE-19258.03.patch, HIVE-19258.04.patch, HIVE-19258.05.patch, 
> HIVE-19258.06.patch, HIVE-19258.07.patch, HIVE-19258.08.patch, 
> HIVE-19258.08.patch, HIVE-19258.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Attachment: HIVE-19308.4.patch

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch, HIVE-19308.4.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.
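For readers unfamiliar with the Arrow stream format, the general pattern being described (a reader that wraps a socket InputStream and hands back Arrow record batches) can be sketched as below. This is a hedged illustration only, not the reader attached to this issue; the class name is made up, and the Arrow package locations (org.apache.arrow.vector.ipc) reflect recent Arrow Java releases and may differ in the version Hive bundles.

{noformat}
import java.io.InputStream;

import org.apache.arrow.memory.BufferAllocator;
import org.apache.arrow.memory.RootAllocator;
import org.apache.arrow.vector.VectorSchemaRoot;
import org.apache.arrow.vector.ipc.ArrowStreamReader;

// Illustrative only: iterate Arrow record batches arriving on a socket InputStream.
public class ArrowSocketBatchReader implements AutoCloseable {
  private final BufferAllocator allocator = new RootAllocator(Long.MAX_VALUE);
  private final ArrowStreamReader reader;

  public ArrowSocketBatchReader(InputStream socketStream) {
    // ArrowStreamReader takes care of the Arrow IPC stream framing.
    this.reader = new ArrowStreamReader(socketStream, allocator);
  }

  /** Returns the root holding the next batch, or null once the stream is exhausted. */
  public VectorSchemaRoot nextBatch() throws Exception {
    return reader.loadNextBatch() ? reader.getVectorSchemaRoot() : null;
  }

  @Override
  public void close() throws Exception {
    reader.close();
    allocator.close();
  }
}
{noformat}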



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Status: Patch Available  (was: Open)

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch, HIVE-19308.4.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Status: Open  (was: Patch Available)

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch, HIVE-19308.4.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19258) add originals support to MM tables (and make the conversion a metadata only operation)

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19258?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478239#comment-16478239
 ] 

Hive QA commented on HIVE-19258:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
43s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  6m 
42s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
52s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  1m 
14s{color} | {color:green} master passed {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m 
33s{color} | {color:blue} common in master has 62 extant Findbugs warnings. 
{color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m 
34s{color} | {color:blue} itests/hive-unit in master has 2 extant Findbugs 
warnings. {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  3m 
52s{color} | {color:blue} ql in master has 2320 extant Findbugs warnings. 
{color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
28s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m  
8s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  2m 
10s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
54s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  1m 
54s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
17s{color} | {color:red} itests/hive-unit: The patch generated 5 new + 165 
unchanged - 0 fixed = 170 total (was 165) {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
47s{color} | {color:red} ql: The patch generated 15 new + 888 unchanged - 9 
fixed = 903 total (was 897) {color} |
| {color:red}-1{color} | {color:red} whitespace {color} | {color:red}  0m  
0s{color} | {color:red} The patch has 1 line(s) that end in whitespace. Use git 
apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply 
{color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  5m 
23s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
27s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
13s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 30m 25s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  findbugs  checkstyle  compile  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-11000/dev-support/hive-personality.sh
 |
| git revision | master / cb6dee1 |
| Default Java | 1.8.0_111 |
| findbugs | v3.0.0 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11000/yetus/diff-checkstyle-itests_hive-unit.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11000/yetus/diff-checkstyle-ql.txt
 |
| whitespace | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11000/yetus/whitespace-eol.txt
 |
| modules | C: common itests/hive-unit ql U: . |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11000/yetus.txt |
| Powered by | Apache Yetus   http://yetus.apache.org |


This message was automatically generated.



> add originals support to MM tables (and make the conversion a metadata only 
> operation)
> --
>
> Key: HIVE-19258
> URL: https://issues.apache.org/jira/browse/HIVE-19258
> Project: Hive
>  Issue Type: Bug
>  Components: Transactions
>Reporter: Sergey 

[jira] [Updated] (HIVE-19578) HLL merges tempList on every add

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19578:

Description:  See comments on HIVE-18866; this has significant perf 
overhead after the even bigger overhead from hashing is removed.  !Screen Shot 
2018-05-16 at 15.29.12 .png!  (was: See comments on HIVE-18866; this has 
significant perf overhead after the even bigger overhead from hashing is 
removed.)

> HLL merges tempList on every add
> 
>
> Key: HIVE-19578
> URL: https://issues.apache.org/jira/browse/HIVE-19578
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Prasanth Jayachandran
>Priority: Major
> Attachments: Screen Shot 2018-05-16 at 15.29.12 .png
>
>
>  See comments on HIVE-18866; this has significant perf overhead after the 
> even bigger overhead from hashing is removed.  !Screen Shot 2018-05-16 at 
> 15.29.12 .png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19578) HLL merges tempList on every add

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin updated HIVE-19578:

Attachment: Screen Shot 2018-05-16 at 15.29.12 .png

> HLL merges tempList on every add
> 
>
> Key: HIVE-19578
> URL: https://issues.apache.org/jira/browse/HIVE-19578
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Prasanth Jayachandran
>Priority: Major
> Attachments: Screen Shot 2018-05-16 at 15.29.12 .png
>
>
> See comments on HIVE-18866; this has significant perf overhead after the even 
> bigger overhead from hashing is removed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (HIVE-19578) HLL merges tempList on every add

2018-05-16 Thread Sergey Shelukhin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergey Shelukhin reassigned HIVE-19578:
---


> HLL merges tempList on every add
> 
>
> Key: HIVE-19578
> URL: https://issues.apache.org/jira/browse/HIVE-19578
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Prasanth Jayachandran
>Priority: Major
>
> See comments on HIVE-18866; this has significant perf overhead after the even 
> bigger overhead from hashing is removed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19531) TransactionalValidationListener is getting catalog name from conf instead of table object.

2018-05-16 Thread Alan Gates (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19531?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478236#comment-16478236
 ] 

Alan Gates commented on HIVE-19531:
---

Reattaching the same patch to get another run, as I don't believe the test 
failures are related to the patch.

> TransactionalValidationListener is getting catalog name from conf instead of 
> table object.
> --
>
> Key: HIVE-19531
> URL: https://issues.apache.org/jira/browse/HIVE-19531
> Project: Hive
>  Issue Type: Bug
>  Components: Metastore
>Affects Versions: 3.0.0
>Reporter: Alan Gates
>Assignee: Alan Gates
>Priority: Blocker
> Fix For: 3.0.1
>
> Attachments: HIVE-19531.1again.patch, HIVE-19531.patch
>
>
> TransactionalValidationListener.validateTableStructure gets the catalog from 
> the conf file rather than taking it from the passed-in table structure.  This 
> causes createTable operations to fail.
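As a hedged sketch of the intended lookup order (prefer the catalog carried on the Table object, and fall back to the configured default only when the table does not carry one): the method names Table.isSetCatName/getCatName and MetaStoreUtils.getDefaultCatalog are my recollection of the Hive 3 metastore API and should be treated as assumptions, not as the actual fix in the attached patch.

{noformat}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.metastore.api.Table;
import org.apache.hadoop.hive.metastore.utils.MetaStoreUtils;

// Illustrative helper only (assumed API names); not the patch attached to this issue.
final class CatalogResolver {
  static String catalogOf(Table table, Configuration conf) {
    // Use the catalog recorded on the table object; consult the configured
    // default catalog only when the table does not carry one.
    return table.isSetCatName()
        ? table.getCatName()
        : MetaStoreUtils.getDefaultCatalog(conf);
  }
}
{noformat}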



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19531) TransactionalValidationListener is getting catalog name from conf instead of table object.

2018-05-16 Thread Alan Gates (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19531?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Gates updated HIVE-19531:
--
Attachment: HIVE-19531.1again.patch

> TransactionalValidationListener is getting catalog name from conf instead of 
> table object.
> --
>
> Key: HIVE-19531
> URL: https://issues.apache.org/jira/browse/HIVE-19531
> Project: Hive
>  Issue Type: Bug
>  Components: Metastore
>Affects Versions: 3.0.0
>Reporter: Alan Gates
>Assignee: Alan Gates
>Priority: Blocker
> Fix For: 3.0.1
>
> Attachments: HIVE-19531.1again.patch, HIVE-19531.patch
>
>
> TransactionalValidationListener.validateTableStructure gets the catalog from 
> the conf file rather than taking it from the passed-in table structure.  This 
> causes createTable operations to fail.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18079) Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator bit-size

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478226#comment-16478226
 ] 

Hive QA commented on HIVE-18079:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923792/HIVE-18079-branch-3.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/10999/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10999/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10999/

Messages:
{noformat}
 This message was trimmed, see log for full details 
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-16 22:57:28.394
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-10999/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z branch-3 ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-16 22:57:28.396
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   29f57fc..cb6dee1  master -> origin/master
   3c44a38..9cd6258  branch-3   -> origin/branch-3
+ git reset --hard HEAD
HEAD is now at 29f57fc HIVE-19424: NPE In MetaDataFormatters (Alice Fan, 
reviewed by Aihua Xu)
+ git clean -f -d
+ git checkout branch-3
Switched to branch 'branch-3'
Your branch is behind 'origin/branch-3' by 7 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/branch-3
HEAD is now at 9cd6258 HIVE-19317 : Handle schema evolution from int like types 
to decimal (Janaki Lahorani, reviewed by Vihang Karajgaonkar)
+ git merge --ff-only origin/branch-3
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-16 22:57:30.504
+ rm -rf ../yetus_PreCommit-HIVE-Build-10999
+ mkdir ../yetus_PreCommit-HIVE-Build-10999
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-10999
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-10999/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
error: a/ql/src/test/queries/clientpositive/bucket_map_join_tez2.q: does not 
exist in index
error: a/ql/src/test/results/clientpositive/autoColumnStats_2.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/autoColumnStats_9.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/bitvector.q.out: does not exist in 
index
error: a/ql/src/test/results/clientpositive/compute_stats_date.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/confirm_initial_tbl_stats.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/cross_join_merge.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/describe_table.q.out: does not 
exist in index
error: 
a/ql/src/test/results/clientpositive/encrypted/encryption_move_tbl.q.out: does 
not exist in index
error: a/ql/src/test/results/clientpositive/hll.q.out: does not exist in index
error: a/ql/src/test/results/clientpositive/llap/acid_no_buckets.q.out: does 
not exist in index
error: a/ql/src/test/results/clientpositive/llap/autoColumnStats_2.q.out: does 
not exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_join1.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_join21.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_join29.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_join30.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_sortmerge_join_6.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/llap/bucket_groupby.q.out: does not 
exist in index
error: a/ql/src/test/results/clientpositive/llap/bucket_map_join_tez1.q.out: 
does not exist in index
error: 

[jira] [Updated] (HIVE-19528) Beeline: When beeline-site.xml is present and the default named url is incorrect, throw an exception instead of relying on resolution via hive-site.xml/beeline-hs2-connec

2018-05-16 Thread Vaibhav Gumashta (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vaibhav Gumashta updated HIVE-19528:

Status: Open  (was: Patch Available)

> Beeline: When beeline-site.xml is present and the default named url is 
> incorrect, throw an exception instead of relying on resolution via 
> hive-site.xml/beeline-hs2-connection.xml 
> ---
>
> Key: HIVE-19528
> URL: https://issues.apache.org/jira/browse/HIVE-19528
> Project: Hive
>  Issue Type: Bug
>  Components: Beeline
>Affects Versions: 3.0.0, 3.1.0
>Reporter: Vaibhav Gumashta
>Assignee: Vaibhav Gumashta
>Priority: Major
> Attachments: HIVE-19528.1.patch, HIVE-19528.2.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19528) Beeline: When beeline-site.xml is present and the default named url is incorrect, throw an exception instead of relying on resolution via hive-site.xml/beeline-hs2-connec

2018-05-16 Thread Vaibhav Gumashta (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vaibhav Gumashta updated HIVE-19528:

Attachment: HIVE-19528.2.patch

> Beeline: When beeline-site.xml is present and the default named url is 
> incorrect, throw an exception instead of relying on resolution via 
> hive-site.xml/beeline-hs2-connection.xml 
> ---
>
> Key: HIVE-19528
> URL: https://issues.apache.org/jira/browse/HIVE-19528
> Project: Hive
>  Issue Type: Bug
>  Components: Beeline
>Affects Versions: 3.0.0, 3.1.0
>Reporter: Vaibhav Gumashta
>Assignee: Vaibhav Gumashta
>Priority: Major
> Attachments: HIVE-19528.1.patch, HIVE-19528.2.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18079) Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator bit-size

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478224#comment-16478224
 ] 

Hive QA commented on HIVE-18079:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923749/HIVE-18079.15.patch

{color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 13 failed/errored test(s), 14358 tests 
executed
*Failed tests:*
{noformat}
TestBeeLineExceptionHandling - did not produce a TEST-*.xml file (likely timed 
out) (batchId=191)
TestBeeLineHistory - did not produce a TEST-*.xml file (likely timed out) 
(batchId=191)
TestBeelineArgParsing - did not produce a TEST-*.xml file (likely timed out) 
(batchId=191)
TestClientCommandHookFactory - did not produce a TEST-*.xml file (likely timed 
out) (batchId=191)
TestHiveCli - did not produce a TEST-*.xml file (likely timed out) (batchId=191)
TestHiveSchemaTool - did not produce a TEST-*.xml file (likely timed out) 
(batchId=191)
TestIncrementalRows - did not produce a TEST-*.xml file (likely timed out) 
(batchId=191)
TestShutdownHook - did not produce a TEST-*.xml file (likely timed out) 
(batchId=191)
TestTableOutputFormat - did not produce a TEST-*.xml file (likely timed out) 
(batchId=191)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[smb_cache] 
(batchId=160)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_vector_dynpart_hashjoin_1]
 (batchId=172)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[bucketizedhiveinputformat]
 (batchId=183)
org.apache.hadoop.hive.cli.TestSparkPerfCliDriver.testCliDriver[query39] 
(batchId=255)
{noformat}

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/10998/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10998/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10998/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 13 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923749 - PreCommit-HIVE-Build

> Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator 
> bit-size
> 
>
> Key: HIVE-18079
> URL: https://issues.apache.org/jira/browse/HIVE-18079
> Project: Hive
>  Issue Type: Improvement
>  Components: Standalone Metastore, Statistics
>Affects Versions: 3.0.0
>Reporter: Gopal V
>Assignee: Gopal V
>Priority: Major
> Attachments: HIVE-18079-branch-3.patch, HIVE-18079.1.patch, 
> HIVE-18079.10.patch, HIVE-18079.11.patch, HIVE-18079.12.patch, 
> HIVE-18079.13.patch, HIVE-18079.14.patch, HIVE-18079.15.patch, 
> HIVE-18079.2.patch, HIVE-18079.4.patch, HIVE-18079.5.patch, 
> HIVE-18079.6.patch, HIVE-18079.7.patch, HIVE-18079.8.patch, HIVE-18079.9.patch
>
>
> HyperLogLog can merge a 14-bit HLL into a 10-bit HLL bitset because of its 
> mathematical hash distribution & construction.
> Allow the squashing of a 14-bit HLL -> 10-bit HLL without needing a second 
> scan over the data-set.
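For readers who want to see what the squash looks like mechanically, here is a hedged sketch of folding a dense register array from precision p = 14 down to p' = 10. It assumes the register index is taken from the top p bits of a 64-bit hash and each register stores rank = (leading zeros of the remaining bits) + 1; it is not Hive's HyperLogLog implementation.

{noformat}
public final class HllSquashSketch {
  /** Fold a dense HLL register array from precision p down to pPrime (pPrime <= p). */
  public static byte[] squash(byte[] registers, int p, int pPrime) {
    if (pPrime > p) {
      throw new IllegalArgumentException("can only reduce precision");
    }
    int drop = p - pPrime;                    // index bits that move into the rank word
    byte[] out = new byte[1 << pPrime];
    for (int idx = 0; idx < registers.length; idx++) {
      byte r = registers[idx];
      if (r == 0) {
        continue;                             // empty register contributes nothing
      }
      int newIdx = idx >>> drop;              // keep the high pPrime index bits
      int extra = idx & ((1 << drop) - 1);    // dropped index bits now lead the rank word
      int rank;
      if (extra != 0) {
        // the first 1-bit of the new rank word falls inside the dropped index bits
        rank = drop - (31 - Integer.numberOfLeadingZeros(extra));
      } else {
        // dropped bits are all zero, so the old rank just shifts down by `drop`
        rank = drop + r;
      }
      if (rank > out[newIdx]) {
        out[newIdx] = (byte) rank;
      }
    }
    return out;
  }
}
{noformat}

Usage under those assumptions would be squash(registers14, 14, 10), which yields the 1024-register bitset without rescanning the data.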



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18866) Semijoin: Implement a Long -> Hash64 vector fast-path

2018-05-16 Thread Sergey Shelukhin (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18866?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478218#comment-16478218
 ] 

Sergey Shelukhin commented on HIVE-18866:
-

I found the same issue through some perf testing and tried this patch; it 
reduces a certain analyze-query DAG runtime from ~1550s to 450s. The next 
hotspot is also in HLL, but it is not related to hashing; I might file a bug 
later.

> Semijoin: Implement a Long -> Hash64 vector fast-path
> -
>
> Key: HIVE-18866
> URL: https://issues.apache.org/jira/browse/HIVE-18866
> Project: Hive
>  Issue Type: Improvement
>  Components: Vectorization
>Reporter: Gopal V
>Priority: Major
>  Labels: performance
> Attachments: 0001-hash64-WIP.patch, perf-hash64-long.png
>
>
> A significant amount of CPU is wasted with JMM restrictions on byte[] arrays.
> To transform one Long into another Long, the value currently goes through a 
> byte[] array, which shows up as a hotspot.
> !perf-hash64-long.png!
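For context on the byte[] detour described above, a common fast path is to mix the long directly in registers, for example with Murmur3's fmix64 finalizer constants, instead of serializing it into a byte[] and hashing the bytes. A hedged sketch of that technique (not the attached 0001-hash64-WIP.patch):

{noformat}
public final class LongHash64 {
  // Murmur3 fmix64-style finalizer: mixes a 64-bit value without any byte[] round-trip.
  public static long hash64(long k) {
    k ^= k >>> 33;
    k *= 0xff51afd7ed558ccdL;
    k ^= k >>> 33;
    k *= 0xc4ceb9fe1a85ec53L;
    k ^= k >>> 33;
    return k;
  }
}
{noformat}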



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18748) Rename table impacts the ACID behaviour as table names are not updated in meta-tables.

2018-05-16 Thread Eugene Koifman (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18748?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478205#comment-16478205
 ] 

Eugene Koifman commented on HIVE-18748:
---

a Green run... (/)

> Rename table impacts the ACID behaviour as table names are not updated in 
> meta-tables.
> --
>
> Key: HIVE-18748
> URL: https://issues.apache.org/jira/browse/HIVE-18748
> Project: Hive
>  Issue Type: Sub-task
>  Components: HiveServer2, Transactions
>Affects Versions: 3.0.0
>Reporter: Sankar Hariappan
>Assignee: Eugene Koifman
>Priority: Critical
>  Labels: ACID, DDL
> Attachments: HIVE-18748.02.patch, HIVE-18748.03.patch, 
> HIVE-18748.04.patch, HIVE-18748.05.patch
>
>
> The ACID implementation uses metatables such as TXN_COMPONENTS, 
> COMPLETED_TXN_COMPONENTS, COMPACTION_QUEUE, COMPLETED_COMPACTION_QUEUE, etc. 
> to manage ACID operations.
> The per-table write ID implementation (HIVE-18192) introduces a couple of 
> metatables, NEXT_WRITE_ID and TXN_TO_WRITE_ID, to manage the write ids 
> allocated per table.
> Now, when we rename any table, it is necessary to update the corresponding 
> table names in these metatables as well. Otherwise, ACID table operations 
> won't work properly.
> Since this change is significant and has other side effects, we propose to 
> disable renaming of ACID tables until a fix is figured out.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18079) Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator bit-size

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478200#comment-16478200
 ] 

Hive QA commented on HIVE-18079:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
33s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  6m 
29s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
45s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
55s{color} | {color:green} master passed {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  3m 
59s{color} | {color:blue} ql in master has 2320 extant Findbugs warnings. 
{color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  3m  
0s{color} | {color:blue} standalone-metastore in master has 215 extant Findbugs 
warnings. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
58s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m  
8s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  2m 
11s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
43s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  1m 
43s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
18s{color} | {color:red} standalone-metastore: The patch generated 12 new + 38 
unchanged - 1 fixed = 50 total (was 39) {color} |
| {color:red}-1{color} | {color:red} whitespace {color} | {color:red}  0m  
0s{color} | {color:red} The patch has 3 line(s) that end in whitespace. Use git 
apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply 
{color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  7m  
8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  2m  
5s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
12s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 34m 22s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  findbugs  checkstyle  compile  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-10998/dev-support/hive-personality.sh
 |
| git revision | master / cb6dee1 |
| Default Java | 1.8.0_111 |
| findbugs | v3.0.0 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10998/yetus/diff-checkstyle-standalone-metastore.txt
 |
| whitespace | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10998/yetus/whitespace-eol.txt
 |
| modules | C: ql standalone-metastore U: . |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10998/yetus.txt |
| Powered by | Apache Yetus http://yetus.apache.org |


This message was automatically generated.



> Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator 
> bit-size
> 
>
> Key: HIVE-18079
> URL: https://issues.apache.org/jira/browse/HIVE-18079
> Project: Hive
>  Issue Type: Improvement
>  Components: Standalone Metastore, Statistics
>Affects Versions: 3.0.0
>Reporter: Gopal V
>Assignee: Gopal V
>Priority: Major
> Attachments: HIVE-18079-branch-3.patch, HIVE-18079.1.patch, 
> HIVE-18079.10.patch, HIVE-18079.11.patch, HIVE-18079.12.patch, 
> HIVE-18079.13.patch, HIVE-18079.14.patch, HIVE-18079.15.patch, 
> HIVE-18079.2.patch, HIVE-18079.4.patch, HIVE-18079.5.patch, 
> HIVE-18079.6.patch, HIVE-18079.7.patch, HIVE-18079.8.patch, 

[jira] [Assigned] (HIVE-19577) CREATE TEMPORARY TABLE LIKE and INSERT generate output format mismatch errors

2018-05-16 Thread Steve Yeom (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Yeom reassigned HIVE-19577:
-

Assignee: Steve Yeom

> CREATE TEMPORARY TABLE LIKE  and INSERT generate output format mismatch errors
> --
>
> Key: HIVE-19577
> URL: https://issues.apache.org/jira/browse/HIVE-19577
> Project: Hive
>  Issue Type: Bug
>  Components: Hive
>Affects Versions: 3.0.0
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
> Fix For: 3.1.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19576) IHMSHandler.getTable not always fetching the right catalog

2018-05-16 Thread Alan Gates (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19576?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Gates updated HIVE-19576:
--
Attachment: HIVE-19576.patch

> IHMSHandler.getTable not always fetching the right catalog
> --
>
> Key: HIVE-19576
> URL: https://issues.apache.org/jira/browse/HIVE-19576
> Project: Hive
>  Issue Type: Bug
>  Components: Metastore
>Affects Versions: 3.0.0
>Reporter: Alan Gates
>Assignee: Alan Gates
>Priority: Major
> Fix For: 3.0.1
>
> Attachments: HIVE-19576.patch
>
>
> {{IHMSHandler.get_table_core(String dbName, String tableName)}} fetches the 
> catalog name from the conf.  This causes issues when doing an operation where 
> the catalog is known and does not match the default provided in the 
> configuration file (e.g. adding a partition).  This method should be removed 
> and callers forced to use {{IHMSHandler.get_table_core(String catName, String 
> dbName, String tableName)}} instead since callers will know whether they have 
> the catalog name or not.
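To make the failure mode concrete, here is a self-contained toy illustration (not the metastore code; the config key and class names are made up for the example) of why the two-argument lookup is dangerous: it silently substitutes the configured default catalog.

{code:java}
import java.util.HashMap;
import java.util.Map;

// Toy model only: a two-argument lookup that falls back to a configured default
// catalog versus a three-argument lookup where the caller supplies the catalog.
public class CatalogLookupDemo {
  private final Map<String, String> conf = new HashMap<>();
  private final Map<String, String> tables = new HashMap<>();   // "cat.db.tbl" -> definition

  String getTable(String db, String tbl) {                      // mirrors get_table_core(db, tbl)
    return getTable(conf.getOrDefault("catalog.default", "hive"), db, tbl);
  }

  String getTable(String cat, String db, String tbl) {          // mirrors get_table_core(cat, db, tbl)
    return tables.get(cat + "." + db + "." + tbl);
  }

  public static void main(String[] args) {
    CatalogLookupDemo demo = new CatalogLookupDemo();
    demo.tables.put("spark.sales.orders", "orders-in-spark-catalog");
    // The caller is really working against catalog "spark", but the two-argument
    // overload consults the default catalog "hive" and misses the table.
    System.out.println(demo.getTable("sales", "orders"));            // null
    System.out.println(demo.getTable("spark", "sales", "orders"));   // found
  }
}
{code}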



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19576) IHMSHandler.getTable not always fetching the right catalog

2018-05-16 Thread Alan Gates (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19576?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Gates updated HIVE-19576:
--
Status: Patch Available  (was: Open)

> IHMSHandler.getTable not always fetching the right catalog
> --
>
> Key: HIVE-19576
> URL: https://issues.apache.org/jira/browse/HIVE-19576
> Project: Hive
>  Issue Type: Bug
>  Components: Metastore
>Affects Versions: 3.0.0
>Reporter: Alan Gates
>Assignee: Alan Gates
>Priority: Major
> Fix For: 3.0.1
>
> Attachments: HIVE-19576.patch
>
>
> {{IHMSHandler.get_table_core(String dbName, String tableName)}} fetches the 
> catalog name from the conf.  This causes issues when doing an operation where 
> the catalog is known and does not match the default provided in the 
> configuration file (e.g. adding a partition).  This method should be removed 
> and callers forced to use {{IHMSHandler.get_table_core(String catName, String 
> dbName, String tableName)}} instead since callers will know whether they have 
> the catalog name or not.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (HIVE-19576) IHMSHandler.getTable not always fetching the right catalog

2018-05-16 Thread Alan Gates (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19576?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Gates reassigned HIVE-19576:
-


> IHMSHandler.getTable not always fetching the right catalog
> --
>
> Key: HIVE-19576
> URL: https://issues.apache.org/jira/browse/HIVE-19576
> Project: Hive
>  Issue Type: Bug
>  Components: Metastore
>Affects Versions: 3.0.0
>Reporter: Alan Gates
>Assignee: Alan Gates
>Priority: Major
> Fix For: 3.0.1
>
>
> {{IHMSHandler.get_table_core(String dbName, String tableName)}} fetches the 
> catalog name from the conf.  This causes issues when doing an operation where 
> the catalog is known and does not match the default provided in the 
> configuration file (e.g. adding a partition).  This method should be removed 
> and callers forced to use {{IHMSHandler.get_table_core(String catName, String 
> dbName, String tableName)}} instead since callers will know whether they have 
> the catalog name or not.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19317) Handle schema evolution from int like types to decimal

2018-05-16 Thread Vihang Karajgaonkar (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19317?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vihang Karajgaonkar updated HIVE-19317:
---
   Resolution: Fixed
Fix Version/s: 3.1.0
   Status: Resolved  (was: Patch Available)

Merged into master and branch-3. Thanks for your contribution, [~janulatha]

> Handle schema evolution from int like types to decimal
> --
>
> Key: HIVE-19317
> URL: https://issues.apache.org/jira/browse/HIVE-19317
> Project: Hive
>  Issue Type: Bug
>Reporter: Janaki Lahorani
>Assignee: Janaki Lahorani
>Priority: Major
> Fix For: 3.1.0
>
> Attachments: HIVE-19317.1.patch, HIVE-19317.2.patch, 
> HIVE-19317.3.patch, HIVE-19317.4.patch, HIVE-19317.5.patch
>
>
> If an int-like type is changed to decimal on Parquet data, SELECT queries 
> result in errors.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (HIVE-18079) Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator bit-size

2018-05-16 Thread Prasanth Jayachandran (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Prasanth Jayachandran reassigned HIVE-18079:


Assignee: Gopal V  (was: Prasanth Jayachandran)

> Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator 
> bit-size
> 
>
> Key: HIVE-18079
> URL: https://issues.apache.org/jira/browse/HIVE-18079
> Project: Hive
>  Issue Type: Improvement
>  Components: Standalone Metastore, Statistics
>Affects Versions: 3.0.0
>Reporter: Gopal V
>Assignee: Gopal V
>Priority: Major
> Attachments: HIVE-18079-branch-3.patch, HIVE-18079.1.patch, 
> HIVE-18079.10.patch, HIVE-18079.11.patch, HIVE-18079.12.patch, 
> HIVE-18079.13.patch, HIVE-18079.14.patch, HIVE-18079.15.patch, 
> HIVE-18079.2.patch, HIVE-18079.4.patch, HIVE-18079.5.patch, 
> HIVE-18079.6.patch, HIVE-18079.7.patch, HIVE-18079.8.patch, HIVE-18079.9.patch
>
>
> HyperLogLog can merge a 14 bit HLL into a 10 bit HLL bitset, because of its 
> mathematical hash distribution & construction.
> Allow the squashing of a 14 bit HLL -> 10 bit HLL without needing a second 
> scan over the data-set.
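For intuition, a hedged sketch of the register fold (this is not Hive's actual HyperLogLog code; it assumes the register index comes from the top p bits of the hash and rho counts the leading zeros of the remaining bits, plus one):

{code:java}
// Hypothetical fold of a 2^pFrom-register HLL into 2^pTo registers (pFrom > pTo),
// reusing the reclaimed index bits as the leading bits of the new rho stream.
public final class HllFoldSketch {

  public static int[] fold(int[] src, int pFrom, int pTo) {
    int delta = pFrom - pTo;                        // e.g. 14 - 10 = 4 reclaimed index bits
    int[] dst = new int[1 << pTo];
    for (int i = 0; i < src.length; i++) {
      if (src[i] == 0) {
        continue;                                   // no value ever landed in this register
      }
      int newIdx = i >>> delta;                     // top pTo bits become the new index
      int extra = i & ((1 << delta) - 1);           // old index bits that now feed the rho
      int rho = (extra != 0)
          ? Integer.numberOfLeadingZeros(extra) - (32 - delta) + 1
          : delta + src[i];                         // reclaimed bits all zero: shift the old rho
      dst[newIdx] = Math.max(dst[newIdx], rho);
    }
    return dst;
  }

  private HllFoldSketch() {}
}
{code}

Because the fold only reads the existing registers, no second scan over the data set is required.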



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-18079) Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator bit-size

2018-05-16 Thread Prasanth Jayachandran (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Prasanth Jayachandran updated HIVE-18079:
-
Attachment: HIVE-18079-branch-3.patch

> Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator 
> bit-size
> 
>
> Key: HIVE-18079
> URL: https://issues.apache.org/jira/browse/HIVE-18079
> Project: Hive
>  Issue Type: Improvement
>  Components: Standalone Metastore, Statistics
>Affects Versions: 3.0.0
>Reporter: Gopal V
>Assignee: Prasanth Jayachandran
>Priority: Major
> Attachments: HIVE-18079-branch-3.patch, HIVE-18079.1.patch, 
> HIVE-18079.10.patch, HIVE-18079.11.patch, HIVE-18079.12.patch, 
> HIVE-18079.13.patch, HIVE-18079.14.patch, HIVE-18079.15.patch, 
> HIVE-18079.2.patch, HIVE-18079.4.patch, HIVE-18079.5.patch, 
> HIVE-18079.6.patch, HIVE-18079.7.patch, HIVE-18079.8.patch, HIVE-18079.9.patch
>
>
> HyperLogLog can merge a 14 bit HLL into a 10 bit HLL bitset, because of its 
> mathematical hash distribution & construction.
> Allow the squashing of a 14 bit HLL -> 10 bit HLL without needing a second 
> scan over the data-set.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (HIVE-18079) Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator bit-size

2018-05-16 Thread Prasanth Jayachandran (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Prasanth Jayachandran reassigned HIVE-18079:


Assignee: Prasanth Jayachandran  (was: Gopal V)

> Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator 
> bit-size
> 
>
> Key: HIVE-18079
> URL: https://issues.apache.org/jira/browse/HIVE-18079
> Project: Hive
>  Issue Type: Improvement
>  Components: Standalone Metastore, Statistics
>Affects Versions: 3.0.0
>Reporter: Gopal V
>Assignee: Prasanth Jayachandran
>Priority: Major
> Attachments: HIVE-18079-branch-3.patch, HIVE-18079.1.patch, 
> HIVE-18079.10.patch, HIVE-18079.11.patch, HIVE-18079.12.patch, 
> HIVE-18079.13.patch, HIVE-18079.14.patch, HIVE-18079.15.patch, 
> HIVE-18079.2.patch, HIVE-18079.4.patch, HIVE-18079.5.patch, 
> HIVE-18079.6.patch, HIVE-18079.7.patch, HIVE-18079.8.patch, HIVE-18079.9.patch
>
>
> HyperLogLog can merge a 14 bit HLL into a 10 bit HLL bitset, because of its 
> mathematical hash distribution & construction.
> Allow the squashing of a 14 bit HLL -> 10 bit HLL without needing a second 
> scan over the data-set.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18748) Rename table impacts the ACID behaviour as table names are not updated in meta-tables.

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18748?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478159#comment-16478159
 ] 

Hive QA commented on HIVE-18748:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923744/HIVE-18748.05.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:green}SUCCESS:{color} +1 due to 14408 tests passed

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/10996/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10996/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10996/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923744 - PreCommit-HIVE-Build

> Rename table impacts the ACID behaviour as table names are not updated in 
> meta-tables.
> --
>
> Key: HIVE-18748
> URL: https://issues.apache.org/jira/browse/HIVE-18748
> Project: Hive
>  Issue Type: Sub-task
>  Components: HiveServer2, Transactions
>Affects Versions: 3.0.0
>Reporter: Sankar Hariappan
>Assignee: Eugene Koifman
>Priority: Critical
>  Labels: ACID, DDL
> Attachments: HIVE-18748.02.patch, HIVE-18748.03.patch, 
> HIVE-18748.04.patch, HIVE-18748.05.patch
>
>
> The ACID implementation uses metatables such as TXN_COMPONENTS, 
> COMPLETED_TXN_COMPONENTS, COMPACTION_QUEUE, COMPLETED_COMPACTION_QUEUE, etc. to 
> manage ACID operations.
> The per-table write ID implementation (HIVE-18192) introduces a couple of 
> metatables, NEXT_WRITE_ID and TXN_TO_WRITE_ID, to manage the write ids 
> allocated per table.
> Now, when we rename a table, the corresponding table names must be updated in 
> these metatables as well; otherwise, ACID table operations won't work properly.
> Since this change is significant and has other side-effects, we propose to 
> disable renaming of ACID tables until a fix is figured out.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Attachment: HIVE-19308.3.patch

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.
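As a hedged consumer-side sketch only (not the LlapBaseRecordReader subclass from the patches; the host and port are placeholders), the stock Arrow stream APIs already show the shape of the read loop over a socket InputStream:

{code:java}
import java.net.Socket;
import org.apache.arrow.memory.BufferAllocator;
import org.apache.arrow.memory.RootAllocator;
import org.apache.arrow.vector.VectorSchemaRoot;
import org.apache.arrow.vector.ipc.ArrowStreamReader;

// Toy external client: read Arrow record batches off a plain socket stream.
public class ArrowSocketConsumer {
  public static void main(String[] args) throws Exception {
    try (Socket socket = new Socket("localhost", 10001);                 // placeholder endpoint
         BufferAllocator allocator = new RootAllocator(Long.MAX_VALUE);
         ArrowStreamReader reader =
             new ArrowStreamReader(socket.getInputStream(), allocator)) {
      VectorSchemaRoot root = reader.getVectorSchemaRoot();              // reused across batches
      while (reader.loadNextBatch()) {                                   // one Arrow batch per call
        System.out.println("rows in batch: " + root.getRowCount());
      }
    }
  }
}
{code}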



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Status: Open  (was: Patch Available)

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Wohlstadter updated HIVE-19308:

Status: Patch Available  (was: Open)

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch, 
> HIVE-19308.3.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18748) Rename table impacts the ACID behaviour as table names are not updated in meta-tables.

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18748?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478116#comment-16478116
 ] 

Hive QA commented on HIVE-18748:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
39s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  6m 
43s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
44s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  1m 
 2s{color} | {color:green} master passed {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  3m 
58s{color} | {color:blue} ql in master has 2320 extant Findbugs warnings. 
{color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  2m 
56s{color} | {color:blue} standalone-metastore in master has 215 extant 
Findbugs warnings. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  2m  
0s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m  
8s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  2m 
11s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
41s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  1m 
41s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
38s{color} | {color:red} ql: The patch generated 3 new + 2 unchanged - 0 fixed 
= 5 total (was 2) {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
21s{color} | {color:red} standalone-metastore: The patch generated 3 new + 620 
unchanged - 0 fixed = 623 total (was 620) {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:red}-1{color} | {color:red} findbugs {color} | {color:red}  3m  
3s{color} | {color:red} standalone-metastore generated 1 new + 215 unchanged - 
0 fixed = 216 total (was 215) {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
59s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
12s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 33m 57s{color} | 
{color:black} {color} |
\\
\\
|| Reason || Tests ||
| FindBugs | module:standalone-metastore |
|  |  org.apache.hadoop.hive.metastore.txn.TxnHandler.onRename(String, String, 
String, String, String, String, String, String) passes a nonconstant String to 
an execute or addBatch method on an SQL statement  At TxnHandler.java:String, 
String, String) passes a nonconstant String to an execute or addBatch method on 
an SQL statement  At TxnHandler.java:[line 2918] |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  findbugs  checkstyle  compile  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-10996/dev-support/hive-personality.sh
 |
| git revision | master / 3d83467 |
| Default Java | 1.8.0_111 |
| findbugs | v3.0.0 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10996/yetus/diff-checkstyle-ql.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10996/yetus/diff-checkstyle-standalone-metastore.txt
 |
| findbugs | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10996/yetus/new-findbugs-standalone-metastore.html
 |
| modules | C: ql standalone-metastore U: . |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10996/yetus.txt |
| Powered by | Apache Yetus http://yetus.apache.org |


This message was automatically generated.



> Rename table impacts the ACID behaviour as table names are not updated in 
> meta-tables.
> 

[jira] [Commented] (HIVE-19575) TestAutoPurgeTables seems flaky

2018-05-16 Thread Jesus Camacho Rodriguez (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19575?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478099#comment-16478099
 ] 

Jesus Camacho Rodriguez commented on HIVE-19575:


Pushed an addendum to try the retry, since it was not being triggered correctly 
because of the @Before / @After annotations. If that does not work either, I 
will proceed to disable the flaky test and add it to the list in HIVE-19509.

> TestAutoPurgeTables seems flaky
> ---
>
> Key: HIVE-19575
> URL: https://issues.apache.org/jira/browse/HIVE-19575
> Project: Hive
>  Issue Type: Bug
>  Components: Test
>Affects Versions: 3.1.0
>Reporter: Prasanth Jayachandran
>Assignee: Prasanth Jayachandran
>Priority: Major
> Fix For: 3.1.0
>
> Attachments: HIVE-19575.1.patch
>
>
> I cannot reproduce the flakiness locally. Maybe we can retry this flaky test 
> using RetryTestRunner. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19424) NPE In MetaDataFormatters

2018-05-16 Thread Aihua Xu (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19424?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aihua Xu updated HIVE-19424:

   Resolution: Fixed
Fix Version/s: 3.1.0
   Status: Resolved  (was: Patch Available)

Pushed to master. Thanks [~afan] for the work.

> NPE In MetaDataFormatters
> -
>
> Key: HIVE-19424
> URL: https://issues.apache.org/jira/browse/HIVE-19424
> Project: Hive
>  Issue Type: Bug
>  Components: HiveServer2, Metastore, Standalone Metastore
>Affects Versions: 3.0.0, 2.4.0
>Reporter: BELUGA BEHR
>Assignee: Alice Fan
>Priority: Minor
> Fix For: 3.1.0
>
> Attachments: HIVE-19424.1.patch, HIVE-19424.2.patch
>
>
> h2. Overview
> According to the Hive Schema definition, a table's {{INPUT_FORMAT}} class can 
> be set to NULL.  However, there are places in the code where we do not 
> account for this NULL value, in particular the {{MetaDataFormatters}} classes 
> {{TextMetaDataFormatter}} and {{JsonMetaDataFormatter}}.  In addition, there 
> is no debug level logging in the {{MetaDataFormatters}} classes to tell me 
> which table in particular is causing the problem.
> {code:sql|title=hive-schema-2.2.0.mysql.sql}
> CREATE TABLE IF NOT EXISTS `SDS` (
>   `SD_ID` bigint(20) NOT NULL,
>   `CD_ID` bigint(20) DEFAULT NULL,
>   `INPUT_FORMAT` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin 
> DEFAULT NULL,
>   `IS_COMPRESSED` bit(1) NOT NULL,
> ...
> {code}
> {code:java|title=TextMetaDataFormatter.java}
> // Not checking for a null return from getInputFormatClass
> inputFormattCls = par.getInputFormatClass().getName();
> outputFormattCls = par.getOutputFormatClass().getName();
> {code}
> h2. Reproduction
> {code:sql}
> -- MySQL Backend
> update SDS SET INPUT_FORMAT=NULL WHERE SD_ID=XXX;
> {code}
> {code}
> // Hive
> SHOW TABLE EXTENDED FROM default LIKE '*';
> // HS2 Logs
> [HiveServer2-Background-Pool: Thread-464]: Error running hive query: 
> org.apache.hive.service.cli.HiveSQLException: Error while processing 
> statement: FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Exception while processing show table 
> status
>   at 
> org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:400)
>   at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:238)
>   at 
> org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:89)
>   at 
> org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:301)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
>   at 
> org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:314)
>   at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Exception while 
> processing show table status
>   at 
> org.apache.hadoop.hive.ql.exec.DDLTask.showTableStatus(DDLTask.java:3025)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:405)
>   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
>   at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:99)
>   at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2052)
>   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1748)
>   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1501)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1285)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1280)
>   at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:236)
>   ... 11 more
> Caused by: java.lang.NullPointerException
>   at 
> org.apache.hadoop.hive.ql.metadata.formatting.TextMetaDataFormatter.showTableStatus(TextMetaDataFormatter.java:202)
>   at 
> org.apache.hadoop.hive.ql.exec.DDLTask.showTableStatus(DDLTask.java:3020)
>   ... 20 more
> {code}
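A null-safe variant along these lines (hypothetical; not necessarily the shape of the committed fix) avoids the NPE at the call site shown in the stack trace:

{code:java}
// Hypothetical helper: only resolve the class name when the format class is present.
final class FormatNameUtil {
  static String formatClassName(Class<?> formatClass) {
    return formatClass == null ? null : formatClass.getName();
  }
}

// Sketch of the call site quoted above:
//   inputFormattCls  = FormatNameUtil.formatClassName(par.getInputFormatClass());
//   outputFormattCls = FormatNameUtil.formatClassName(par.getOutputFormatClass());
{code}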



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19572) Add option to mask stats and data size in q files

2018-05-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19572?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478070#comment-16478070
 ] 

Hive QA commented on HIVE-19572:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12923742/HIVE-19572.01.patch

{color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 5 failed/errored test(s), 14397 tests 
executed
*Failed tests:*
{noformat}
TestMinimrCliDriver - did not produce a TEST-*.xml file (likely timed out) 
(batchId=94)

[infer_bucket_sort_num_buckets.q,infer_bucket_sort_reducers_power_two.q,parallel_orderby.q,bucket_num_reducers_acid.q,scriptfile1.q,infer_bucket_sort_map_operators.q,infer_bucket_sort_merge.q,root_dir_external_table.q,infer_bucket_sort_dyn_part.q,udf_using.q]
org.apache.hadoop.hive.cli.TestSparkNegativeCliDriver.testCliDriver[spark_stage_max_tasks]
 (batchId=255)
org.apache.hadoop.hive.cli.TestSparkNegativeCliDriver.testCliDriver[spark_task_failure]
 (batchId=255)
org.apache.hadoop.hive.cli.TestSparkPerfCliDriver.testCliDriver[query39] 
(batchId=255)
org.apache.hadoop.hive.ql.TestAutoPurgeTables.testExternalNoAutoPurge 
(batchId=233)
{noformat}

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/10995/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10995/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10995/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 5 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12923742 - PreCommit-HIVE-Build

> Add option to mask stats and data size in q files
> -
>
> Key: HIVE-19572
> URL: https://issues.apache.org/jira/browse/HIVE-19572
> Project: Hive
>  Issue Type: Improvement
>  Components: Testing Infrastructure
>Affects Versions: 3.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
>Priority: Major
> Fix For: 3.1.0
>
> Attachments: HIVE-19572.01.patch
>
>
> Some tests are flaky because of minimal data size differences, e.g., one 
> byte. However, many times we do not actually care about these differences. 
> One example is {{default_constraint.q}}.
> The patch makes it possible to mask 1) the printing of stats selectively in q 
> files by adding the {{-- MASK_STATS}} option, and 2) the printing of data size 
> stats selectively in q files by adding the {{-- MASK_DATA_SIZE}} option.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (HIVE-19572) Add option to mask stats and data size in q files

2018-05-16 Thread Jesus Camacho Rodriguez (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19572?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jesus Camacho Rodriguez updated HIVE-19572:
---
   Resolution: Fixed
Fix Version/s: 3.1.0
   Status: Resolved  (was: Patch Available)

Pushed to master, thanks for reviewing [~prasanth_j]

> Add option to mask stats and data size in q files
> -
>
> Key: HIVE-19572
> URL: https://issues.apache.org/jira/browse/HIVE-19572
> Project: Hive
>  Issue Type: Improvement
>  Components: Testing Infrastructure
>Affects Versions: 3.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
>Priority: Major
> Fix For: 3.1.0
>
> Attachments: HIVE-19572.01.patch
>
>
> Some tests are flaky because of minimal data size differences, e.g., one 
> byte. However, many times we do not actually care about these differences. 
> One example is {{default_constraint.q}}.
> The patch makes it possible to mask 1) the printing of stats selectively in q 
> files by adding the {{-- MASK_STATS}} option, and 2) the printing of data size 
> stats selectively in q files by adding the {{-- MASK_DATA_SIZE}} option.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (HIVE-19575) TestAutoPurgeTables seems flaky

2018-05-16 Thread Jesus Camacho Rodriguez (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-19575?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jesus Camacho Rodriguez resolved HIVE-19575.

   Resolution: Fixed
Fix Version/s: 3.1.0

Pushed to master, thanks [~prasanth_j]

> TestAutoPurgeTables seems flaky
> ---
>
> Key: HIVE-19575
> URL: https://issues.apache.org/jira/browse/HIVE-19575
> Project: Hive
>  Issue Type: Bug
>  Components: Test
>Affects Versions: 3.1.0
>Reporter: Prasanth Jayachandran
>Assignee: Prasanth Jayachandran
>Priority: Major
> Fix For: 3.1.0
>
> Attachments: HIVE-19575.1.patch
>
>
> I cannot reproduce the flakiness locally. Maybe we can retry this flaky test 
> using RetryTestRunner. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19500) Prevent multiple selectivity estimations for the same variable in conjuctions

2018-05-16 Thread Nita Dembla (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16478001#comment-16478001
 ] 

Nita Dembla commented on HIVE-19500:


I didn't get a good pass rate on TPCDS with the 2nd patch here. The change 
fixes query74 but causes OOMs in other queries.

> Prevent multiple selectivity estimations for the same variable in conjuctions
> -
>
> Key: HIVE-19500
> URL: https://issues.apache.org/jira/browse/HIVE-19500
> Project: Hive
>  Issue Type: Sub-task
>Affects Versions: 3.0.0, 3.1.0
>Reporter: Zoltan Haindrich
>Assignee: Zoltan Haindrich
>Priority: Major
> Attachments: HIVE-19500.01.patch, HIVE-19500.02.patch
>
>
> See HIVE-19097 for the problem description.
> For filters like {{(d_year in (2001,2002) and d_year = 2001)}}, the current 
> estimation is around {{(1/NDV)**2}} (iff column stats are available). 
> The actual source of the problem was a small typo in HIVE-17465.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19417) Modify metastore to have/access persistent tables for stats

2018-05-16 Thread Steve Yeom (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16477994#comment-16477994
 ] 

Steve Yeom commented on HIVE-19417:
---

While I was talking with Eugene, I realized the UPD_PARTTXNS table is not needed. 
I have already made the relevant changes for the other patch and will generate a 
new one for this JIRA.

> Modify metastore to have/access persistent tables for stats
> ---
>
> Key: HIVE-19417
> URL: https://issues.apache.org/jira/browse/HIVE-19417
> Project: Hive
>  Issue Type: Sub-task
>  Components: Hive
>Affects Versions: 3.0.0
>Reporter: Steve Yeom
>Assignee: Steve Yeom
>Priority: Major
> Attachments: HIVE-19417.01.patch, HIVE-19417.02.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Eric Wohlstadter (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16477982#comment-16477982
 ] 

Eric Wohlstadter commented on HIVE-19308:
-

https://reviews.apache.org/r/67159/

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-19308) Provide an Arrow stream reader for external LLAP clients

2018-05-16 Thread Jason Dere (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-19308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16477971#comment-16477971
 ] 

Jason Dere commented on HIVE-19308:
---

Can you create an RB link?

> Provide an Arrow stream reader for external LLAP clients 
> -
>
> Key: HIVE-19308
> URL: https://issues.apache.org/jira/browse/HIVE-19308
> Project: Hive
>  Issue Type: Task
>  Components: llap
>Reporter: Eric Wohlstadter
>Assignee: Eric Wohlstadter
>Priority: Major
> Attachments: HIVE-19308.1.patch, HIVE-19308.2.patch
>
>
> This is a sub-class of LlapBaseRecordReader that wraps the socket inputStream 
> and produces Arrow batches for an external client.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

