[
https://issues.apache.org/jira/browse/HIVE-14901?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15891455#comment-15891455
]
Hive QA commented on HIVE-14901:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12855509/HIVE-14901.8.patch
{color:red}ERROR:{color} -1 due to build exiting with an error
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3882/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3882/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3882/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2017-03-02 01:49:10.215
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-3882/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-03-02 01:49:10.217
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at ba8de30 HIVE-14459: TestBeeLineDriver - migration and re-enable (Peter Vary via Zoltan Haindrich reviewed by Vihang Karajgaonkar)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at ba8de30 HIVE-14459: TestBeeLineDriver - migration and re-enable (Peter Vary via Zoltan Haindrich reviewed by Vihang Karajgaonkar)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-03-02 01:49:11.139
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Going to apply patch with: patch -p0
patching file common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
Hunk #1 succeeded at 2619 (offset 24 lines).
Hunk #2 succeeded at 4234 (offset 49 lines).
patching file itests/hive-unit/src/test/java/org/apache/hive/jdbc/TestJdbcWithMiniHS2.java
patching file jdbc/src/java/org/apache/hive/jdbc/HiveConnection.java
Hunk #1 succeeded at 639 (offset 2 lines).
Hunk #2 succeeded at 667 (offset 2 lines).
patching file serde/src/java/org/apache/hadoop/hive/serde2/thrift/ThriftJDBCBinarySerDe.java
patching file service/src/java/org/apache/hive/service/cli/CLIService.java
Hunk #1 succeeded at 80 (offset 2 lines).
Hunk #2 succeeded at 504 with fuzz 1 (offset 44 lines).
patching file service/src/java/org/apache/hive/service/cli/CLIServiceClient.java
patching file service/src/java/org/apache/hive/service/cli/session/HiveSession.java
patching file service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java
patching file service/src/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java
Hunk #1 succeeded at 317 (offset 8 lines).
Hunk #2 succeeded at 742 (offset 36 lines).
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/metastore/target/generated-sources/antlr3/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
DataNucleus Enhancer (version 4.1.6) for API "JDO"
DataNucleus Enhancer : Classpath
>> /usr/share/maven/boot/plexus-classworlds-2.x.jar
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MConstraint
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MVersionTable
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MResourceUri
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MFunction
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MNotificationLog
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MNotificationNextId
DataNucleus Enhancer completed with success for 30 classes. Timings : input=156 ms, enhance=194 ms, total=350 ms. Consult the log for full details
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
org/apache/hadoop/hive/ql/parse/HintParser.g
Generating vector expression code
Generating vector expression test code
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project hive-exec: Error resolving project artifact: Could not transfer artifact org.pentaho:pentaho-aggdesigner-algorithm:pom:5.1.5-jhyde from/to datanucleus (http://www.datanucleus.org/downloads/maven2): Connect to www.datanucleus.org:80 [www.datanucleus.org/80.86.85.8] failed: Connection timed out (Connection timed out) for project org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hive-exec
+ exit 1
'
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12855509 - PreCommit-HIVE-Build
> HiveServer2: Use user supplied fetch size to determine #rows serialized in tasks
> --------------------------------------------------------------------------------
>
> Key: HIVE-14901
> URL: https://issues.apache.org/jira/browse/HIVE-14901
> Project: Hive
> Issue Type: Sub-task
> Components: HiveServer2, JDBC, ODBC
> Affects Versions: 2.1.0
> Reporter: Vaibhav Gumashta
> Assignee: Norris Lee
> Attachments: HIVE-14901.1.patch, HIVE-14901.2.patch,
> HIVE-14901.3.patch, HIVE-14901.4.patch, HIVE-14901.5.patch,
> HIVE-14901.6.patch, HIVE-14901.7.patch, HIVE-14901.8.patch, HIVE-14901.patch
>
>
> Currently, we use {{hive.server2.thrift.resultset.max.fetch.size}} to decide
> the maximum number of rows that we write in tasks. However, we should ideally
> use the user-supplied value (which can be extracted from the
> ThriftCLIService.FetchResults request parameter) to decide how many rows to
> serialize into a blob in the tasks. We should still use
> {{hive.server2.thrift.resultset.max.fetch.size}} as an upper bound, so that
> we don't run out of memory in tasks or in HS2.
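To make the intended behaviour concrete, here is a minimal, self-contained sketch (a hypothetical helper, not Hive's actual implementation): the server honours the fetch size the client sends with the FetchResults request, but caps it at the configured {{hive.server2.thrift.resultset.max.fetch.size}} before deciding how many rows go into each serialized blob.
{code:java}
// Hypothetical illustration of the clamping described in the issue text.
// FetchSizePolicy and effectiveFetchSize are illustrative names, not Hive APIs.
public final class FetchSizePolicy {

  /**
   * Returns the number of rows to serialize per blob: the client-requested
   * value when it is positive, capped at the configured maximum; otherwise
   * the configured value itself.
   */
  static long effectiveFetchSize(long requestedByClient, long configuredMax) {
    if (requestedByClient <= 0) {
      // No usable client value; fall back to the server-side setting.
      return configuredMax;
    }
    return Math.min(requestedByClient, configuredMax);
  }

  public static void main(String[] args) {
    long configuredMax = 10000L; // e.g. hive.server2.thrift.resultset.max.fetch.size
    System.out.println(effectiveFetchSize(500L, configuredMax));   // 500
    System.out.println(effectiveFetchSize(50000L, configuredMax)); // 10000
    System.out.println(effectiveFetchSize(-1L, configuredMax));    // 10000
  }
}
{code}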
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)