[
https://issues.apache.org/jira/browse/HIVE-14901?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15877374#comment-15877374
]
Hive QA commented on HIVE-14901:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12853818/HIVE-14901.4.patch
{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 12 failed/errored test(s), 10253 tests
executed
*Failed tests:*
{noformat}
TestDerbyConnector - did not produce a TEST-*.xml file (likely timed out)
(batchId=235)
org.apache.hadoop.hive.cli.TestEncryptedHDFSCliDriver.testCliDriver[encryption_join_with_different_encryption_keys]
(batchId=159)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14]
(batchId=223)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query23]
(batchId=223)
org.apache.hive.jdbc.authorization.TestJdbcMetadataApiAuth.org.apache.hive.jdbc.authorization.TestJdbcMetadataApiAuth
(batchId=218)
org.apache.hive.jdbc.authorization.TestJdbcWithSQLAuthUDFBlacklist.testBlackListedUdfUsage
(batchId=217)
org.apache.hive.jdbc.authorization.TestJdbcWithSQLAuthorization.testAllowedCommands
(batchId=218)
org.apache.hive.jdbc.authorization.TestJdbcWithSQLAuthorization.testAuthorization1
(batchId=218)
org.apache.hive.jdbc.authorization.TestJdbcWithSQLAuthorization.testBlackListedUdfUsage
(batchId=218)
org.apache.hive.jdbc.authorization.TestJdbcWithSQLAuthorization.testConfigWhiteList
(batchId=218)
org.apache.hive.minikdc.TestJdbcWithMiniKdcSQLAuthBinary.testAuthorization1
(batchId=229)
org.apache.hive.minikdc.TestJdbcWithMiniKdcSQLAuthHttp.testAuthorization1
(batchId=229)
{noformat}
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3680/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3680/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3680/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 12 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12853818 - PreCommit-HIVE-Build
> HiveServer2: Use user supplied fetch size to determine #rows serialized in
> tasks
> --------------------------------------------------------------------------------
>
> Key: HIVE-14901
> URL: https://issues.apache.org/jira/browse/HIVE-14901
> Project: Hive
> Issue Type: Sub-task
> Components: HiveServer2, JDBC, ODBC
> Affects Versions: 2.1.0
> Reporter: Vaibhav Gumashta
> Assignee: Norris Lee
> Attachments: HIVE-14901.1.patch, HIVE-14901.2.patch,
> HIVE-14901.3.patch, HIVE-14901.4.patch, HIVE-14901.patch
>
>
> Currently, we use {{hive.server2.thrift.resultset.max.fetch.size}} to decide
> the maximum number of rows that we write in tasks. However, we should ideally
> use the user-supplied value (which can be extracted from the
> ThriftCLIService.FetchResults request parameter) to decide how many rows to
> serialize in a blob in the tasks. We should still use
> {{hive.server2.thrift.resultset.max.fetch.size}} as an upper bound on it, so
> that neither the tasks nor HS2 run out of memory.
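The selection logic described above can be sketched as follows. This is a minimal illustration, not the actual HiveServer2 code; the class and method names are hypothetical, and it assumes the client-requested size arrives as a long (non-positive meaning "not set"):

```java
// Hypothetical sketch of picking the per-blob row count in a task:
// take the client-requested fetch size (from ThriftCLIService.FetchResults),
// capped by hive.server2.thrift.resultset.max.fetch.size.
public final class FetchSizeSelector {

    /**
     * Returns the number of rows to serialize per blob: the requested size
     * when valid, never exceeding the configured maximum so that neither
     * the tasks nor HS2 run out of memory.
     */
    public static long effectiveFetchSize(long requestedFetchSize,
                                          long configuredMaxFetchSize) {
        if (requestedFetchSize <= 0) {
            // No usable client value: fall back to the configured maximum.
            return configuredMaxFetchSize;
        }
        return Math.min(requestedFetchSize, configuredMaxFetchSize);
    }

    public static void main(String[] args) {
        System.out.println(effectiveFetchSize(500, 10000));   // within bound -> 500
        System.out.println(effectiveFetchSize(50000, 10000)); // capped -> 10000
        System.out.println(effectiveFetchSize(0, 10000));     // fallback -> 10000
    }
}
```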
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)