[https://issues.apache.org/jira/browse/HIVE-19814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611721#comment-16611721]
Hive QA commented on HIVE-19814:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12939280/HIVE-19814.3.patch
{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 1 failed/errored test(s), 14937 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.ql.exec.spark.TestSparkSessionTimeout.testMultiSparkSessionTimeout (batchId=245)
{noformat}
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/13730/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/13730/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-13730/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 1 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12939280 - PreCommit-HIVE-Build
> RPC Server port is always random for spark
> ------------------------------------------
>
> Key: HIVE-19814
> URL: https://issues.apache.org/jira/browse/HIVE-19814
> Project: Hive
> Issue Type: Bug
> Components: Spark
> Affects Versions: 2.3.0, 3.0.0, 2.4.0, 4.0.0
> Reporter: bounkong khamphousone
> Assignee: Bharathkrishna Guruvayoor Murali
> Priority: Major
> Attachments: HIVE-19814.1.patch, HIVE-19814.2.patch, HIVE-19814.3.patch
>
>
> The RPC server port is always a random one. The root cause is that
> RpcConfiguration.HIVE_SPARK_RSC_CONFIGS doesn't include
> SPARK_RPC_SERVER_PORT.
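>
> For readers without the source handy, here is a minimal sketch of what the
> whitelist and the fix look like, assuming a Guava ImmutableSet of HiveConf
> property names (the surrounding class and the exact member list are
> illustrative, not copied from the Hive source):
>
> {code:java}
> import com.google.common.collect.ImmutableSet;
>
> public final class RpcConfigurationSketch {
>   // Only property names listed here are copied from the Hive configuration
>   // into the config handed to the remote Spark context (illustrative list,
>   // not the exact contents of RpcConfiguration.HIVE_SPARK_RSC_CONFIGS).
>   public static final ImmutableSet<String> HIVE_SPARK_RSC_CONFIGS =
>       ImmutableSet.of(
>           "hive.spark.client.rpc.server.address",
>           "hive.spark.client.connect.timeout",
>           // The missing entry: without it the configured port never reaches
>           // the RPC server, which then binds to a random ephemeral port.
>           "hive.spark.client.rpc.server.port");
> }
> {code}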
>
> I found this issue while trying to get Hive-on-Spark running inside
> Docker.
>
> HIVE_SPARK_RSC_CONFIGS is read via HiveSparkClientFactory.initiateSparkConf
> > SparkSessionManagerImpl.setup, and the latter calls
> SparkClientFactory.initialize(conf), which initializes the RPC server. This
> RpcServer is then used to create the SparkClient, which passes the RPC
> server's port as the --remote-port argument. Since initiateSparkConf
> ignores SPARK_RPC_SERVER_PORT, the port is always a random one.
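>
> To make the failure mode concrete, here is a self-contained, hypothetical
> sketch of the pattern described above (all class and key names are made up
> for illustration and do not match Hive's actual API): a whitelist-filtered
> config copy silently drops the port setting, so the server falls back to
> port 0 and the OS assigns a random ephemeral port.
>
> {code:java}
> import java.io.IOException;
> import java.net.ServerSocket;
> import java.util.HashMap;
> import java.util.Map;
> import java.util.Set;
>
> public class RandomPortDemo {
>
>   // Stand-in for HIVE_SPARK_RSC_CONFIGS before the patch: the port key is
>   // missing from the whitelist.
>   static final Set<String> RSC_CONFIGS = Set.of("rpc.server.address");
>
>   // Stand-in for initiateSparkConf: copy only whitelisted properties.
>   static Map<String, String> filterConf(Map<String, String> hiveConf) {
>     Map<String, String> out = new HashMap<>();
>     for (Map.Entry<String, String> e : hiveConf.entrySet()) {
>       if (RSC_CONFIGS.contains(e.getKey())) {
>         out.put(e.getKey(), e.getValue());
>       }
>     }
>     return out;
>   }
>
>   public static void main(String[] args) throws IOException {
>     Map<String, String> hiveConf = Map.of(
>         "rpc.server.address", "0.0.0.0",
>         "rpc.server.port", "30000");  // the port the user configured
>
>     Map<String, String> rscConf = filterConf(hiveConf);
>
>     // The configured port was dropped by the filter, so we fall back to 0
>     // and the OS assigns a random ephemeral port: the symptom reported here.
>     int port = Integer.parseInt(rscConf.getOrDefault("rpc.server.port", "0"));
>     try (ServerSocket server = new ServerSocket(port)) {
>       System.out.println("RPC server bound to port " + server.getLocalPort());
>     }
>   }
> }
> {code}
>
> Running the demo prints a different port on each run, which matches the
> symptom described in this report.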