[
https://issues.apache.org/jira/browse/HIVE-27087?focusedWorklogId=850349&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-850349
]
ASF GitHub Bot logged work on HIVE-27087:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 10/Mar/23 16:38
Start Date: 10/Mar/23 16:38
Worklog Time Spent: 10m
Work Description: zabetak commented on PR #4067:
URL: https://github.com/apache/hive/pull/4067#issuecomment-1464071389
> I am running out of ideas here. Any idea why the replication factor seen in the q.out is 1 instead of 3?

@vihangk1 Did you debug the creation of the file locally to see which
codepath sets the replication factor to 1? Maybe there is some logic that
adapts the replication factor based on the environment, leading to the
different behavior when running in CI.
Issue Time Tracking
-------------------
Worklog Id: (was: 850349)
Time Spent: 3h 10m (was: 3h)
> Fix TestMiniSparkOnYarnCliDriver test failures on branch-3
> ----------------------------------------------------------
>
> Key: HIVE-27087
> URL: https://issues.apache.org/jira/browse/HIVE-27087
> Project: Hive
> Issue Type: Sub-task
> Reporter: Vihang Karajgaonkar
> Assignee: Vihang Karajgaonkar
> Priority: Major
> Labels: pull-request-available
> Time Spent: 3h 10m
> Remaining Estimate: 0h
>
> The TestMiniSparkOnYarnCliDriver tests are failing with the error below:
> [ERROR] 2023-02-16 14:13:08.991 [Driver] SparkContext - Error initializing SparkContext.
> java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
> at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:131) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:118) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.server.TransportServer.init(TransportServer.java:94) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:73) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.TransportContext.createServer(TransportContext.java:114) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:119) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:465) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:464) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2271) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160) ~[scala-library-2.11.8.jar:?]
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2263) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:469) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:423) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:161) [hive-exec-3.2.0-SNAPSHOT.jar:3.2.0-SNAPSHOT]
> at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:536) [hive-exec-3.2.0-SNAPSHOT.jar:3.2.0-SNAPSHOT]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_322]
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_322]
> The root cause of the problem is that the netty library was upgraded from
> 4.1.17.Final to 4.1.69.Final. The upgraded library no longer has the
> `DEFAULT_TINY_CACHE_SIZE` field, which was still present in
> [4.1.51.Final|https://github.com/netty/netty/blob/netty-4.1.51.Final/buffer/src/main/java/io/netty/buffer/PooledByteBufAllocator.java#L46]
> and was removed in 4.1.52.Final.
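For context, a minimal sketch of the failure mode: the stack trace shows Spark's NettyUtils.getPrivateStaticField looking up a private static field on netty's PooledByteBufAllocator by name via reflection, so removing the field surfaces as a NoSuchFieldException wrapped in a RuntimeException during SparkContext startup. The Allocator class below is a hypothetical stand-in for netty's allocator, and the helper only mirrors the general shape of Spark's code, not its exact implementation.

```java
import java.lang.reflect.Field;

public class ReflectionLookup {
    // Hypothetical stand-in for netty's PooledByteBufAllocator.
    static class Allocator {
        // DEFAULT_TINY_CACHE_SIZE existed up to netty 4.1.51.Final and was
        // removed in 4.1.52.Final; it is deliberately absent here to
        // reproduce the failure mode.
        private static final int DEFAULT_SMALL_CACHE_SIZE = 256;
    }

    // Mirrors the shape of Spark's NettyUtils.getPrivateStaticField:
    // reflective lookup of a private static field by name.
    static Object getPrivateStaticField(Class<?> cls, String name) {
        try {
            Field f = cls.getDeclaredField(name);
            f.setAccessible(true);
            return f.get(null);
        } catch (Exception e) {
            // Spark wraps the NoSuchFieldException in a RuntimeException,
            // which is what aborts SparkContext initialization.
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // A field that exists resolves normally.
        System.out.println(getPrivateStaticField(Allocator.class, "DEFAULT_SMALL_CACHE_SIZE"));
        // A removed field fails exactly like the stack trace above.
        try {
            getPrivateStaticField(Allocator.class, "DEFAULT_TINY_CACHE_SIZE");
        } catch (RuntimeException e) {
            System.out.println("caught: " + e.getCause().getClass().getSimpleName());
        }
    }
}
```

This is why the error only appears after the dependency bump: the compiler never sees the field name, so the breakage is invisible until the reflective lookup runs at startup.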
--
This message was sent by Atlassian Jira
(v8.20.10#820010)