[ https://issues.apache.org/jira/browse/HIVE-27087?focusedWorklogId=846034&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-846034 ]

ASF GitHub Bot logged work on HIVE-27087:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 17/Feb/23 00:37
            Start Date: 17/Feb/23 00:37
    Worklog Time Spent: 10m 
      Work Description: vihangk1 opened a new pull request, #4067:
URL: https://github.com/apache/hive/pull/4067

   
   ### What changes were proposed in this pull request?
   Downgrade the netty-all version to 4.1.51.Final to test whether the 
TestMiniSparkOnYarnCliDriver tests pass.
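   A downgrade like this would typically be a one-line change to the netty 
version in the root pom. This is a hedged sketch only: the exact property name 
Hive's pom uses may differ, and the comment reflects the root cause described 
in the issue below.

   ```xml
   <properties>
     <!-- Downgraded from 4.1.69.Final: netty 4.1.52.Final removed the
          DEFAULT_TINY_CACHE_SIZE field that Spark 2.3's NettyUtils reads
          via reflection, so stay on the last release that still has it. -->
     <netty.version>4.1.51.Final</netty.version>
   </properties>
   ```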
   
   
   ### Why are the changes needed?
   Fixes the failing TestMiniSparkOnYarnCliDriver tests on branch-3.
   
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   




Issue Time Tracking
-------------------

            Worklog Id:     (was: 846034)
    Remaining Estimate: 0h
            Time Spent: 10m

> Fix TestMiniSparkOnYarnCliDriver test failures on branch-3
> ----------------------------------------------------------
>
>                 Key: HIVE-27087
>                 URL: https://issues.apache.org/jira/browse/HIVE-27087
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Vihang Karajgaonkar
>            Assignee: Vihang Karajgaonkar
>            Priority: Major
>          Time Spent: 10m
>  Remaining Estimate: 0h
>
> The TestMiniSparkOnYarnCliDriver tests are failing with the error below:
> [ERROR] 2023-02-16 14:13:08.991 [Driver] SparkContext - Error initializing SparkContext.
> java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
> at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:131) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:118) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.server.TransportServer.init(TransportServer.java:94) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:73) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.TransportContext.createServer(TransportContext.java:114) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:119) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:465) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:464) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2271) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160) ~[scala-library-2.11.8.jar:?]
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2263) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:469) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:423) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:161) [hive-exec-3.2.0-SNAPSHOT.jar:3.2.0-SNAPSHOT]
> at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:536) [hive-exec-3.2.0-SNAPSHOT.jar:3.2.0-SNAPSHOT]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_322]
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_322]
> The root cause is that we upgraded the netty library from 4.1.17.Final to 
> 4.1.69.Final. The upgraded library does not have the `DEFAULT_TINY_CACHE_SIZE` 
> field (still present in 
> [4.1.51.Final|https://github.com/netty/netty/blob/netty-4.1.51.Final/buffer/src/main/java/io/netty/buffer/PooledByteBufAllocator.java#L46]),
> which was removed in 4.1.52.Final.
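The failure mode is worth spelling out: Spark 2.3's NettyUtils reads netty's private static `DEFAULT_TINY_CACHE_SIZE` field via reflection, so removing the field breaks it at runtime rather than at compile time. The sketch below reproduces that shape with a stand-in class (`DemoAllocator` and its field value are hypothetical, not the real netty `PooledByteBufAllocator`); only the reflection pattern mirrors what the stack trace shows.

```java
import java.lang.reflect.Field;

public class ReflectionSketch {
    // Stand-in for netty's PooledByteBufAllocator; the field exists in
    // netty <= 4.1.51.Final but was removed in 4.1.52.Final.
    static class DemoAllocator {
        private static final int DEFAULT_TINY_CACHE_SIZE = 512; // hypothetical value
    }

    // Mirrors the shape of NettyUtils.getPrivateStaticField: look up a
    // private static field by name and read its int value.
    static int getPrivateStaticField(Class<?> cls, String name) {
        try {
            Field f = cls.getDeclaredField(name); // throws NoSuchFieldException if the field is gone
            f.setAccessible(true);
            return f.getInt(null);
        } catch (Exception e) {
            // Spark wraps the checked exception, producing the
            // RuntimeException: NoSuchFieldException seen in the log.
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // Works while the field exists:
        System.out.println(getPrivateStaticField(DemoAllocator.class, "DEFAULT_TINY_CACHE_SIZE"));
        // Fails the same way the SparkContext init does once the field is removed:
        try {
            getPrivateStaticField(DemoAllocator.class, "FIELD_REMOVED_IN_NEWER_VERSION");
        } catch (RuntimeException e) {
            System.out.println(e.getCause().getClass().getSimpleName()); // NoSuchFieldException
        }
    }
}
```

Because the lookup is by string name at runtime, no amount of recompiling Hive against the newer netty surfaces the problem early, which is why pinning netty below 4.1.52.Final is the pragmatic fix for branch-3.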



--
This message was sent by Atlassian Jira
(v8.20.10#820010)