[ https://issues.apache.org/jira/browse/HADOOP-18750?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17730675#comment-17730675 ]

ASF GitHub Bot commented on HADOOP-18750:
-----------------------------------------

hadoop-yetus commented on PR #5695:
URL: https://github.com/apache/hadoop/pull/5695#issuecomment-1583191500

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 35s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  |
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +0 :ok: |  shelldocs  |   0m  0s |  |  Shelldocs was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 1 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  21m 25s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  21m  1s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 26s |  |  trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1  |
   | +1 :green_heart: |  compile  |   0m 23s |  |  trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09  |
   | +1 :green_heart: |  mvnsite  |   0m 50s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 49s |  |  trunk passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 42s |  |  trunk passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09  |
   | +1 :green_heart: |  shadedclient  |  21m 19s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 29s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   3m  1s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 16s |  |  the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1  |
   | +1 :green_heart: |  javac  |   0m 16s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 15s |  |  the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09  |
   | +1 :green_heart: |  javac  |   0m 15s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 34s |  |  the patch passed  |
   | +1 :green_heart: |  shellcheck  |   0m  0s |  |  No new issues.  |
   | +1 :green_heart: |  javadoc  |   0m 28s |  |  the patch passed with JDK Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 30s |  |  the patch passed with JDK Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09  |
   | -1 :x: |  shadedclient  |  20m 43s |  |  patch has errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |   0m 19s |  |  hadoop-client-api in the patch passed.  |
   | +1 :green_heart: |  unit  |   0m 18s |  |  hadoop-client-check-invariants in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 38s |  |  The patch does not generate ASF License warnings.  |
   |  |   |  96m 10s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.43 ServerAPI=1.43 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/3/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/5695 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs |
   | uname | Linux 53cf23effe2f 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / ef9defadf710531709f2a7a060d7373b3f792b5f |
   | Default Java | Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.19+7-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u372-ga~us1-0ubuntu1~20.04-b09 |
   | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/3/testReport/ |
   | Max. process+thread count | 728 (vs. ulimit of 5500) |
   | modules | C: hadoop-client-modules/hadoop-client-api hadoop-client-modules/hadoop-client-check-invariants U: hadoop-client-modules |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5695/3/console |
   | versions | git=2.25.1 maven=3.6.3 shellcheck=0.7.0 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




> Spark History Server 3.3.1 fails to start with Hadoop 3.3.x
> ------------------------------------------------------------
>
>                 Key: HADOOP-18750
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18750
>             Project: Hadoop Common
>          Issue Type: Bug
>            Reporter: Aman Raj
>            Assignee: Kamal Sharma
>            Priority: Major
>              Labels: pull-request-available
>
> When Spark History Server tries to start with Hadoop 3.3.4 (this happens only in Kerberos scenarios), it fails with the following exception: 
> {code:java}
> 23/05/23 03:14:15 ERROR HistoryServer [main]: Failed to bind HistoryServer
> java.lang.IllegalStateException: class org.apache.hadoop.security.authentication.server.AuthenticationFilter is not a javax.servlet.Filter
>         at org.sparkproject.jetty.servlet.FilterHolder.doStart(FilterHolder.java:103) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.sparkproject.jetty.servlet.ServletHandler.lambda$initialize$0(ServletHandler.java:749) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948) ~[?:1.8.0_372]
>         at java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:742) ~[?:1.8.0_372]
>         at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:647) ~[?:1.8.0_372]
>         at org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:774) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:379) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:916) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:288) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:491) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.apache.spark.ui.WebUI.$anonfun$bind$3(WebUI.scala:154) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.apache.spark.ui.WebUI.$anonfun$bind$3$adapted(WebUI.scala:154) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) ~[scala-library-2.12.15.jar:?]
>         at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) ~[scala-library-2.12.15.jar:?]
>         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) ~[scala-library-2.12.15.jar:?]
>         at org.apache.spark.ui.WebUI.bind(WebUI.scala:154) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.apache.spark.deploy.history.HistoryServer.bind(HistoryServer.scala:164) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:310) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
>         at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala) ~[spark-core_2.12-3.3.1.5.1-SNAPSHOT.jar:3.3.1.5.1-SNAPSHOT]
> {code}
>  
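For context, the exception is raised inside Jetty's FilterHolder.doStart, which refuses to start a filter whose class does not implement javax.servlet.Filter. The sketch below is illustrative only, not Jetty or Hadoop source: the class FilterTypeCheck and its method are hypothetical names, and the comment about a shaded artifact relocating the Servlet API describes the likely cause suggested by this issue, not a confirmed diagnosis.

{code:java}
import javax.servlet.Filter;

// Illustrative sketch: reproduces the shape of the type check behind
// "class ... is not a javax.servlet.Filter" in the stack trace above.
public class FilterTypeCheck {

    static void requireServletFilter(Class<?> filterClass) {
        // If the AuthenticationFilter on the classpath comes from a shaded
        // artifact whose build relocated the Servlet API, the class implements
        // a relocated Filter interface instead of javax.servlet.Filter, so
        // this isAssignableFrom test fails even though the class name matches
        // what Spark configured.
        if (!Filter.class.isAssignableFrom(filterClass)) {
            throw new IllegalStateException(
                "class " + filterClass.getName() + " is not a javax.servlet.Filter");
        }
    }

    public static void main(String[] args) throws ClassNotFoundException {
        Class<?> authFilter = Class.forName(
            "org.apache.hadoop.security.authentication.server.AuthenticationFilter");
        requireServletFilter(authFilter);
        System.out.println("Filter class is compatible with javax.servlet.Filter");
    }
}
{code}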



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
