pan3793 opened a new pull request, #7878:
URL: https://github.com/apache/hadoop/pull/7878

   <!--
     Thanks for sending a pull request!
       1. If this is your first time, please read our contributor guidelines: 
https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
       2. Make sure your PR title starts with JIRA issue id, e.g., 
'HADOOP-17799. Your PR title ...'.
   -->
   
   ### Description of PR
   
   When using the Hadoop client artifacts built from trunk with Spark 4.0.0, a 
`NoClassDefFoundError` is raised.
   
   ```
   Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/shaded/javax/ws/rs/WebApplicationException
        at java.base/java.lang.ClassLoader.defineClass1(Native Method)
        at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:962)
        at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:144)
        at java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:776)
        at java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:691)
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:620)
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:578)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:490)
        at org.apache.spark.deploy.yarn.YarnRMClient.getAmIpFilterParams(YarnRMClient.scala:109)
        at org.apache.spark.deploy.yarn.ApplicationMaster.addAmIpFilter(ApplicationMaster.scala:698)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:555)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:265)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:942)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:941)
        at java.base/jdk.internal.vm.ScopedValueContainer.callWithoutScope(ScopedValueContainer.java:162)
        at java.base/jdk.internal.vm.ScopedValueContainer.call(ScopedValueContainer.java:147)
        at java.base/java.lang.ScopedValue$Carrier.call(ScopedValue.java:419)
        at java.base/javax.security.auth.Subject.callAs(Subject.java:331)
        at org.apache.hadoop.util.SubjectUtil.callAs(SubjectUtil.java:134)
        at org.apache.hadoop.util.SubjectUtil.doAs(SubjectUtil.java:166)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:2039)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:941)
        at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:973)
        at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
   Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.shaded.javax.ws.rs.WebApplicationException
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:580)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:490)
        ... 24 more
   ```
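
   The failure can be reproduced outside Spark with a minimal probe that asks the class loader for the relocated class. The class name is taken from the stack trace above; the probe and its class name are just a sketch, and on any classpath that lacks a runtime jar bundling the relocated JAX-RS classes it reports the class as missing, which is the condition Spark's AM IP filter setup runs into:

   ```java
   public class ShadedClassProbe {
       public static void main(String[] args) {
           // Relocated (shaded) copy of javax.ws.rs.WebApplicationException that
           // hadoop-client-runtime is expected to bundle, per the stack trace.
           String name = "org.apache.hadoop.shaded.javax.ws.rs.WebApplicationException";
           try {
               Class.forName(name);
               System.out.println("present: " + name);
           } catch (ClassNotFoundException e) {
               // Same root cause as the "Caused by" frame in the trace above.
               System.out.println("missing: " + name);
           }
       }
   }
   ```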
   
   ### How was this patch tested?
   
   Hadoop Client Runtime 3.4.2 RC2
   <img width="407" height="459" alt="image" src="https://github.com/user-attachments/assets/9ba6eb0c-2d52-4034-8f97-01c73195d795" />
   
   Hadoop Client Runtime 3.5.0-SNAPSHOT trunk
   <img width="516" height="432" alt="image" src="https://github.com/user-attachments/assets/81115ab8-3f86-4336-9d95-fe61093b09d1" />
   
   Hadoop Client Runtime 3.5.0-SNAPSHOT HADOOP-19652
   <img width="509" height="433" alt="image" src="https://github.com/user-attachments/assets/eb08a2d8-6b52-45f4-b56e-4ea3a92e950a" />
   
   Tested by submitting a Spark application to a YARN cluster.
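
   Before re-running the Spark job, a candidate runtime jar can be checked for the relocated JAX-RS classes directly. The jar filename below is illustrative; point it at the artifact under test:

   ```shell
   # List the relocated javax.ws.rs entries bundled in the runtime jar.
   # A present WebApplicationException.class entry means the shaded class
   # from the stack trace is actually packaged.
   unzip -l hadoop-client-runtime-3.5.0-SNAPSHOT.jar \
     | grep 'org/apache/hadoop/shaded/javax/ws/rs/'
   ```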
   
   ### For code changes:
   
   - [x] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
