Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/9615#issuecomment-157539491
I think I have a different issue cropping up that won't be solved by just
not including the META-INF/services/javax.ws.rs files in the assembly.
For context, my application pulls in the spark-* artifacts as Maven
dependencies and just spawns a Spark context in its own JVM to connect to a
Spark cluster.
I ran the following experiment with my app:
1. Use a Spark assembly jar that merely excludes the javax.ws.rs files, as
@vanzin had done in the above investigations. Start the Spark standalone master
and worker daemons.
2. In my application, I pulled in Jersey 2 and explicitly excluded the
Jersey 1 dependencies. I checked my application's "lib" directory and
verified that no Jersey 1.x jars were on my app's classpath; only
Jersey 2.x jars were being loaded.
3. My Spark context should connect to the cluster I'm running on localhost.
4. My application fails at starting the Spark context with the following
exception:
```
Caused by: java.lang.NoClassDefFoundError:
com/sun/jersey/spi/container/servlet/ServletContainer
at
org.apache.spark.status.api.v1.ApiRootResource$.getServletHandler(ApiRootResource.scala:174)
~[spark-core_2.10-1.4.1-palantir3.jar:1.4.1-palantir3]
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:68)
~[spark-core_2.10-1.4.1-palantir3.jar:1.4.1-palantir3]
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:74)
~[spark-core_2.10-1.4.1-palantir3.jar:1.4.1-palantir3]
```
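To make the failure mode concrete: the stack trace shows spark-core's
`ApiRootResource` referencing Jersey 1's
`com.sun.jersey.spi.container.servlet.ServletContainer` by name, so that
class must be loadable at runtime regardless of what the application itself
depends on. A minimal, self-contained sketch (the class name
`JerseyPresenceCheck` is mine, not from the PR) that reproduces the same
lookup:

```java
// Sketch of the failure mode seen above: spark-core looks up Jersey 1's
// ServletContainer class by name. If only Jersey 2 jars are on the
// classpath, this lookup fails, just like the NoClassDefFoundError in
// the stack trace.
public class JerseyPresenceCheck {
    // Returns true iff Jersey 1's servlet container class is loadable.
    static boolean jersey1Present() {
        try {
            Class.forName("com.sun.jersey.spi.container.servlet.ServletContainer");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // On a classpath with only Jersey 2 jars (or neither), prints false.
        System.out.println("Jersey 1 on classpath: " + jersey1Present());
    }
}
```

So excluding Jersey 1 from the app's classpath cannot work on its own: the
Spark 1.4.x UI code still needs those classes at runtime.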
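For reference, the Jersey 1 exclusion from step 2 looked roughly like the
following pom fragment (a sketch: the version numbers are illustrative,
though `com.sun.jersey` is the real Jersey 1 groupId):

```xml
<!-- Illustrative sketch of step 2: depend on spark-core but exclude its
     transitive Jersey 1 artifacts. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
  <exclusions>
    <exclusion>
      <groupId>com.sun.jersey</groupId>
      <artifactId>jersey-server</artifactId>
    </exclusion>
    <exclusion>
      <groupId>com.sun.jersey</groupId>
      <artifactId>jersey-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

As the exception above shows, this exclusion is exactly what removes the
Jersey 1 classes that spark-core 1.4.x still needs at runtime.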