[ https://issues.apache.org/jira/browse/LIVY-590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16824936#comment-16824936 ]

Marco Gaido commented on LIVY-590:
----------------------------------

Hi [~tanakahda], thanks for reporting this. Livy works with PRs rather than 
patches (please see 
[https://livy.incubator.apache.org/community/#contributing]). Could you please 
create a PR? We can discuss the details of the issue there.

Moreover, it would be great if you could explain in the description of the PR 
why this happens in your environment but not in the CI or other environments.

Thanks.

> ClassNotFoundException: javax.ws.rs.ext.MessageBodyReader on Livy 0.6.0
> -----------------------------------------------------------------------
>
>                 Key: LIVY-590
>                 URL: https://issues.apache.org/jira/browse/LIVY-590
>             Project: Livy
>          Issue Type: Bug
>          Components: Server
>    Affects Versions: 0.6.0
>            Reporter: Aki Tanaka
>            Priority: Major
>         Attachments: LIVY-590.patch
>
>
> After I upgraded Livy to 0.6.0-incubating, running Spark jobs through Livy 
> started failing. The details of the problem are below.
> 1. When I start the Livy server, the following message is logged:
> {quote}19/04/18 23:13:35 WARN NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
>  java.lang.NoClassDefFoundError: javax/ws/rs/ext/MessageBodyReader
>  at java.lang.ClassLoader.defineClass1(Native Method)
>  at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>  ..
>  Caused by: java.lang.ClassNotFoundException: 
> javax.ws.rs.ext.MessageBodyReader
>  at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>  ... 50 more
> {quote}
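>  
> (A hypothetical way to confirm the class is absent from the shipped jars, 
> assuming LIVY_HOME points at the install; this check is not part of the 
> original report:)
> {code:bash}
> # Look for the JAX-RS API class in the jars shipped with the Livy server.
> # No output means javax.ws.rs.ext.MessageBodyReader is not on the classpath.
> for j in "$LIVY_HOME"/jars/*.jar; do
>   unzip -l "$j" 2>/dev/null | grep -q 'javax/ws/rs/ext/MessageBodyReader.class' \
>     && echo "found in $j"
> done
> {code}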
>  
> 2. When I submit a Spark job using Livy, the job state is stuck at 
> "starting", and Livy cannot get the job's appId.
> {quote}$ curl http://10.10.144.20:8998/batches
> {
>   "from": 0,
>   "total": 1,
>   "sessions": [
>     {
>       "id": 0,
>       "name": null,
>       "state": "starting",
>       "appId": null,
>       "appInfo": { "driverLogUrl": null, "sparkUiUrl": null },
>       "log": [
>         "19/04/18 20:28:58 INFO MemoryStore: MemoryStore cleared",
>         "19/04/18 20:28:58 INFO BlockManager: BlockManager stopped",
>         "19/04/18 20:28:58 INFO BlockManagerMaster: BlockManagerMaster stopped",
>         "19/04/18 20:28:58 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!",
>         "19/04/18 20:28:58 INFO SparkContext: Successfully stopped SparkContext",
>         "19/04/18 20:28:58 INFO ShutdownHookManager: Shutdown hook called",
>         "19/04/18 20:28:58 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-b8039adb-f3df-4526-8123-1bb2aee6ed7c",
>         "19/04/18 20:28:58 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-b48219fd-2607-4a7d-95bd-538c89f90ebb",
>         "\nstderr: ",
>         "\nYARN Diagnostics: "
>       ]
>     }
>   ]
> }
> {quote}
>  
> This is caused by the Livy package no longer including the jersey-core jar 
> file; this change was introduced by LIVY-502.
> I think the Livy package should include the jersey-core jar file. After 
> modifying server/pom.xml (I attached the diff to this JIRA), I was able to 
> run Spark jobs without this error.
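>
> For reference, a minimal sketch of that kind of server/pom.xml change (the 
> actual diff is in the attached LIVY-590.patch; the artifact coordinates and 
> version below are assumptions, not taken from the patch):
> {code:xml}
> <!-- jersey-core 1.x bundles the javax.ws.rs API classes, including
>      javax.ws.rs.ext.MessageBodyReader, so shipping it with the Livy
>      server restores the missing classes at runtime -->
> <dependency>
>   <groupId>com.sun.jersey</groupId>
>   <artifactId>jersey-core</artifactId>
>   <version>1.19</version>
> </dependency>
> {code}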



