Kevin,

I'm using Livy 0.6.0.  The issue is related to not finding REPL jars that 
support Scala 2.12.  The error "requirement failed: Cannot find Livy REPL 
jars." is thrown because Livy couldn't find the repl_2.12-jars folder under the 
Livy directory.

To confirm that this issue is related to Scala 2.12 compatibility, I copied the 
contents of repl_2.11-jars under the Livy directory into a new directory, 
LIVY/repl_2.12-jars. This time I didn't get the REPL jars exception; Livy went 
ahead and created the session, but the session then failed to start due to an 
RSC jars version incompatibility.
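
For reference, this is the test as shell commands (a sketch only; the install 
path is the one from the listings below, and the target folder name is the one 
the error message implies Livy looks for):

# Workaround test only: the copied jars are still Scala 2.11 builds, so this
# merely bypasses the REPL-jars check; the session later fails on an RSC jar
# version incompatibility, as described above.
cd /app/risk/ha02/livy
mkdir -p repl_2.12-jars
cp repl_2.11-jars/* repl_2.12-jars/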

Livy folder structure for the error "requirement failed: Cannot find Livy REPL 
jars.":

[/app/risk/ha02/livy]$ ls -ltr
total 116
-rwxr-xr-x 1 agriddev agriddev   160 Mar 19 14:39 NOTICE
-rwxr-xr-x 1 agriddev agriddev 18665 Mar 19 14:39 LICENSE
-rwxr-xr-x 1 agriddev agriddev   537 Mar 19 14:39 DISCLAIMER
-rwxr-xr-x 1 agriddev agriddev 46355 Mar 19 14:42 THIRD-PARTY
drwxr-xr-x 2 agriddev agriddev  4096 Mar 19 14:43 bin
drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 repl_2.11-jars
drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 rsc-jars
drwxr-xr-x 2 agriddev agriddev 12288 Apr 14 22:37 jars
drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 apache-livy-0.6.0-incubating-bin
drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 17:37 conf
drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 21:51 logs

Livy folder structure to bypass "requirement failed: Cannot find Livy REPL jars.":

[/app/risk/ha02/livy]$ ls -ltr
total 116
-rwxr-xr-x 1 agriddev agriddev   160 Mar 19 14:39 NOTICE
-rwxr-xr-x 1 agriddev agriddev 18665 Mar 19 14:39 LICENSE
-rwxr-xr-x 1 agriddev agriddev   537 Mar 19 14:39 DISCLAIMER
-rwxr-xr-x 1 agriddev agriddev 46355 Mar 19 14:42 THIRD-PARTY
drwxr-xr-x 2 agriddev agriddev  4096 Mar 19 14:43 bin
drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 repl_2.11-jars
drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 rsc-jars
drwxr-xr-x 2 agriddev agriddev 12288 Apr 14 22:37 jars
drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 apache-livy-0.6.0-incubating-bin
drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 17:37 conf
drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 21:50 repl_2.12-jars
drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 21:51 logs



Error Information

19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO 
SecurityManager: Changing view acls to: agriddev
19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO 
SecurityManager: Changing modify acls to: agriddev
19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO 
SecurityManager: Changing view acls groups to:
19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO 
SecurityManager: Changing modify acls groups to:
19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO 
SecurityManager: SecurityManager: authentication disabled; ui acls disabled; 
users  with view permissions: Set(agriddev); groups with view permissions: 
Set(); users  with modify permissions: Set(agriddev); groups with modify 
permissions: Set()
19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO Client: 
Submitting application application_1559316432251_0172 to ResourceManager
19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO 
YarnClientImpl: Submitted application application_1559316432251_0172
19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO Client: 
Application report for application_1559316432251_0172 (state: ACCEPTED)
19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO Client:
19/06/03 21:52:01 INFO LineBufferedStream:       client token: N/A
19/06/03 21:52:01 INFO LineBufferedStream:       diagnostics: [Mon Jun 03 
21:52:01 +0000 2019] Application is Activated, waiting for resources to be 
assigned for AM.  Details : AM Partition = <DEFAULT_PARTITION> ; Partition 
Resource = <memory:2150400, vCores:180> ; Queue's Absolute capacity = 100.0 % ; 
Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 
% ;
19/06/03 21:52:01 INFO LineBufferedStream:       ApplicationMaster host: N/A
19/06/03 21:52:01 INFO LineBufferedStream:       ApplicationMaster RPC port: -1
19/06/03 21:52:01 INFO LineBufferedStream:       queue: default
19/06/03 21:52:01 INFO LineBufferedStream:       start time: 1559598721629
19/06/03 21:52:01 INFO LineBufferedStream:       final status: UNDEFINED
19/06/03 21:52:01 INFO LineBufferedStream:       tracking URL: 
http://xzur1315dap.zur.swissbank.com:8088/proxy/application_1559316432251_0172/
19/06/03 21:52:01 INFO LineBufferedStream:       user: agriddev
19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO 
ShutdownHookManager: Shutdown hook called
19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO 
ShutdownHookManager: Deleting directory 
/tmp/spark-74eee398-fced-4173-8682-b512a95adea6
19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO 
ShutdownHookManager: Deleting directory 
/app/risk/ds2/nvme0n1/ds2_spark_cluster_integration/tmp/spark_tmp/spark-91e38128-8064-409b-b5de-3d012b6ad81d
19/06/03 21:52:08 WARN RSCClient: Client RPC channel closed unexpectedly.
19/06/03 21:52:08 WARN RSCClient: Error stopping RPC.
io.netty.util.concurrent.BlockingOperationException: 
DefaultChannelPromise@21867c24(uncancellable)
        at 
io.netty.util.concurrent.DefaultPromise.checkDeadLock(DefaultPromise.java:394)
        at 
io.netty.channel.DefaultChannelPromise.checkDeadLock(DefaultChannelPromise.java:157)
        at 
io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:230)
        at 
io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:129)
        at 
io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:28)
        at io.netty.util.concurrent.DefaultPromise.sync(DefaultPromise.java:336)
        at 
io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:117)
        at 
io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:28)
        at org.apache.livy.rsc.rpc.Rpc.close(Rpc.java:310)
        at org.apache.livy.rsc.RSCClient.stop(RSCClient.java:232)
        at org.apache.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:129)
        at org.apache.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:123)
        at org.apache.livy.rsc.Utils$2.operationComplete(Utils.java:108)
        at 
io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:518)
        at 
io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:492)
        at 
io.netty.util.concurrent.DefaultPromise.notifyListenersWithStackOverFlowProtection(DefaultPromise.java:431)
        at 
io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
        at 
io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:108)
        at 
io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
        at 
io.netty.channel.AbstractChannel$CloseFuture.setClosed(AbstractChannel.java:995)
        at 
io.netty.channel.AbstractChannel$AbstractUnsafe.doClose0(AbstractChannel.java:621)
        at 
io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:599)
        at 
io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:543)
        at 
io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.closeOnRead(AbstractNioByteChannel.java:71)
        at 
io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:158)
        at 
io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:564)
        at 
io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:505)
        at 
io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:419)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:391)
        at 
io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112)
        at java.lang.Thread.run(Thread.java:748)
19/06/03 21:52:08 INFO RSCClient: Failing pending job 
4d7789d1-a460-48c0-85fe-63fa24bb95a8 due to shutdown.
19/06/03 21:52:08 INFO InteractiveSession: Stopping InteractiveSession 0...
19/06/03 21:52:08 INFO InteractiveSession: Failed to ping RSC driver for 
session 0. Killing application.
19/06/03 21:52:09 INFO YarnClientImpl: Killed application 
application_1559316432251_0172
19/06/03 21:52:09 INFO InteractiveSession: Stopped InteractiveSession 0.
^C



From: Kevin Risden [mailto:kris...@apache.org]
Sent: Monday, June 03, 2019 4:46 PM
To: user@livy.incubator.apache.org
Cc: Dandey, Santosh
Subject: [External] Re: Support for Livy with Scala 2.12


"requirement failed: Cannot find Livy REPL jars."


I didn't look at where that error comes from, but my guess is that Livy is no 
longer pointing to the right location where it can find Spark. Hopefully I'm 
not sending you on a wild goose chase, but I would check there first.

Also, you need to make sure you are on Livy 0.6.0+, since Spark 2.4 support was 
only added in 0.6.0 and above.
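
One quick way to check (a sketch; SPARK_HOME in conf/livy-env.sh is the usual 
place Livy is pointed at a Spark install, and the paths below are placeholders):

# See which Spark installation the Livy server is configured to use.
grep -i SPARK_HOME /path/to/livy/conf/livy-env.sh
# Confirm that installation really is the Spark 2.4.x you upgraded to.
/path/to/spark/bin/spark-submit --version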

Kevin Risden


On Mon, Jun 3, 2019 at 4:04 PM Pat Ferrel <p...@occamsmachete.com> wrote:
Spark 2.4.x does not require Scala 2.12; in fact, it is marked as "experimental" 
here: https://spark.apache.org/releases/spark-release-2-4-0.html

Moving to a new Scala version is often a pain, because the libs you use may not 
be upgraded, and versions matter (unlike typical Java updates). Scala creates JVM 
objects and names them as it pleases. Sometimes naming changes from version to 
version of Scala, and this causes big problems when using mixed libs from 
different versions of Scala.

I’m no expert in Livy, but I imagine you may need to build it against a newer 
Spark. But avoid Scala 2.12 for now.

From: santosh.dan...@ubs.com
Reply: user@livy.incubator.apache.org
Date: June 3, 2019 at 12:51:20 PM
To: user@livy.incubator.apache.org
Subject: Support for Livy with Scala 2.12


Hi,

We have just upgraded our Spark cluster from version 2.3 to 2.4.2, and it broke 
Livy.  It's throwing the exception "Cannot find Livy REPL jars".  Looks like I 
have to build Livy with Scala 2.12.

Can anyone advise how to build Livy with Scala 2.12 using Maven? Would changing 
the Scala version from 2.11 to 2.12 be enough to build Livy? Please advise.
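
For what it's worth, the stock source build command is below; whether overriding 
the Scala version is enough is exactly the open question (a sketch, assuming a 
build from the Livy source tree; the property overrides are an assumption, since 
Livy 0.6.0 ships no official Scala 2.12 profile and every module dependency 
would also need Scala 2.12 artifacts):

# Standard Livy source build (run from the Livy source root).
mvn clean package -DskipTests
# Assumed, unverified attempt at a 2.12 build; the property names may not match
# the ones actually used in pom.xml, so check the pom before relying on this.
mvn clean package -DskipTests -Dscala.binary.version=2.12 -Dscala.version=2.12.8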




The code failed because of a fatal error:

        Invalid status code '400' from http://localhost:8998/sessions with 
error payload: {"msg":"requirement failed: Cannot find Livy REPL jars."}.
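
For reference, the same 400 can be reproduced directly against the Livy REST API 
with a plain session-creation request (the endpoint and payload below are the 
standard Livy REST API; the host and port are taken from the error above):

# Create an interactive Spark session; when the server cannot locate its REPL
# jars, it answers HTTP 400 with the payload shown above.
curl -s -X POST -H "Content-Type: application/json" \
     -d '{"kind": "spark"}' http://localhost:8998/sessions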

Thanks
Santosh

Please visit our website at
http://financialservicesinc.ubs.com/wealth/E-maildisclaimer.html
for important disclosures and information about our e-mail
policies. For your protection, please do not transmit orders
or instructions by e-mail or include account numbers, Social
Security numbers, credit card numbers, passwords, or other
personal information.
