Sorry, I don't have the bandwidth to support 2.12 myself, but I can help
review it if someone can do this.

Thanks
Saisai

<santosh.dan...@ubs.com> wrote on Wed, Jun 5, 2019 at 10:09 AM:

> Hi Saisai,
>
> I’m not familiar with Livy code.  We’re just using it for our Jupyter
> integration.
>
> I’m looking through the PR for the 2.11 migration that was done a year ago,
> and it looks like it is mostly POM changes.  If that’s not correct, then I
> might need help to perform the upgrade.
>
> Do you have bandwidth to make this change?
>
>
>
>
> From: Saisai Shao <sai.sai.s...@gmail.com>
> Date: Tuesday, Jun 04, 2019, 8:56 PM
> To: user@livy.incubator.apache.org
> Subject: [External] Re: Support for Livy with Scala 2.12
>
> If you're familiar with the Livy code, I think the effort is not so big.
> Based on my previous experience with Scala 2.10 support, some code may
> need to be changed because of Scala version incompatibilities.
>
> Thanks
> Saisai
>
> <santosh.dan...@ubs.com> wrote on Tue, Jun 4, 2019 at 8:25 PM:
> How much effort do we need to put in to create a 2.12 module? Is that just
> a change in the POM files, or is a code change required?
>
> We have a release planned for July to upgrade Jupyter and Livy to use
> Spark 2.4.2.  This is blocking us from upgrading.
> From: Saisai Shao <sai.sai.s...@gmail.com>
> Date: Monday, Jun 03, 2019, 9:02 PM
> To: user@livy.incubator.apache.org
> Subject: [External] Re: Support for Livy with Scala 2.12
>
> Like what we did before to support both Scala 2.10 and 2.11 in Livy, I
> think we should also have a new module to support 2.12.
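For context, a rough sketch of what such a module could look like in the parent POM, by analogy with the existing 2.11 setup (the profile name, module path, and version numbers here are assumptions for illustration, not the actual Livy build layout):

```xml
<!-- Hypothetical profile selecting Scala 2.12 artifacts -->
<profile>
  <id>scala-2.12</id>
  <properties>
    <scala.binary.version>2.12</scala.binary.version>
    <scala.version>2.12.8</scala.version>
  </properties>
  <modules>
    <!-- 2.12-specific sources, mirroring the existing 2.11 module -->
    <module>repl/scala-2.12</module>
  </modules>
</profile>
```

With such a profile in place, the build would be invoked as something like `mvn clean package -DskipTests -Pscala-2.12` (again, the profile name is assumed).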
>
> <santosh.dan...@ubs.com> wrote on Tue, Jun 4, 2019 at 7:40 AM:
> Yes, the Spark binary we downloaded is built with Scala 2.12 by default.
> We want to use Databricks Delta, which I think only supports Scala 2.12,
> so I'm stuck with Scala 2.12.  Moreover, the Spark community is going to
> drop Scala 2.11 completely from the Spark 3.0 release.  We might need to
> prepare Livy to support Scala 2.12 by default.
>
> From: Kevin Risden [mailto:kris...@apache.org]
> Sent: Monday, June 03, 2019 6:35 PM
> To: user@livy.incubator.apache.org
> Subject: [External] Re: Support for Livy with Scala 2.12
>
> Looks like the issue might be Spark 2.4.2 only? From
> https://spark.apache.org/downloads.html, "Note that, Spark is pre-built
> with Scala 2.11 except version 2.4.2, which is pre-built with Scala 2.12."
> So maybe you just got unlucky with using Spark 2.4.2?
>
> Kevin Risden
>
>
> On Mon, Jun 3, 2019 at 6:19 PM <santosh.dan...@ubs.com> wrote:
> Kevin,
>
> I'm using Livy 0.6.0.  The issue is related to not finding REPL jars that
> support Scala 2.12.  The error "requirement failed: Cannot find Livy REPL
> jars." is thrown because Livy couldn't find the folder repl_2.12-jars
> under the Livy directory.
>
> To confirm that this issue is related to Scala 2.12 compatibility, I
> copied the contents of repl_2.11-jars under the Livy directory into a new
> directory, LIVY/repl_2.12-jars.  This time I didn't get the REPL jars
> exception; Livy went ahead and created the session, but the session failed
> to start due to RSC jar version incompatibility.
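That experiment can be reproduced mechanically; here is a sketch of the same workaround against a simulated Livy home (the paths and jar name are stand-ins, not the real installation):

```shell
# Simulate the Livy home layout described above (stand-in paths only).
LIVY_HOME="$(mktemp -d)/livy"
mkdir -p "$LIVY_HOME/repl_2.11-jars"
touch "$LIVY_HOME/repl_2.11-jars/livy-repl_2.11-0.6.0-incubating.jar"

# The workaround: Livy's startup check only looks for a directory named
# repl_<scala-binary-version>-jars, so mirroring the 2.11 jars as 2.12
# gets past the check -- but, as seen in the thread, the jars are still
# built for Scala 2.11, so the session later fails with an RSC mismatch.
cp -r "$LIVY_HOME/repl_2.11-jars" "$LIVY_HOME/repl_2.12-jars"

ls "$LIVY_HOME"
```

This only gets past the startup requirement; it does not make the session usable, as the RSC failure later in this thread shows.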
>
> Livy folder structure for the error "requirement failed: Cannot find Livy
> REPL jars.":
>
> [/app/risk/ha02/livy]$ ls -ltr
> total 116
> -rwxr-xr-x 1 agriddev agriddev   160 Mar 19 14:39 NOTICE
> -rwxr-xr-x 1 agriddev agriddev 18665 Mar 19 14:39 LICENSE
> -rwxr-xr-x 1 agriddev agriddev   537 Mar 19 14:39 DISCLAIMER
> -rwxr-xr-x 1 agriddev agriddev 46355 Mar 19 14:42 THIRD-PARTY
> drwxr-xr-x 2 agriddev agriddev  4096 Mar 19 14:43 bin
> drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 repl_2.11-jars
> drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 rsc-jars
> drwxr-xr-x 2 agriddev agriddev 12288 Apr 14 22:37 jars
> drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37
> apache-livy-0.6.0-incubating-bin
> drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 17:37 conf
> drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 21:51 logs
>
> Livy folder structure to bypass "requirement failed: Cannot find Livy REPL
> jars.":
>
> [/app/risk/ha02/livy]$ ls -ltr
> total 116
> -rwxr-xr-x 1 agriddev agriddev   160 Mar 19 14:39 NOTICE
> -rwxr-xr-x 1 agriddev agriddev 18665 Mar 19 14:39 LICENSE
> -rwxr-xr-x 1 agriddev agriddev   537 Mar 19 14:39 DISCLAIMER
> -rwxr-xr-x 1 agriddev agriddev 46355 Mar 19 14:42 THIRD-PARTY
> drwxr-xr-x 2 agriddev agriddev  4096 Mar 19 14:43 bin
> drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 repl_2.11-jars
> drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37 rsc-jars
> drwxr-xr-x 2 agriddev agriddev 12288 Apr 14 22:37 jars
> drwxr-xr-x 2 agriddev agriddev  4096 Apr 14 22:37
> apache-livy-0.6.0-incubating-bin
> drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 17:37 conf
> drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 21:50 repl_2.12-jars
> drwxr-xr-x 2 agriddev agriddev  4096 Jun  3 21:51 logs
>
>
>
> Error Information
>
> 19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO
> SecurityManager: Changing view acls to: agriddev
> 19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO
> SecurityManager: Changing modify acls to: agriddev
> 19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO
> SecurityManager: Changing view acls groups to:
> 19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO
> SecurityManager: Changing modify acls groups to:
> 19/06/03 21:52:00 INFO LineBufferedStream: 19/06/03 21:52:00 INFO
> SecurityManager: SecurityManager: authentication disabled; ui acls
> disabled; users  with view permissions: Set(agriddev); groups with view
> permissions: Set(); users  with modify permissions: Set(agriddev); groups
> with modify permissions: Set()
> 19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO Client:
> Submitting application application_1559316432251_0172 to ResourceManager
> 19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO
> YarnClientImpl: Submitted application application_1559316432251_0172
> 19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO Client:
> Application report for application_1559316432251_0172 (state: ACCEPTED)
> 19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO Client:
> 19/06/03 21:52:01 INFO LineBufferedStream:       client token: N/A
> 19/06/03 21:52:01 INFO LineBufferedStream:       diagnostics: [Mon Jun 03
> 21:52:01 +0000 2019] Application is Activated, waiting for resources to be
> assigned for AM.  Details : AM Partition = <DEFAULT_PARTITION> ; Partition
> Resource = <memory:2150400, vCores:180> ; Queue's Absolute capacity = 100.0
> % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity
> = 100.0 % ;
> 19/06/03 21:52:01 INFO LineBufferedStream:       ApplicationMaster host:
> N/A
> 19/06/03 21:52:01 INFO LineBufferedStream:       ApplicationMaster RPC
> port: -1
> 19/06/03 21:52:01 INFO LineBufferedStream:       queue: default
> 19/06/03 21:52:01 INFO LineBufferedStream:       start time: 1559598721629
> 19/06/03 21:52:01 INFO LineBufferedStream:       final status: UNDEFINED
> 19/06/03 21:52:01 INFO LineBufferedStream:       tracking URL:
> http://xzur1315dap.zur.swissbank.com:8088/proxy/application_1559316432251_0172/
> 19/06/03 21:52:01 INFO LineBufferedStream:       user: agriddev
> 19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO
> ShutdownHookManager: Shutdown hook called
> 19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO
> ShutdownHookManager: Deleting directory
> /tmp/spark-74eee398-fced-4173-8682-b512a95adea6
> 19/06/03 21:52:01 INFO LineBufferedStream: 19/06/03 21:52:01 INFO
> ShutdownHookManager: Deleting directory
> /app/risk/ds2/nvme0n1/ds2_spark_cluster_integration/tmp/spark_tmp/spark-91e38128-8064-409b-b5de-3d012b6ad81d
> 19/06/03 21:52:08 WARN RSCClient: Client RPC channel closed unexpectedly.
> 19/06/03 21:52:08 WARN RSCClient: Error stopping RPC.
> io.netty.util.concurrent.BlockingOperationException:
> DefaultChannelPromise@21867c24(uncancellable)
>         at
> io.netty.util.concurrent.DefaultPromise.checkDeadLock(DefaultPromise.java:394)
>         at
> io.netty.channel.DefaultChannelPromise.checkDeadLock(DefaultChannelPromise.java:157)
>         at
> io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:230)
>         at
> io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:129)
>         at
> io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:28)
>         at
> io.netty.util.concurrent.DefaultPromise.sync(DefaultPromise.java:336)
>         at
> io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:117)
>         at
> io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:28)
>         at org.apache.livy.rsc.rpc.Rpc.close(Rpc.java:310)
>         at org.apache.livy.rsc.RSCClient.stop(RSCClient.java:232)
>         at org.apache.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:129)
>         at org.apache.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:123)
>         at org.apache.livy.rsc.Utils$2.operationComplete(Utils.java:108)
>         at
> io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:518)
>         at
> io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:492)
>         at
> io.netty.util.concurrent.DefaultPromise.notifyListenersWithStackOverFlowProtection(DefaultPromise.java:431)
>         at
> io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
>         at
> io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:108)
>         at
> io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
>         at
> io.netty.channel.AbstractChannel$CloseFuture.setClosed(AbstractChannel.java:995)
>         at
> io.netty.channel.AbstractChannel$AbstractUnsafe.doClose0(AbstractChannel.java:621)
>         at
> io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:599)
>         at
> io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:543)
>         at
> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.closeOnRead(AbstractNioByteChannel.java:71)
>         at
> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:158)
>         at
> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:564)
>         at
> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:505)
>         at
> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:419)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:391)
>         at
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112)
>         at java.lang.Thread.run(Thread.java:748)
> 19/06/03 21:52:08 INFO RSCClient: Failing pending job
> 4d7789d1-a460-48c0-85fe-63fa24bb95a8 due to shutdown.
> 19/06/03 21:52:08 INFO InteractiveSession: Stopping InteractiveSession 0...
> 19/06/03 21:52:08 INFO InteractiveSession: Failed to ping RSC driver for
> session 0. Killing application.
> 19/06/03 21:52:09 INFO YarnClientImpl: Killed application
> application_1559316432251_0172
> 19/06/03 21:52:09 INFO InteractiveSession: Stopped InteractiveSession 0.
> ^C
>
>
>
> From: Kevin Risden [mailto:kris...@apache.org]
> Sent: Monday, June 03, 2019 4:46 PM
> To: user@livy.incubator.apache.org
> Cc: Dandey, Santosh
> Subject: [External] Re: Support for Livy with Scala 2.12
>
>
> "requirement failed: Cannot find Livy REPL jars."
>
>
> I didn't look at where that error comes from, but my guess is that Livy is
> no longer pointing to the right location where it can find Spark.
> Hopefully I'm not sending you on a wild goose chase, but I would check
> there first.
>
> Also, make sure you are on Livy 0.6.0+, since Spark 2.4 support was only
> added in 0.6.0 and above.
>
> Kevin Risden
>
>
> On Mon, Jun 3, 2019 at 4:04 PM Pat Ferrel <p...@occamsmachete.com> wrote:
> Spark 2.4.x does not require Scala 2.12; in fact, it is marked as
> “experimental” here:
> https://spark.apache.org/releases/spark-release-2-4-0.html
>
> Moving to a new Scala version is often a pain, because the libs you use
> may not have been upgraded, and versions matter (unlike typical Java
> updates). Scala creates JVM objects and names them as it pleases.
> Sometimes naming changes from version to version of Scala, and this causes
> big problems when mixing libs built with different versions of Scala.
>
> I’m no expert in Livy, but I imagine you may need to build against a newer
> Spark. But avoid Scala 2.12 for now.
>
> From: <santosh.dan...@ubs.com>
> Reply: user@livy.incubator.apache.org
> Date: June 3, 2019 at 12:51:20 PM
> To: user@livy.incubator.apache.org
> Subject:  Support for Livy with Scala 2.12
>
> Hi,
>
> We have just upgraded our Spark cluster from version 2.3 to 2.4.2, and it
> broke Livy.  It's throwing the exception "Cannot Find Livy REPL Jars".  It
> looks like I have to build Livy with Scala 2.12.
>
> Can anyone advise how to build Livy with Scala 2.12 using Maven? Would
> simply changing the Scala version from 2.11 to 2.12 be enough to build
> Livy? Please advise.
>
>
>
>
> The code failed because of a fatal error:
>
>         Invalid status code '400' from http://localhost:8998/sessions
> with error payload: {"msg":"requirement failed: Cannot find Livy REPL
> jars."}.
>
> Thanks
> Santosh
>
> Please visit our website at
> http://financialservicesinc.ubs.com/wealth/E-maildisclaimer.html
> for important disclosures and information about our e-mail
> policies. For your protection, please do not transmit orders
> or instructions by e-mail or include account numbers, Social
> Security numbers, credit card numbers, passwords, or other
> personal information.
>
>
