[ 
https://issues.apache.org/jira/browse/LIVY-339?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gyorgy Gal updated LIVY-339:
----------------------------
    Fix Version/s: 0.10.0
                       (was: 0.9.0)

This issue has been moved to the 0.10.0 release as part of a bulk update. If 
you feel this is moved out inappropriately, feel free to provide justification 
and reset the Fix Version to 0.9.0.

> Unable to start spark session if spark.jars.packages are set in spark config.
> -----------------------------------------------------------------------------
>
>                 Key: LIVY-339
>                 URL: https://issues.apache.org/jira/browse/LIVY-339
>             Project: Livy
>          Issue Type: Bug
>          Components: Core, REPL
>    Affects Versions: 0.4.0
>         Environment: Spark 2.1 running on Hadoop 2.8
>            Reporter: Chetan Kumar Bhatt
>            Priority: Major
>             Fix For: 0.10.0
>
>         Attachments: Error.png, Success.png
>
>
> I have added the following configuration to spark-defaults.conf:
> {code}
> spark.jars.packages com.amazonaws:aws-java-sdk:1.11.115,org.apache.hadoop:hadoop-aws:2.8.0
> {code}
> If I start spark-shell with this configuration, spark-shell loads these jars 
> and starts without any issue.
> But when I use Jupyter with Livy, I get an error as shown in the attached 
> Error.png.
> The same Jupyter+Livy+Spark combination works fine with the default Spark 
> configuration, as shown in the attached screenshot Success.png.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
