[ 
https://issues.apache.org/jira/browse/LIVY-483?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gyorgy Gal updated LIVY-483:
----------------------------
    Fix Version/s: 0.10.0
                       (was: 0.9.0)

This issue has been moved to the 0.10.0 release as part of a bulk update. If 
you feel this is moved out inappropriately, feel free to provide justification 
and reset the Fix Version to 0.9.0.

>  set spark memory configuration by SparkR code not working
> ----------------------------------------------------------
>
>                 Key: LIVY-483
>                 URL: https://issues.apache.org/jira/browse/LIVY-483
>             Project: Livy
>          Issue Type: Bug
>          Components: Core
>    Affects Versions: 0.5.0
>         Environment: Livy version: 0.5 (latest 2018 release) 
> Spark: 2.1.1
> Jupyter Notebook: 4.3.1 
> SparkMagic: 0.11.3
>            Reporter: joyjit das
>            Priority: Major
>             Fix For: 0.10.0
>
>         Attachments: Screenshot from 2018-07-12 10-53-12.png, Screenshot from 
> 2018-07-12 10-53-41.png, Screenshot from 2018-07-12 10-54-41.png, Screenshot 
> from 2018-07-12 10-54-59.png, Screenshot from 2018-07-12 10-57-04.png, 
> Screenshot from 2018-07-12 10-57-25.png
>
>          Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> Hi,
>  
> When we try to allocate Spark memory from SparkR in a Jupyter notebook, 
> Spark always falls back to the default memory configuration. We want to set 
> the Spark memory configuration from SparkR code, but with the current Livy 
> this does not work. When we try the same thing with PySpark, it works 
> properly. I am attaching screenshots of both the PySpark and SparkR attempts. 
> If you need any further information regarding this issue, please let us know.
>  
> Livy version: 0.5 (latest 2018 release) 
> Spark: 2.1.1
> Jupyter Notebook: 4.3.1 
> SparkMagic: 0.11.3
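
[Editor's note] A likely explanation, not confirmed in the report: under Livy the SparkContext is created on the Livy server when the session starts, so memory settings passed afterwards from SparkR code in the notebook arrive too late to take effect. The usual workaround with the sparkmagic/Livy stack is to supply the settings at session-creation time via sparkmagic's %%configure magic, which forwards them to Livy's POST /sessions request. A sketch, with illustrative values:

```
%%configure -f
{"driverMemory": "4G", "executorMemory": "4G", "executorCores": 2}
```

Run this cell before any other code cell; the -f flag tells sparkmagic to drop the current Livy session and start a new one with these settings.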



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
