Hi Alex,

It is working correctly when I create a Spark session using bin/pyspark,
and I can create multiple sessions as well. The flag that enables this is
"--total-executor-cores 4". I think Livy is not passing it through to Spark?

We are using DC/OS, and Livy doesn't work with the Spark Mesos deployment,
so we had to set up a standalone Spark cluster.
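For reference, the command-line flag that worked here (`--total-executor-cores`) corresponds to the Spark property `spark.cores.max` on a standalone cluster, which can be passed through the `conf` map of a Livy session request. A minimal sketch of such a payload, assuming the same limits as the examples below (the cap of 4 cores is illustrative):

```python
import json

# Session-creation payload for Livy's POST /sessions endpoint.
# On a standalone Spark cluster, "spark.cores.max" (the conf-file form
# of --total-executor-cores) caps the total cores one session may
# claim; without it, a single session can take every core.
payload = {
    "kind": "pyspark",
    "executorMemory": "1G",
    "executorCores": 2,
    "numExecutors": 1,
    "conf": {
        "spark.cores.max": 4,  # hard cap on total executor cores
    },
}

print(json.dumps(payload, indent=2))
```

Sending this body (e.g. with curl or a Python HTTP client) to the Livy server should leave the remaining cluster cores free for other sessions.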





On Fri, Jan 12, 2018, 2:57 AM, Alex Bozarth [email protected] wrote:
Hi Junaid,

From my experience this is an issue with the Spark stand-alone cluster, which is
why Livy is recommended to run with YARN instead, which allocates resources
properly. @Jerry, you have seen more production uses of Livy than me; am I
correct that Livy with a stand-alone Spark cluster can't handle multiple
sessions?


Alex Bozarth
Software Engineer
Spark Technology Center  
--------------------------------------------------------------------------------

E-mail:  [email protected]
GitHub: github.com/ajbozarth

505 Howard Street
San Francisco, CA 94105
United States



From: Junaid Nasir <[email protected]>
To: [email protected]
Date: 01/11/2018 02:22 AM
Subject: Session taking all the available resources even with number of cores
specified



--------------------------------------------------------------------------------



Hi everyone,

I am using Livy 0.4 with a Spark 2.1.0 standalone cluster.
I can create sessions and run jobs, but one session takes up all the available
resources. I have tried setting executorCores and numExecutors, as well as
spark.total.executor.cores. This command works fine when running a session from
the command line:





./spark-2.1.0/bin/pyspark --master spark://master:7077 --executor-cores 2 --num-executors 1 --total-executor-cores 4


POST request to livy:8998/sessions:






{
  "kind": "pyspark",
  "proxyUser": "root",
  "conf": {
    "spark.cassandra.connection.host": "10.128.1.1,10.128.1.2,10.128.1.3",
    "spark.executor.cores": 2,
    "spark.total.executor.cores": 2,
    "livy.spark.driver.cores": 2,
    "livy.spark.executor.cores": 2,
    "livy.spark.executor.instances": 1
  },
  "executorMemory": "1G",
  "executorCores": 2,
  "numExecutors": 1,
  "driverCores": 1,
  "driverMemory": "1G"
}



Is there any configuration I can set to limit the cores, so that I can run
multiple sessions on the same cluster?

Regards
Junaid
