Hello,

What do you mean by "app that uses 2 cores and 8G of RAM"?

Spark apps generally involve multiple processes. The command-line
options you used affect only one of them (the driver). You may want to
take a look at the analogous configuration for executors (e.g.
spark.executor.memory). Also, check the documentation:
http://spark.apache.org/docs/latest/configuration.html
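In standalone mode, per-application resources are usually capped with the
executor settings rather than the driver ones. A rough sketch of what that
could look like — the flag and property names are from the Spark configuration
docs, while the master host and application file are placeholders:

```shell
# Cap the whole application at 2 cores and give each executor 8 GB.
# --total-executor-cores applies to standalone (and Mesos) mode.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 2 \
  --executor-memory 8G \
  my_app.py

# Roughly equivalent defaults in conf/spark-defaults.conf:
#   spark.cores.max        2
#   spark.executor.memory  8g
```

The same flags should work with the pyspark shell, since it goes through
spark-submit as well.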


On Wed, Dec 10, 2014 at 11:59 AM, 9000revs <9000r...@gmail.com> wrote:
> I am using CDH5.1 and Spark 1.0.0.
>
> Trying to configure resources to be allocated to each application. How do I
> do this? For example, I would like each app to use 2 cores and 8G of RAM. I
> have tried using the pyspark command-line parameters --driver-memory and
> --driver-cores, and see no effect of those changes in the Spark Master web UI
> when the app is started.
>
> Is there any way to do this from inside Cloudera Manager as well?
>
> Thanks.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-0-0-Standalone-mode-config-tp20609.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
