Patrick: It looks to me like this configures the cluster before startup.
The setting that I want to change is the amount of memory available to
each task (by default it's 512m). It appears that this is a property of
the job itself rather than the cluster.
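For the archive: the per-task memory described above corresponds to the spark.executor.memory system property, and the pull request quoted below exposes a way to set it from Python. A minimal sketch, assuming the classmethod is named setSystemProperty as in that pull request (check the PR for the final name; this requires a local Spark installation):

```python
# Sketch only: the classmethod name and the property name are taken from
# the pull request and configuration docs linked below in this thread.
# System properties must be set BEFORE any SparkContext is created,
# because they are read when the JVM starts.
from pyspark import SparkContext

SparkContext.setSystemProperty("spark.executor.memory", "2g")

sc = SparkContext("local", "memory-example")
```

Once the context exists, changing the property has no effect, so this has to run at the very top of the job script.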

Josh: I'm not sure about getting the latest version from GitHub because
I'm new to Spark. I didn't even manage to build the package from source
and had to download the binaries instead.
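Without building from source, one possible workaround is to export the variable Patrick mentions (SPARK_JAVA_OPTS) from the Python process itself, before PySpark launches the JVM. A sketch, assuming spark.executor.memory is the property to set (see the configuration docs linked below):

```python
# Set SPARK_JAVA_OPTS in the environment before any SparkContext is
# created; PySpark reads the environment when it starts the JVM, so
# a context created after this point should inherit the setting.
import os

os.environ["SPARK_JAVA_OPTS"] = "-Dspark.executor.memory=2g"

print(os.environ["SPARK_JAVA_OPTS"])
```

This mirrors putting the same line in conf/spark-env.sh, just scoped to the one job.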

Thanks,
Michal

> A recent pull request added a classmethod to PySpark's SparkContext that 
> allows you to configure the Java system properties from Python:
> 
> https://github.com/apache/incubator-spark/pull/97
> 
> 
> On Wed, Nov 20, 2013 at 10:34 AM, Patrick Wendell <[email protected]> wrote:
> 
>     You can add java options in SPARK_JAVA_OPTS inside of conf/spark-env.sh
> 
>     http://spark.incubator.apache.org/docs/latest/python-programming-guide.html#installing-and-configuring-pyspark
> 
>     - Patrick
> 
>     On Wed, Nov 20, 2013 at 8:52 AM, Michal Romaniuk
>     <[email protected]> wrote:
>     > The info about configuration options is available at the link below, but
>     > this seems to only work with Java. How can those options be set from Python?
>     >
>     > http://spark.incubator.apache.org/docs/latest/configuration.html#system-properties
>     >
>     > Thanks,
>     > Michal
