> I'm reading a 4.3 GB file

A 4.3 GB file fits comfortably within a single executor, so splitting it across eight local threads mostly adds scheduling overhead without adding useful parallelism.

Can you try files of a much larger size?
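For reference, here is a minimal sketch of the character-count job (file path and object name are placeholders, not from the original post). Two things worth noting: `spark.executor.memory` has no effect in local mode, where heap size comes from the driver JVM; and passing a `minPartitions` hint to `textFile` gives each of the eight local threads some work, which is one way the local[8] run can avoid idle threads:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CharCount {
  def main(args: Array[String]): Unit = {
    // In local mode, "spark.executor.memory" is ignored; set heap via
    // the driver JVM instead (e.g. spark-submit --driver-memory 20g).
    val conf = new SparkConf()
      .setAppName("MasterApp")
      .setMaster("local[8]")
    val sc = new SparkContext(conf)

    // Hypothetical path. minPartitions asks for at least 8 splits so
    // all 8 local threads get a partition to process; with only one
    // partition, 7 threads would sit idle.
    val lines = sc.textFile("/path/to/file.txt", minPartitions = 8)

    // Sum line lengths to count characters (newlines excluded).
    val chars = lines.map(_.length.toLong).reduce(_ + _)
    println(s"character count: $chars")

    sc.stop()
  }
}
```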

Cheers

On Sun, Jan 24, 2016 at 12:11 PM, jimitkr <ji...@softpath.net> wrote:

> Hi All,
>
> I have a machine with the following configuration:
> 32 GB RAM
> 500 GB HDD
> 8 CPUs
>
> Following are the parameters i'm starting my Spark context with:
>
> val conf = new SparkConf()
>   .setAppName("MasterApp")
>   .setMaster("local[1]")
>   .set("spark.executor.memory", "20g")
>
> I'm reading a 4.3 GB file and counting the number of characters in it.
>
> When I run my program with:
> local[1], the count is returned in 1.8 minutes
> local[8], the count is returned in 4.2 minutes
>
> Both times, Spark uses 3.7 GB of Storage Memory.
>
> Why would my program take more time with local[8]?
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-master-takes-more-time-with-local-8-than-local-1-tp26052.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
