Hi,
Can you please share how you are assigning CPU cores, and tell us which Spark version and language you are using?
//Palash

Sent from Yahoo Mail on Android 
 
  On Wed, 18 Jan, 2017 at 10:16 pm, Saliya Ekanayake <esal...@gmail.com> wrote:
 Thank you for the quick response. No, this is not Spark SQL. I am running the
built-in PageRank.
On Wed, Jan 18, 2017 at 10:33 AM, <jasbir.s...@accenture.com> wrote:


Are you talking about Spark SQL here?

If yes, spark.sql.shuffle.partitions needs to be changed.
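
For example, a minimal sketch assuming Spark 2.x with the Scala API (the app name and values below are illustrative only):

    import org.apache.spark.sql.SparkSession

    // Set the number of shuffle partitions when building the session
    // (the default is 200).
    val spark = SparkSession.builder()
      .appName("MyApp")                              // hypothetical app name
      .config("spark.sql.shuffle.partitions", "32")  // illustrative value
      .getOrCreate()

    // It can also be changed at runtime:
    spark.conf.set("spark.sql.shuffle.partitions", "64")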


From: Saliya Ekanayake [mailto:esal...@gmail.com]
Sent: Wednesday, January 18, 2017 8:56 PM
To: User <user@spark.apache.org>
Subject: Spark #cores


Hi,


I am running a Spark application with the number of executor cores set to 1 and a
default parallelism of 32 across 8 physical nodes.


The web UI shows it's running on 200 cores. I can't relate this number to the 
parameters I've used. How can I control the parallelism in a more deterministic 
way?
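
(For reference, a minimal sketch of how the settings described above are typically specified with the Scala API; the app name is hypothetical and this is not the actual submission code:)

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("PageRankApp")               // hypothetical name
      .set("spark.executor.cores", "1")        // executor cores, as described above
      .set("spark.default.parallelism", "32")  // default parallelism, as described above

    val sc = new SparkContext(conf)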


Thank you,

Saliya


-- 

Saliya Ekanayake, Ph.D

Applied Computer Scientist

Network Dynamics and Simulation Science Laboratory (NDSSL)

Virginia Tech, Blacksburg







-- 
Saliya Ekanayake, Ph.D
Applied Computer Scientist
Network Dynamics and Simulation Science Laboratory (NDSSL)
Virginia Tech, Blacksburg
