From: Sean Owen [so...@cloudera.com]
Sent: Wednesday, November 26, 2014 12:14 AM
To: Yotto Koga
Cc: user@spark.apache.org
Subject: Re: configure to run multiple tasks on a core
What about running, say, 2 executors per machine, each of which thinks it has all of the machine's cores? That way roughly two tasks would be scheduled per physical core.
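In standalone mode, the two-executors-per-machine idea can be sketched in conf/spark-env.sh. The two-instance split follows the suggestion above; the core counts are illustrative assumptions:

```shell
# conf/spark-env.sh on each worker node.
# Assumption: 4 physical cores per machine.
# Run two Worker instances, each advertising the full physical core count,
# so up to 8 tasks (about 2 per physical core) run concurrently.
export SPARK_WORKER_INSTANCES=2
export SPARK_WORKER_CORES=4
```

With this, each I/O-bound external-app task shares a core with a second task, which can keep the CPU busy during downloads/uploads.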
Indeed. That's nice.
Thanks!
yotto
From: Matei Zaharia [matei.zaha...@gmail.com]
Sent: Wednesday, November 26, 2014 6:11 PM
To: Yotto Koga
Cc: Sean Owen; user@spark.apache.org
Subject: Re: configure to run multiple tasks on a core
Instead of running multiple executors, you can also just set SPARK_WORKER_CORES in conf/spark-env.sh to a value higher than the number of physical cores; the standalone Worker will then schedule that many tasks concurrently.
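One common way to get multiple tasks per core on a standalone cluster is to have a single Worker over-report its core count. A minimal spark-env.sh sketch, assuming 4 physical cores per node (the 2x factor is an assumption, not a recommendation from the thread):

```shell
# conf/spark-env.sh -- assumption: 4 physical cores per node.
# Advertise 8 cores so the scheduler assigns two tasks per physical core,
# keeping the CPU busy while each external app waits on downloads/uploads.
export SPARK_WORKER_CORES=8
```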
I'm running a spark-ec2 cluster.
I have a map task that calls a specialized C++ external app. The app doesn't
fully utilize the core as it needs to download/upload data as part of the
task. Looking at the worker nodes, it appears that there is one task with my
app running per core.
I'd like to configure things so that each core can run more than one of these tasks at a time.