Indeed. That's nice.
Thanks!
yotto
From: Matei Zaharia [matei.zaha...@gmail.com]
Sent: Wednesday, November 26, 2014 6:11 PM
To: Yotto Koga
Cc: Sean Owen; user@spark.apache.org
Subject: Re: configure to run multiple tasks on a core
~/spark/sbin/start-all.sh
From: Sean Owen [so...@cloudera.com]
Sent: Wednesday, November 26, 2014 12:14 AM
To: Yotto Koga
Cc: user@spark.apache.org
Subject: Re: configure to run multiple tasks on a core
What about running, say, 2 executors per machine, each of which thinks
it should use all cores?
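[Editor's note: the two-executors-per-machine idea above can be sketched as a standalone-mode config fragment. This is an illustration, not part of the thread; it assumes the standalone deployment implied by start-all.sh, and the core count 8 is made up.]

```shell
# conf/spark-env.sh -- illustrative values only.
# Run two worker (and hence executor) instances per machine...
export SPARK_WORKER_INSTANCES=2
# ...and let each one advertise the machine's full core count, so the
# scheduler ends up placing roughly 2 tasks per physical core.
export SPARK_WORKER_CORES=8
```

The cluster must be restarted (sbin/stop-all.sh, then sbin/start-all.sh) for the change to take effect.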
You can also multi-thread your map function manually, directly within
your code, with careful use of a java.util.concurrent.Executor.
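[Editor's note: the second suggestion, threading the per-record work yourself, can be sketched in plain Java with no Spark dependency. slowSquare is a hypothetical stand-in for the real I/O-bound work; in an actual job this loop would sit inside a mapPartitions call.]

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class MultiThreadedMap {
    // Hypothetical stand-in for the real per-record (I/O-bound) work.
    static int slowSquare(int x) {
        return x * x;
    }

    // Processes one partition's elements with a fixed-size thread pool,
    // collecting results in input order.
    static List<Integer> mapPartition(List<Integer> partition, int threads)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int x : partition) {
                final int v = x;
                futures.add(pool.submit(() -> slowSquare(v)));
            }
            List<Integer> out = new ArrayList<>();
            for (Future<Integer> f : futures) {
                out.add(f.get()); // blocks until that element is done
            }
            return out;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(mapPartition(List.of(1, 2, 3, 4), 2));
        // prints [1, 4, 9, 16]
    }
}
```

With, say, 2 threads per task, each core effectively runs 2 units of work at once, which is the same oversubscription the executor-level approach achieves.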
On Wed, Nov 26, 2014 at 6:57 AM, yotto wrote:
> I'm running a spar