Indeed. That's nice.
Thanks!
yotto
From: Matei Zaharia [matei.zaha...@gmail.com]
Sent: Wednesday, November 26, 2014 6:11 PM
To: Yotto Koga
Cc: Sean Owen; user@spark.apache.org
Subject: Re: configure to run multiple tasks on a core
Inste[...]
~/spark/sbin/start-all.sh
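For context, sbin/start-all.sh launches a standalone master plus its workers and reads conf/spark-env.sh as it does so, so worker-level settings only take effect on a restart. A sketch, assuming the ~/spark install path used above:

```shell
# Restart the standalone cluster so workers pick up edits to conf/spark-env.sh.
~/spark/sbin/stop-all.sh
~/spark/sbin/start-all.sh
```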
From: Sean Owen [so...@cloudera.com]
Sent: Wednesday, November 26, 2014 12:14 AM
To: Yotto Koga
Cc: user@spark.apache.org
Subject: Re: configure to run multiple tasks on a core
What about running, say, 2 executors per machine, each of which thinks
it [...]
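One concrete way to set up the suggestion above in standalone mode is to edit conf/spark-env.sh on each machine. This is only a sketch using Spark's documented standalone variables; the core and memory numbers are placeholder assumptions, not values from this thread:

```shell
# conf/spark-env.sh -- sketch for oversubscribing cores in standalone mode.
# Assumes an 8-core machine; the numbers below are illustrative only.

# Option A: one worker that advertises twice the physical core count,
# so the scheduler runs roughly two tasks per physical core.
export SPARK_WORKER_CORES=16

# Option B: two worker instances per machine (the 2-executor idea),
# each advertising the full physical core count.
# export SPARK_WORKER_INSTANCES=2
# export SPARK_WORKER_CORES=8
# export SPARK_WORKER_MEMORY=12g   # per-instance memory; size so both fit in RAM
```

Either way the scheduler sees more slots than physical cores and packs extra tasks onto each core, which helps when tasks spend much of their time blocked on I/O.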
> [...]ed to go the other way than what I am
> looking for.
>
> Is there a way where I can specify multiple tasks per core rather than
> multiple cores per task?
>
> Thanks for any help.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/configure-to-run-multiple-tasks-on-a-core-tp19834.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.