I have added SPARK_JAVA_OPTS+="-Dspark.default.parallelism=40 " in shark-env.sh,
but I find there are only 10 tasks on the cluster and 2 tasks on each machine.
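For reference, the setting is usually written in shark-env.sh like this (a sketch; note that `spark.default.parallelism` only sets the default partition count for shuffle outputs such as joins and reduceByKey, so the 10 tasks observed when scanning the table may instead come from the number of input blocks of the underlying file, which this setting does not change):

```shell
# shark-env.sh: raise the default shuffle parallelism to 40.
# The trailing space inside the quotes lets further -D options be appended safely.
export SPARK_JAVA_OPTS+="-Dspark.default.parallelism=40 "
```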


2014-05-22 18:07 GMT+08:00 qingyang li <liqingyang1...@gmail.com>:

> i have added  SPARK_JAVA_OPTS+="-Dspark.default.parallelism=40 "  in
> shark-env.sh
>
>
> 2014-05-22 17:50 GMT+08:00 qingyang li <liqingyang1...@gmail.com>:
>
> I am using Tachyon as the storage system and Shark to query a large
>> table. I have 5 machines in a Spark cluster, with 4 cores on each
>> machine.
>> My questions are:
>> 1. How can I set the number of tasks on each core?
>> 2. Where can I see how many partitions an RDD has?
>>
>
>
