Re: spark.default.parallelism cannot set reduce number

2016-05-20 Thread Ovidiu-Cristian MARCU
You can check org.apache.spark.sql.internal.SQLConf for other default settings as well:

  val SHUFFLE_PARTITIONS = SQLConfigBuilder("spark.sql.shuffle.partitions")
    .doc("The default number of partitions to use when shuffling data " +
      "for joins or aggregations.")
    .intConf
    .createWithDefault(200)
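A minimal sketch, assuming a throwaway local SparkSession (the session and app names are illustrative, not from the thread), of reading that default and overriding it at runtime:

  import org.apache.spark.sql.SparkSession

  // Assumed local session, for illustration only.
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("sqlconf-demo")
    .getOrCreate()

  // Prints the effective value; with nothing configured, this is the
  // built-in default of 200 declared in SQLConf above.
  println(spark.conf.get("spark.sql.shuffle.partitions"))

  // Runtime override; applies only to this session.
  spark.conf.set("spark.sql.shuffle.partitions", "20")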

Re: spark.default.parallelism cannot set reduce number

2016-05-20 Thread Takeshi Yamamuro
You need to use `spark.sql.shuffle.partitions`. // maropu

On Fri, May 20, 2016 at 8:17 PM, ε–œδΉ‹ιƒŽ <251922...@qq.com> wrote:
> Hi all.
> I set spark.default.parallelism to 20 in spark-defaults.conf and sent this
> file to all nodes.
> But I found the reduce number is still the default value, 200.
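Concretely, the property the asker wants in the conf file shipped to the nodes is spark.sql.shuffle.partitions, not spark.default.parallelism. A sketch of the equivalent when building the session in code (names are illustrative):

  import org.apache.spark.sql.SparkSession

  // Sketch: same effect as a spark-defaults.conf entry
  // "spark.sql.shuffle.partitions  20".
  val spark = SparkSession.builder()
    .config("spark.sql.shuffle.partitions", "20")
    .getOrCreate()

  // Joins and aggregations now shuffle into 20 partitions instead of 200.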

spark.default.parallelism cannot set reduce number

2016-05-20 Thread ε–œδΉ‹ιƒŽ
Hi all. I set spark.default.parallelism to 20 in spark-defaults.conf and sent this file to all nodes. But I found the reduce number is still the default value, 200. Has anyone else encountered this problem? Can anyone give some advice?

[Stage 9:>
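The behavior described above follows from the split between the two settings: spark.default.parallelism drives RDD-level shuffles, while DataFrame/SQL shuffles read spark.sql.shuffle.partitions. A minimal sketch illustrating the difference (local session and sample data are assumed, not from the thread):

  import org.apache.spark.sql.SparkSession

  // Assumed local session with the asker's setting applied.
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("parallelism-vs-shuffle-partitions")
    .config("spark.default.parallelism", "20")
    .getOrCreate()
  import spark.implicits._

  // RDD shuffle: with no explicit partition count, reduceByKey falls back
  // to spark.default.parallelism, so this prints 20.
  val rdd = spark.sparkContext.parallelize(Seq(("a", 1), ("a", 2), ("b", 3)))
  println(rdd.reduceByKey(_ + _).getNumPartitions)

  // DataFrame/SQL shuffle: groupBy ignores spark.default.parallelism and
  // uses spark.sql.shuffle.partitions instead, so this prints 200.
  val df = Seq(("a", 1), ("a", 2), ("b", 3)).toDF("k", "v")
  println(df.groupBy("k").count().rdd.getNumPartitions)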