On Sat, May 21, 2016 at 11:42 PM, Ted Yu wrote:

Looks like an equal sign is missing between partitions and 200.

On Sat, May 21, 2016 at 8:31 PM, SRK wrote:

> Hi,
>
> How to set the degree of parallelism in Spark SQL? I am using the
> following but it somehow seems to allocate only two executors at a time.
>
> sqlContext.sql(" set spark.sql.shuffle.partitions 200 ")
>
> Thanks,
> Swetha
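For reference, the statement with the missing equal sign restored (a minimal sketch in Scala, assuming a Spark 1.x `SQLContext` bound to `sqlContext`):

```scala
// Corrected SET statement: the key and value must be joined by "=".
sqlContext.sql("SET spark.sql.shuffle.partitions=200")

// Equivalent, using the SQLContext configuration API directly:
sqlContext.setConf("spark.sql.shuffle.partitions", "200")
```

Note that spark.sql.shuffle.partitions controls the number of shuffle partitions (i.e., tasks after a shuffle), not the number of executors; executor count is governed by settings such as spark.executor.instances or dynamic allocation.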
Hi,
How to set the degree of parallelism in Spark SQL? I am using the following
but it somehow seems to allocate only two executors at a time.
sqlContext.sql(" set spark.sql.shuffle.partitions 200 ")
Thanks,
Swetha
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-set-the-degree-of-parallelism-in-Spark-SQL-tp26996p27031.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.