Re: How to set the degree of parallelism in Spark SQL?

2016-05-26 Thread Mich Talebzadeh

Re: How to set the degree of parallelism in Spark SQL?

2016-05-26 Thread Ian
ur-apache-spark-jobs-part-2/
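
The (truncated) link above appears to reference a Spark job tuning article on parallelism. As a related sketch, not taken from the message itself, the partition count of a DataFrame can also be controlled explicitly; df below is a hypothetical existing DataFrame:

    // Force a shuffle into 200 partitions, regardless of the current
    // spark.sql.shuffle.partitions setting.
    val repartitioned = df.repartition(200)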

Re: How to set the degree of parallelism in Spark SQL?

2016-05-23 Thread Xinh Huynh
On Sat, May 21, 2016 at 11:42 PM, Ted Yu wrote:

> Looks like an equal sign is missing between partitions and 200.
>
> On Sat, May 21, 2016 at 8:31 PM, SRK wrote:
>
>> Hi,
>>
>> How to set the degree of parallelism in Spark SQL? I am using the
>> following but it somehow seems to allocate only two executors at a time.
>>
>> sqlContext.sql(" set spark.sql.shuffle.partitions 200 ")

Re: How to set the degree of parallelism in Spark SQL?

2016-05-23 Thread Mathieu Longtin
> Looks like an equal sign is missing between partitions and 200.
>
> On Sat, May 21, 2016 at 8:31 PM, SRK wrote:
>
>> Hi,
>>
>> How to set the degree of parallelism in Spark SQL? I am using the
>> following but it somehow seems to allocate only two executors at a time.
>>
>> sqlContext.sql(" set spark.sql.shuffle.partitions 200 ")

Re: How to set the degree of parallelism in Spark SQL?

2016-05-21 Thread Ted Yu
Looks like an equal sign is missing between partitions and 200.

On Sat, May 21, 2016 at 8:31 PM, SRK wrote:

> Hi,
>
> How to set the degree of parallelism in Spark SQL? I am using the following
> but it somehow seems to allocate only two executors at a time.
>
> sqlContext.sql(" set spark.sql.shuffle.partitions 200 ")
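
For reference, a minimal sketch of the statement with the missing equal sign added, assuming the Spark 1.x SQLContext API used in the original post:

    // Set the number of partitions used when Spark SQL shuffles data.
    // Note the "=" between the property name and its value.
    sqlContext.sql("SET spark.sql.shuffle.partitions=200")

Note that this controls the number of shuffle partitions (and therefore tasks), not the number of executors, which is governed by the application's resource settings (for example spark.executor.instances or dynamic allocation).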

How to set the degree of parallelism in Spark SQL?

2016-05-21 Thread SRK
Hi,

How to set the degree of parallelism in Spark SQL? I am using the following
but it somehow seems to allocate only two executors at a time.

sqlContext.sql(" set spark.sql.shuffle.partitions 200 ")

Thanks,
Swetha
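
An equivalent way to set the same property, sketched here from the Spark 1.x SQLContext API rather than from anything stated in the thread:

    // Equivalent to running "SET spark.sql.shuffle.partitions=200" in SQL.
    // Affects only stages that shuffle data (joins, aggregations, and similar).
    sqlContext.setConf("spark.sql.shuffle.partitions", "200")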