Re: Tests failing with run-tests.py SyntaxError

2017-07-28 Thread Hyukjin Kwon
Or maybe in https://github.com/apache/spark/blob/master/dev/run-tests#L23 On 29 Jul 2017 11:16 am, "Hyukjin Kwon" wrote: I am sorry, this is just based on a wild guess because I have no way to check and look into Jenkins, but I think we might have to set the explicit Python version in h

Re: Tests failing with run-tests.py SyntaxError

2017-07-28 Thread Hyukjin Kwon
I am sorry, this is just based on a wild guess because I have no way to check and look into Jenkins, but I think we might have to set the explicit Python version in https://github.com/apache/spark/blob/master/dev/run-tests-jenkins#L29. I guess we set the explicit Python version for running
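The thread's suggestion is to pin the interpreter in those wrapper scripts. Just to illustrate why a guard cannot simply sit at the top of run-tests.py itself (Python 2.6 rejects the file at parse time, before any code runs), a version check would have to live in something that stays 2.6-parsable, for example a tiny bootstrap along these lines (a sketch, not something proposed in the thread):

    #!/usr/bin/env python
    # Hypothetical bootstrap: stays valid under Python 2.6, checks the interpreter,
    # then hands off to the real script, which uses 2.7+ syntax such as dict
    # comprehensions.
    import os
    import sys

    if sys.version_info < (2, 7):
        sys.stderr.write("dev/run-tests.py requires Python 2.7+, found %d.%d\n"
                         % sys.version_info[:2])
        sys.exit(1)

    os.execv(sys.executable, [sys.executable, "./dev/run-tests.py"] + sys.argv[1:])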

Re: Tests failing with run-tests.py SyntaxError

2017-07-28 Thread Dong Joon Hyun
I saw that error in the latest branch-2.1 build failure, too. https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-branch-2.1-test-sbt-hadoop-2.7/579/console But, the code was written in Jan 2016. Didn’t we run it on Python 2.6 without any problem? ee74498de37 (

Re: Interested in contributing to spark eco

2017-07-28 Thread Reynold Xin
Shashi, Welcome! There are a lot of ways you can help contribute. There is a page documenting some of them: http://spark.apache.org/contributing.html On Fri, Jul 28, 2017 at 1:35 PM, Shashi Dongur wrote: > Hello All, > > I am looking for ways to contribute to the Spark repo. I want to start with >

Interested in contributing to spark eco

2017-07-28 Thread Shashi Dongur
Hello All, I am looking for ways to contribute to the Spark repo. I want to start with helping to run tests and improving documentation where needed. Please let me know how I can find avenues to help. How can I spot users who require assistance with testing? Or gathering documentation for any new

Re: Tests failing with run-tests.py SyntaxError

2017-07-28 Thread Hyukjin Kwon
Yes, that's my guess, just given the information here, without a close look. On 28 Jul 2017 11:03 pm, "Sean Owen" wrote: I see, does that suggest that a machine has 2.6, when it should use 2.7? On Fri, Jul 28, 2017 at 2:58 PM Hyukjin Kwon wrote: > That is apparently due to a dict comprehension, wh

Re: Tests failing with run-tests.py SyntaxError

2017-07-28 Thread Sean Owen
I see, does that suggest that a machine has 2.6, when it should use 2.7? On Fri, Jul 28, 2017 at 2:58 PM Hyukjin Kwon wrote: > That is apparently due to a dict comprehension, which is, IIRC, not > allowed in Python 2.6.x. I checked the release notes before to be sure - > https://issues.apache.org/

Re: Tests failing with run-tests.py SyntaxError

2017-07-28 Thread Hyukjin Kwon
That is apparently due to a dict comprehension, which is, IIRC, not allowed in Python 2.6.x. I checked the release notes before to be sure - https://issues.apache.org/jira/browse/SPARK-20149 On 28 Jul 2017 9:56 pm, "Sean Owen" wrote: > File "./dev/run-tests.py", line 124 > {m: set(m.dependen
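For reference, the construct flagged by the traceback and a Python 2.6-compatible rewrite look roughly like this (a sketch reusing the names from the quoted traceback; deps is only an illustrative variable name):

    # Python 2.7+ dict comprehension -- a SyntaxError on Python 2.6
    deps = {m: set(m.dependencies).intersection(modules_to_test)
            for m in modules_to_test}

    # Python 2.6-compatible equivalent: dict() over a generator of (key, value) pairs
    deps = dict((m, set(m.dependencies).intersection(modules_to_test))
                for m in modules_to_test)

Both forms build the same mapping from each module to the subset of its dependencies that are also under test; only the first needs a 2.7+ interpreter.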

Tests failing with run-tests.py SyntaxError

2017-07-28 Thread Sean Owen
File "./dev/run-tests.py", line 124 {m: set(m.dependencies).intersection(modules_to_test) for m in modules_to_test}, sort=True) ^ SyntaxError: invalid syntax It seems like tests are failing intermittently with this type of error, w

Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

2017-07-28 Thread Chetan Khatri
I think it will be the same, but let me try that. FYR - https://issues.apache.org/jira/browse/SPARK-19881 On Fri, Jul 28, 2017 at 4:44 PM, ayan guha wrote: > Try running spark.sql("set yourconf=val") > > On Fri, 28 Jul 2017 at 8:51 pm, Chetan Khatri > wrote: >> Jörn, both are the same. >> >> On Fri,
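For illustration, the two suggestions in this thread look roughly like this in PySpark (a sketch only; 2000 is just an example value above the 1344 partitions reported in the error, and whether Spark 2.0.1 actually honors these settings for dynamic partition inserts is the open question here):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    # ayan guha's suggestion: set the Hive property through a SQL SET command
    spark.sql("SET hive.exec.max.dynamic.partitions=2000")

    # Jörn Franke's suggestion (sparksession.conf().set in Scala/Java), PySpark form
    spark.conf.set("hive.exec.max.dynamic.partitions", "2000")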

Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

2017-07-28 Thread Chetan Khatri
Jörn, both are the same. On Fri, Jul 28, 2017 at 4:18 PM, Jörn Franke wrote: > Try sparksession.conf().set > > On 28. Jul 2017, at 12:19, Chetan Khatri > wrote: > > Hey Dev/User, > > I am working with Spark 2.0.1 and dynamic partitioning with Hive, and I am > facing the issue below: > > org.apache.hadoop.h

Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

2017-07-28 Thread Jörn Franke
Try sparksession.conf().set > On 28. Jul 2017, at 12:19, Chetan Khatri wrote: > > Hey Dev/User, > > I am working with Spark 2.0.1 and dynamic partitioning with Hive, and I am facing > the issue below: > > org.apache.hadoop.hive.ql.metadata.HiveException: > Number of dynamic partitions created is 1344

Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

2017-07-28 Thread Chetan Khatri
Hey Dev/User, I am working with Spark 2.0.1 and dynamic partitioning with Hive, and I am facing the issue below: org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1344, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 1