Re: [VOTE] SPARK 3.0.0-preview2 (RC2)

2019-12-16 Thread Yuming Wang
Please go to td28549 to vote; this voting link is incorrect.

[VOTE] SPARK 3.0.0-preview2 (RC2)

2019-12-16 Thread Yuming Wang
Please vote on releasing the following candidate as Apache Spark version 3.0.0-preview2. The vote is open until December 20 PST and passes if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes. [ ] +1 Release this package as Apache Spark 3.0.0-preview2 [ ] -1 Do not release this

Re: Running Spark through a debugger

2019-12-16 Thread Sean Owen
I just make a new test suite or something, set breakpoints, and execute it in IJ. That generally works fine. You may need to set the run configuration to have the right working dir (Spark project root), and set the right system property to say 'this is running in a test' in some cases. What are
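As an alternative to the IDE runner Sean describes, a single suite can also be run from the command line in a Spark checkout. A hedged sketch (the suite name is illustrative, and the JDWP options are the standard JVM debug flags, not something specific to Spark):

```shell
# From the Spark project root; build/sbt is the wrapper script in the repo.
# Suite name below is a placeholder for whatever suite you are stepping through.
./build/sbt "sql/testOnly org.apache.spark.sql.connector.DataSourceV2SQLSuite"

# To attach a remote debugger instead, export standard JDWP agent options
# before launching sbt (the port number is arbitrary):
export SBT_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
```

These commands only make sense inside a Spark source checkout, and flags may vary by Spark and sbt version.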

Running Spark through a debugger

2019-12-16 Thread Nicholas Chammas
I normally stick to the Python parts of Spark, but I am interested in walking through the DSv2 code and understanding how it works. I tried following the "IDE Setup" section of the developer tools page, but quickly hit several problems loading the

Re: Do we need to finally update Guava?

2019-12-16 Thread Sean Owen
PS you are correct; with Guava 27 and my recent changes, and Hadoop 3.2.1 + Hive 2.3, I still see ... *** RUN ABORTED *** java.lang.IllegalAccessError: tried to access method com.google.common.collect.Iterators.emptyIterator()Lcom/google/common/collect/UnmodifiableIterator; from class

Re: Slower than usual on PRs

2019-12-16 Thread Apostolos N. Papadopoulos
Health comes first. The rest will follow. Wishes for fast recovery. a. On 16/12/19 22:22, Holden Karau wrote: Thanks everyone :) My doctor seems to think I’ll be out of commission till sometime in February but I can do some (slow) typing and stuff now so I’ll try and not get too far out of

Re: Slower than usual on PRs

2019-12-16 Thread Holden Karau
Thanks everyone :) My doctor seems to think I’ll be out of commission till sometime in February but I can do some (slow) typing and stuff now so I’ll try and not get too far out of the loop between my other appointments. On Mon, Dec 16, 2019 at 12:07 PM Bryan Cutler wrote: > Sorry to hear this

Re: Revisiting Python / pandas UDF (continues)

2019-12-16 Thread Bryan Cutler
Thanks for taking this on Hyukjin! I'm looking forward to the PRs and happy to help out where I can. Bryan On Wed, Dec 4, 2019 at 9:13 PM Hyukjin Kwon wrote: > Hi all, > > I would like to finish redesigning Pandas UDF ones in Spark 3.0. > If you guys don't have a minor concern in general about
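For context, the redesign discussed in this thread led to the type-hint-based pandas UDF style in Spark 3.0. A minimal sketch, assuming PySpark's `pandas_udf`; the details of the registration call are an assumption here, but the underlying function is plain pandas and can be exercised without a cluster:

```python
import pandas as pd

# In the redesigned style, the UDF kind is inferred from Python type hints:
# a Series -> Series function is a scalar pandas UDF.
def plus_one(s: pd.Series) -> pd.Series:
    return s + 1

# With PySpark available it would be wrapped roughly like this
# (sketch; consult the Spark 3.0 docs for the exact API):
# from pyspark.sql.functions import pandas_udf
# plus_one_udf = pandas_udf(plus_one, returnType="long")

# The plain function is directly testable on pandas data:
print(plus_one(pd.Series([1, 2, 3])).tolist())  # [2, 3, 4]
```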

Re: Slower than usual on PRs

2019-12-16 Thread Bryan Cutler
Sorry to hear this Holden! Hope you get well soon and take it easy!! On Tue, Dec 3, 2019 at 6:21 PM Hyukjin Kwon wrote: > Yeah, please take care of your health first! > > 2019년 12월 3일 (화) 오후 1:32, Wenchen Fan 님이 작성: > >> Sorry to hear that. Hope you get better soon! >> >> On Tue, Dec 3, 2019 at

Re: Do we need to finally update Guava?

2019-12-16 Thread Sean Owen
Yeah that won't be the last problem I bet. Here's a proposal for just directly reducing exposure to Guava in Spark itself though: https://github.com/apache/spark/pull/26911 On Mon, Dec 16, 2019 at 11:36 AM Marcelo Vanzin wrote: > > Great that Hadoop has done it (which, btw, probably means that

Re: Do we need to finally update Guava?

2019-12-16 Thread Marcelo Vanzin
Great that Hadoop has done it (which, btw, probably means that Spark won't work with that version of Hadoop yet), but Hive also depends on Guava, and last time I tried, even Hive 3.x did not work with Guava 27. (Newer Hadoop versions also have a new artifact that shades a lot of dependencies,
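The shading Marcelo mentions works by relocating a dependency's classes into a private package at build time, so the shaded copy cannot conflict with whatever Guava version the application uses. An illustrative maven-shade-plugin fragment (package names are placeholders; this is not Hadoop's or Spark's actual build config):

```xml
<!-- Relocate Guava into a private namespace inside the shaded jar. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>com.google.common</pattern>
        <shadedPattern>org.example.shaded.com.google.common</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```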