Re: [VOTE] Spark 2.3.0 (RC4)
+1 (non-binding): all the Hive-on-Spark tests are passing (HIVE-18436).
Re: [VOTE] Spark 2.3.0 (RC4)
No problem if we can't add them; this is experimental anyway, so this release should be more about validating the API and the start of our implementation. I just don't think we can recommend that anyone actually use DataSourceV2 without these patches.

On Wed, Feb 21, 2018 at 9:21 AM, Wenchen Fan wrote:
> SPARK-23323 adds a new API, I'm not sure we can still do it at this stage of the release...
Re: [VOTE] Spark 2.3.0 (RC4)
SPARK-23323 adds a new API; I'm not sure we can still do it at this stage of the release... Besides, users can work around it by calling the Spark output coordinator themselves in their data source.

SPARK-23203 is non-trivial and didn't fix any known bugs, so it's hard to convince other people that it's safe to add it to the release during the RC phase.

SPARK-23418 depends on the above one.

Generally they are good to have in Spark 2.3, if they had been merged before the RC. I think this is a lesson we should learn from: we should work on stuff we want in the release before the RC, instead of after.

On Thu, Feb 22, 2018 at 1:01 AM, Ryan Blue wrote:
> What does everyone think about getting some of the newer DataSourceV2 improvements in?
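The "output commit coordinator" discussed in this thread exists so that duplicate (e.g. speculative) task attempts cannot both commit output for the same partition. The sketch below is a minimal toy model of that idea only; it is not Spark's actual OutputCommitCoordinator API, and all names are illustrative.

```python
# Toy model of an output commit coordinator: for each partition, at most one
# task attempt is authorized to commit, so racing speculative attempts cannot
# both write their output. Names are illustrative, not Spark's real API.

class CommitCoordinator:
    def __init__(self):
        self._authorized = {}  # partition -> attempt id allowed to commit

    def can_commit(self, partition, attempt):
        # The first attempt to ask wins; later attempts for the same
        # partition are denied unless they are the authorized attempt.
        winner = self._authorized.setdefault(partition, attempt)
        return winner == attempt

coordinator = CommitCoordinator()
# Two speculative attempts race to commit partition 0:
first = coordinator.can_commit(0, "attempt-1")   # True: authorized to commit
second = coordinator.can_commit(0, "attempt-2")  # False: must abort
```

A data source that cannot rely on the engine's coordinator can apply the same pattern itself: ask a shared arbiter before finalizing each partition's output, and discard the output when the answer is no.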
Re: [VOTE] Spark 2.3.0 (RC4)
> 2018-02-20 12:30 GMT+01:00 Hyukjin Kwon:
>> +1 too
>>
>> 2018-02-20 14:41 GMT+09:00 Takuya UESHIN:
>>> +1
>>>
>>> On Tue, Feb 20, 2018 at 2:14 PM, Xingbo Jiang wrote:
>>>> +1
>>>>
>>>> Wenchen Fan wrote on Tuesday, February 20, 2018 at 1:09 PM:
>>>>> +1
>>>>>
>>>>> On Tue, Feb 20, 2018 at 12:53 PM, Reynold Xin wrote:
>>>>>> +1
>>>>>>
>>>>>> On Feb 20, 2018, 5:51 PM +1300, Sameer Agarwal wrote:
>>>>>>>> this file shouldn't be included?
>>>>>>>>
>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml
>>>>>>>
>>>>>>> I've now deleted this file
>>>>>>>
>>>>>>>> From: Sameer Agarwal
>>>>>>>> Sent: Saturday, February 17, 2018 1:43:39 PM
>>>>>>>> To: Sameer Agarwal
>>>>>>>> Cc: dev
>>>>>>>> Subject: Re: [VOTE] Spark 2.3.0 (RC4)
>>>>>>>>
>>>>>>>> I'll start with a +1 once again.
>>>>>>>>
>>>>>>>> All blockers reported against RC3 have been resolved and the builds are healthy.
>>>>>>>>
>>>>>>>> On 17 February 2018 at 13:41, Sameer Agarwal wrote:
>>>>>>>>> Please vote on releasing the following candidate as Apache Spark version 2.3.0. The vote is open until Thursday, February 22, 2018 at 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>>
>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>
>>>>>>>>> To learn more about Apache Spark, please see
Re: [VOTE] Spark 2.3.0 (RC4)
SPARK-23406 fixes a bug in a feature that is itself new in Spark 2.3, so it is not a regression. I think we have to fix it in 2.3.1, but I'm less sure about 2.3.0.

On Thu, Feb 22, 2018 at 1:21 AM, kant kodali wrote:
> Any possible chance of https://issues.apache.org/jira/browse/SPARK-23406 getting into 2.3.0?
Re: [VOTE] Spark 2.3.0 (RC4)
Hi All,

+1 for the tickets proposed by Ryan Blue.

Any possible chance of this one, https://issues.apache.org/jira/browse/SPARK-23406, getting into 2.3.0? It's a very important feature for us, so if it doesn't make the cut I would have to cherry-pick this commit and compile from source for our production release.

Thanks!

On Wed, Feb 21, 2018 at 9:01 AM, Ryan Blue wrote:
> What does everyone think about getting some of the newer DataSourceV2 improvements in?
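Cherry-picking one fix onto a release, as described in this thread, amounts to cutting a private branch from the release tag and applying the single commit to it. The following self-contained demo runs those git steps on a throwaway repository; the tag, file, and commit message are made up for illustration (for Spark itself you would cherry-pick onto the real tag and then rebuild the distribution).

```python
# Demo of "cherry-pick one fix onto a release tag" on a throwaway git repo.
# Repo contents, tag, and messages are illustrative only.
import os
import subprocess
import tempfile

repo = tempfile.mkdtemp()

def git(*args):
    # Run git inside the demo repo with a fixed identity for commits.
    return subprocess.run(
        ["git", "-C", repo, "-c", "user.email=demo@example.com",
         "-c", "user.name=demo", *args],
        check=True, capture_output=True, text=True).stdout

git("init", "-q", ".")
git("commit", "-q", "--allow-empty", "-m", "base for v2.3.0-rc4")
git("tag", "v2.3.0-rc4")

# A fix lands on the development branch after the tag:
with open(os.path.join(repo, "fix.txt"), "w") as f:
    f.write("fix for SPARK-23406\n")
git("add", "fix.txt")
git("commit", "-q", "-m", "SPARK-23406 fix")
fix_sha = git("rev-parse", "HEAD").strip()

# Cut a private branch from the release tag and pull in just that commit:
git("checkout", "-q", "-b", "my-2.3.0-patched", "v2.3.0-rc4")
git("cherry-pick", fix_sha)

with open(os.path.join(repo, "fix.txt")) as f:
    patched = f.read()  # the fix is now present on the patched branch
```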
Re: [VOTE] Spark 2.3.0 (RC4)
Hi Ryan,

Thank you for bringing it up. Since we are already at RC4, we can only accept regression fixes into the 2.3 branch; this was also the strategy of previous Spark releases. The Data Source V2 APIs are newly introduced in this release, so at this stage we are unable to accept any changes to them. We have to stop adding new features and changes to the to-be-released branches. Sorry for that.

Thanks,
Xiao

2018-02-21 9:01 GMT-08:00 Ryan Blue:
> What does everyone think about getting some of the newer DataSourceV2 improvements in?
Re: [VOTE] Spark 2.3.0 (RC4)
What does everyone think about getting some of the newer DataSourceV2 improvements in? It should be low risk because it is a new code path, and v2 isn't very usable without things like support for using the output commit coordinator to deconflict writes.

The ones I'd like to get in are:
* Use the output commit coordinator: https://issues.apache.org/jira/browse/SPARK-23323
* Use immutable trees and the same push-down logic as other read paths: https://issues.apache.org/jira/browse/SPARK-23203
* Don't allow users to supply schemas when they aren't supported: https://issues.apache.org/jira/browse/SPARK-23418

I think it would make the 2.3.0 release more usable for anyone interested in the v2 read and write paths.

Thanks!

On Tue, Feb 20, 2018 at 7:07 PM, Weichen Xu wrote:
> +1
>> >> >> >> >> >> On Tue, Feb 20, 2018 at 12:54 PM, Shixiong(Ryan) Zhu >> >> >> wrote: >> >> >>> >> >> >>> I'm -1 because of the UI regression >> >> >>> https://issues.apache.org/jira/browse/SPARK-23470 : the All Jobs >> page >> >> >>> may be >> >> >>> too slow and cause "read timeout" when there are lots of jobs and >> >> >>> stages. >> >> >>> This is one of the most important pages because when it's broken, >> it's >> >> >>> pretty hard to use Spark Web UI. >> >> >>> >> >> >>> >> >> >>> On Tue, Feb 20, 2018 at 4:37 AM, Marco Gaido < >> marcogaid...@gmail.com> >> >> >>> wrote: >> >> >>>> >> >> >>>> +1 >> >> >>>> >> >> >>>> 2018-02-20 12:30 GMT+01:00 Hyukjin Kwon : >> >> >>>>> >> >> >>>>> +1 too >> >> >>>>> >> >> >>>>> 2018-02-20 14:41 GMT+09:00 Takuya UESHIN > >: >> >> >>>>>> >> >> >>>>>> +1 >> >> >>>>>> >> >> >>>>>> >> >> >>>>>> On Tue, Feb 20, 2018 at 2:14 PM, Xingbo Jiang >> >> >>>>>> >> >> >>>>>> wrote: >> >> >>>>>>> >> >> >>>>>>> +1 >> >> >>>>>>> >> >> >>>>>>> >> >> >>>>>>> Wenchen Fan 于2018年2月20日 周二下午1:09写道: >> >> >>>>>>>> >> >> >>>>>>>> +1 >> >> >>>>>>>> >> >> >>>>>>>> On Tue, Feb 20, 2018 at 12:53 PM, Reynold Xin >> >> >>>>>>>> >> >> >>>>>>>> wrote: >> >> >>>>>>>>> >> >> >>>>>>>>> +1 >> >> >>>>>>>>> >> >> >>>>>>>>> On Feb 20, 2018, 5:51 PM +1300, Sameer Agarwal >> >> >>>>>>>>> , wrote: >> >> >>>>>>>>>> >> >> >>>>>>>>>> this file sho
Re: [VOTE] Spark 2.3.0 (RC4)
+1
Re: [VOTE] Spark 2.3.0 (RC4)
Done, thanks!
Re: [VOTE] Spark 2.3.0 (RC4)
Sure, please feel free to backport.
Re: [VOTE] Spark 2.3.0 (RC4)
Hey Sameer,

Mind including https://github.com/apache/spark/pull/20643 (SPARK-23468) in the new RC? It's a minor bug since I've only hit it with older shuffle services, but it's pretty safe.
Re: [VOTE] Spark 2.3.0 (RC4)
This RC has failed due to https://issues.apache.org/jira/browse/SPARK-23470. Now that the fix has been merged in 2.3 (thanks Marcelo!), I'll follow up with an RC5 soon.
Re: [VOTE] Spark 2.3.0 (RC4)
+1

Build & tests look fine, checked signature and checksums for src tarball.
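The signature-and-checksum verification mentioned above can be reproduced by hand. Below is a minimal sketch of the digest half in Python; the file name and the `sha512_of` helper are illustrative, not part of the release tooling, and checking the GPG signature against https://dist.apache.org/repos/dist/dev/spark/KEYS still requires a separate `gpg --verify` step.

```python
import hashlib

def sha512_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-512; release tarballs are too big to read at once."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

# Demo on a small local stand-in file: the recomputed digest must match the
# recorded one, just as a downloaded tarball's digest must match the
# published .sha512 value.
with open("demo-artifact.tgz", "wb") as f:
    f.write(b"not a real tarball")

published = sha512_of("demo-artifact.tgz")  # stands in for the published digest
assert sha512_of("demo-artifact.tgz") == published
```

The comparison should be done on the full hex digest string, character for character, against the value published alongside the artifact.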
Re: [VOTE] Spark 2.3.0 (RC4)
I'm -1 because of the UI regression https://issues.apache.org/jira/browse/SPARK-23470 : the All Jobs page may be too slow and cause "read timeout" when there are lots of jobs and stages. This is one of the most important pages because when it's broken, it's pretty hard to use the Spark Web UI.
Re: [VOTE] Spark 2.3.0 (RC4)
+1

> On Tue, Feb 20, 2018 at 2:14 PM, Xingbo Jiang wrote:
>
> +1
>
> On Tue, Feb 20, 2018 at 1:09 PM, Wenchen Fan wrote:
>
> +1
>
> On Tue, Feb 20, 2018 at 12:53 PM, Reynold Xin wrote:
>
> +1
>
> On Feb 20, 2018, 5:51 PM +1300, Sameer Agarwal wrote:
>
>> this file shouldn't be included?
>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml
>
> I've now deleted this file
>
> From: Sameer Agarwal
> Sent: Saturday, February 17, 2018 1:43:39 PM
> Subject: Re: [VOTE] Spark 2.3.0 (RC4)
>
> I'll start with a +1 once again.
>
> All blockers reported against RC3 have been resolved and the builds are healthy.
>
> On 17 February 2018 at 13:41, Sameer Agarwal wrote:
>
> Please vote on releasing the following candidate as Apache Spark version 2.3.0. The vote is open until Thursday, February 22, 2018 at 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.3.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see https://spark.apache.org/
>
> The tag to be voted on is v2.3.0-rc4: https://github.com/apache/spark/tree/v2.3.0-rc4 (44095cb65500739695b0324c177c19dfa1471472)
>
> The list of JIRA tickets resolved in this release can be found here: https://issues.apache.org/jira/projects/SPARK/versions/12339551
>
> The release files, including signatures, digests, etc. can be found at: https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/
>
> Release artifacts are signed with the following key: https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at: https://repository.apache.org/content/repositories/orgapachespark-1265/
>
> The documentation corresponding to this release can be found at: https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/index.html
>
> FAQ
>
> ===
> What are the unresolved issues targeted for 2.3.0?
> ===
>
> Please see https://s.apache.org/oXKi. At the time of writing, there are currently no known release blockers.
>
> ===
> How can I help test this release?
> ===
>
> If you are a Spark user, you can help us test this release by taking an existing Spark workload, running it on this release candidate, and reporting any regressions.
>
> If you're working in PySpark, you can set up a virtual env, install the current RC, and see if anything important breaks. In Java/Scala, you can add the staging repository to your project's resolvers and test with the RC (make sure to clean up the artifact cache before/after so you don't end up building with an out-of-date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 2.3.0?
> ===
>
> Committers should look at those and triage. Extremely important bug fixes, documentation, and API tweaks that impact compatibility should be worked on immediately. Everything else, please retarget to 2.3.1 or 2.4.0 as appropriate.
>
> ===
> Why is my bug not fixed?
> ===
>
> In order to make timely releases, we will typically not hold the release unless the bug in question is a regression from 2.2.0. That being said, if there is something which is a regression from 2.2.0 and has not been correctly targeted, please ping me or a committer to help target the issue (you can see the open issues listed as impacting Spark 2.3.0 at https://s.apache.org/WmoI).
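The "set up a virtual env and install the current RC" step from the testing FAQ quoted above can be scripted with only the standard library. A minimal sketch follows; the environment name is arbitrary, and the `pip install` line is commented out because the RC artifact path shown is a placeholder, not a real file name.

```python
import subprocess
import venv
from pathlib import Path

# Create an isolated environment to install the RC build into
env_dir = Path("spark-rc-test-env")
venv.EnvBuilder(clear=True, with_pip=False).create(env_dir)

# The venv's own interpreter (POSIX layout; Windows uses Scripts\python.exe)
py = env_dir / "bin" / "python"

# With pip available in the venv, the RC build of PySpark would be installed
# here, e.g. (placeholder path, do not run as-is):
# subprocess.run([str(py), "-m", "pip", "install", "<path-to-pyspark-rc>.tar.gz"], check=True)

# Sanity check: the interpreter really runs inside the new environment
result = subprocess.run(
    [str(py), "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True, check=True,
)
assert Path(result.stdout.strip()).resolve() == env_dir.resolve()
```

Running the workload inside the venv's interpreter, rather than the system one, is what makes any regression attributable to the RC rather than to a previously installed Spark.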
Re: [VOTE] Spark 2.3.0 (RC4)
+1 too
Re: [VOTE] Spark 2.3.0 (RC4)
+1

On Tue, Feb 20, 2018 at 2:14 PM, Xingbo Jiang wrote:
> +1
>
> Wenchen Fan wrote on Tuesday, February 20, 2018, at 1:09 PM:
>> +1
>>
>> On Tue, Feb 20, 2018 at 12:53 PM, Reynold Xin wrote:
>>> +1
>>>
>>> On Feb 20, 2018, 5:51 PM +1300, Sameer Agarwal wrote:
>>>
>>>> this file shouldn't be included?
>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml
>>>
>>> I've now deleted this file
>>>
>>>> From: Sameer Agarwal
>>>> Sent: Saturday, February 17, 2018 1:43:39 PM
>>>> To: Sameer Agarwal
>>>> Cc: dev
>>>> Subject: Re: [VOTE] Spark 2.3.0 (RC4)
>>>>
>>>> I'll start with a +1 once again.
>>>>
>>>> All blockers reported against RC3 have been resolved and the builds are healthy.
>>>>
>>>> On 17 February 2018 at 13:41, Sameer Agarwal wrote:
>>>>
>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>> version 2.3.0. The vote is open until Thursday, February 22, 2018 at
>>>>> 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>
>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>> [ ] -1 Do not release this package because ...
>>>>>
>>>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>>>
>>>>> The tag to be voted on is v2.3.0-rc4:
>>>>> https://github.com/apache/spark/tree/v2.3.0-rc4
>>>>> (44095cb65500739695b0324c177c19dfa1471472)
>>>>>
>>>>> The list of JIRA tickets resolved in this release can be found here:
>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>>>>
>>>>> The release files, including signatures, digests, etc., can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/
>>>>>
>>>>> Release artifacts are signed with the following key:
>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>
>>>>> The staging repository for this release can be found at:
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1265/
>>>>>
>>>>> The documentation corresponding to this release can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/index.html
>>>>>
>>>>> FAQ
>>>>>
>>>>> ===
>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>> ===
>>>>>
>>>>> Please see https://s.apache.org/oXKi. At the time of writing, there are
>>>>> no known release blockers.
>>>>>
>>>>> =
>>>>> How can I help test this release?
>>>>> =
>>>>>
>>>>> If you are a Spark user, you can help us test this release by taking an
>>>>> existing Spark workload, running it on this release candidate, and
>>>>> reporting any regressions.
>>>>>
>>>>> If you're working in PySpark, you can set up a virtual env and install
>>>>> the current RC to see if anything important breaks. In Java/Scala, you
>>>>> can add the staging repository to your project's resolvers and test with
>>>>> the RC (make sure to clean up the artifact cache before and after so you
>>>>> don't end up building with an out-of-date RC going forward).
>>>>>
>>>>> ===
>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>> ===
>>>>>
>>>>> Committers should look at those and triage. Extremely important bug
>>>>> fixes, documentation, and API tweaks that impact compatibility should be
>>>>> worked on immediately. Everything else, please retarget to 2.3.1 or
>>>>> 2.4.0 as appropriate.
>>>>>
>>>>> ===
>>>>> Why is my bug not fixed?
>>>>> ===
>>>>>
>>>>> In order to make timely releases, we will typically not hold the
>>>>> release unless the bug in question is a regression from 2.2.0. That
>>>>> being said, if there is something that is a regression from 2.2.0 and
>>>>> has not been correctly targeted, please ping me or a committer to help
>>>>> target the issue (you can see the open issues listed as impacting Spark
>>>>> 2.3.0 at https://s.apache.org/WmoI).
>>>>>
>>>>> --
>>>>> Sameer Agarwal
>>>>> Computer Science | UC Berkeley
>>>>> http://cs.berkeley.edu/~sameerag

--
Takuya UESHIN
Tokyo, Japan
http://twitter.com/ueshin
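Since the release files above ship with signatures and SHA-512 digests, it is worth confirming a downloaded artifact against its published digest before running any workloads on it. Below is a minimal sketch; the `verify_checksum` helper name is hypothetical, and it assumes the digest file's first whitespace-separated field is the hex digest (as `sha512sum` would produce) — the actual files under the dist directory may use a different layout.

```shell
# Hedged sketch: recompute the SHA-512 of a downloaded artifact and
# compare it with the first field of the published digest file.
verify_checksum() {
  artifact="$1"
  digest_file="$2"
  expected=$(awk '{print $1; exit}' "$digest_file")
  actual=$(sha512sum "$artifact" | awk '{print $1}')
  [ "$expected" = "$actual" ]
}

# Signatures can additionally be checked with gpg after importing the
# KEYS file linked above, e.g.:
#   gpg --import KEYS
#   gpg --verify <artifact>.asc <artifact>
```

The function returns success only when the recomputed digest matches the published one; any mismatch should be raised on this thread before voting.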
Re: [VOTE] Spark 2.3.0 (RC4)
+1

Wenchen Fan wrote on Tuesday, February 20, 2018, at 1:09 PM:
> +1
Re: [VOTE] Spark 2.3.0 (RC4)
+1

On Tue, Feb 20, 2018 at 12:53 PM, Reynold Xin wrote:
> +1
Re: [VOTE] Spark 2.3.0 (RC4)
+1

On Feb 20, 2018, 5:51 PM +1300, Sameer Agarwal wrote:
>> this file shouldn't be included?
>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml
>
> I've now deleted this file
Re: [VOTE] Spark 2.3.0 (RC4)
> this file shouldn't be included?
> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml

I've now deleted this file.

> From: Sameer Agarwal
> Sent: Saturday, February 17, 2018 1:43:39 PM
> To: Sameer Agarwal
> Cc: dev
> Subject: Re: [VOTE] Spark 2.3.0 (RC4)
>
> I'll start with a +1 once again.
>
> All blockers reported against RC3 have been resolved and the builds are healthy.

--
Sameer Agarwal
Computer Science | UC Berkeley
http://cs.berkeley.edu/~sameerag
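The stray spark-parent_2.11.iml discussed above is IntelliJ metadata that slipped into the staging area. A quick guard against that class of mistake is to scan the staged directory for IDE files before signing; here is a sketch — the `find_ide_files` name is hypothetical and the pattern list is illustrative, not exhaustive.

```shell
# Hedged sketch: list IDE metadata files (IntelliJ .iml/.ipr/.iws and
# .idea directories) found anywhere under a staged release directory.
find_ide_files() {
  staged_dir="$1"
  find "$staged_dir" \( -name '*.iml' -o -name '*.ipr' \
    -o -name '*.iws' -o -name '.idea' \) -print
}
```

An empty result means nothing obviously IDE-generated is staged; any output (such as the spark-parent_2.11.iml here) should be deleted before the artifacts are published.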
Re: [VOTE] Spark 2.3.0 (RC4)
In addition to Hyukjin's `github.io` result, `jekyll` also forwards the search result links correctly:

SKIP_SCALADOC=1 SKIP_PYTHONDOC=1 SKIP_RDOC=1 jekyll serve --watch

Then connect to http://127.0.0.1:4000. This will be the same on the Apache Spark websites.

Bests,
Dongjoon.

On Mon, Feb 19, 2018 at 8:37 PM, vaquar khan wrote:
> +1
>
> Regards,
> Vaquar khan
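The local jekyll serve described above can be sanity-checked by probing a docs directory URL and confirming it resolves (directly or via redirect) to a 200, which is what the generated anchor links depend on. A sketch with curl follows; the `final_status` helper name is hypothetical, and the example port/path are just the ones from the jekyll command above.

```shell
# Hedged sketch: fetch a URL, follow any redirects (-L), discard the
# body, and print the final HTTP status code. Directory URLs that
# serve index.html correctly should end at 200.
final_status() {
  url="$1"
  curl -sL -o /dev/null -w '%{http_code}' "$url"
}

# e.g. final_status "http://127.0.0.1:4000/api/sql/"
```

A 404 (or a 200 that renders a bare directory listing) at the directory URL reproduces the broken-anchor behavior Felix reported.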
Re: [VOTE] Spark 2.3.0 (RC4)
+1

On Mon, Feb 19, 2018 at 10:29 PM, Xiao Li wrote:
> +1.
>
> So far, no function/performance regression in Spark SQL, Core, and PySpark.

--
Regards,
Vaquar Khan
+1-224-436-0783
Greater Chicago
Re: [VOTE] Spark 2.3.0 (RC4)
+1.

So far, no function/performance regression in Spark SQL, Core, and PySpark.

Thanks!

Xiao
Re: [VOTE] Spark 2.3.0 (RC4)
Ah, I see. For 1), I overlooked Felix's input here. I couldn't foresee this when I added this documentation, because it worked in my simple demo:

https://spark-test.github.io/sparksqldoc/search.html?q=approx
https://spark-test.github.io/sparksqldoc/#approx_percentile

I will try to investigate this shortly too.
Re: [VOTE] Spark 2.3.0 (RC4)
For (1) I think it has something to do with https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/ not automatically going to https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/index.html -- So if you see the link to approx_percentile, the link we generate is https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/#approx_percentile -- This doesn't work, as Felix said, but https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/index.html#approx_percentile works.

I'm not sure how this will behave on the main site. FWIW http://spark.apache.org/docs/latest/api/python/ does redirect to http://spark.apache.org/docs/latest/api/python/index.html

Thanks
Shivaram
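The workaround Shivaram describes -- generating links against the explicit `index.html` so the `#approx_percentile` fragment survives -- can be sketched as a small URL rewrite. This is only an illustrative sketch; the helper name `with_index` is my own, not part of the doc generator:

```python
from urllib.parse import urlsplit, urlunsplit

def with_index(url, page="index.html"):
    """If the URL path ends in a directory ("/"), append an explicit page
    name so the #fragment survives on servers that serve a directory
    listing instead of redirecting to index.html."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    if path.endswith("/"):
        path += page
    return urlunsplit((scheme, netloc, path, query, fragment))

base = "https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/"
# Directory-style link gains an explicit index.html before the fragment:
print(with_index(base + "#approx_percentile"))
```

URLs that already name a page (like the working `.../index.html#approx_percentile` form above) pass through unchanged.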
Re: [VOTE] Spark 2.3.0 (RC4)
Ah sorry, I realize my wording was unclear (not enough zzz or coffee).

So to clarify:

1) When searching for a word in the SQL function doc, it does return the search result page correctly; however, none of the links in the results opens the actual doc page. To take the search I included as an example: if you click on approx_percentile, for instance, it opens the web directory instead.

2) The second is that the dist location we are voting on has a .iml file, which is normally not included in a release or release RC, and it is unsigned and without a hash (therefore it seems like it should not be in the release).

Thanks!
Re: [VOTE] Spark 2.3.0 (RC4)
FWIW The search result link works for me

Shivaram
Re: [VOTE] Spark 2.3.0 (RC4)
These are two separate things:

Do the search result links work for you?

The second is that the dist location we are voting on has a .iml file.
Re: [VOTE] Spark 2.3.0 (RC4)
Maybe I misunderstand, but I don't see any .iml file in the 4 results on that page? It looks reasonable.
Re: [VOTE] Spark 2.3.0 (RC4)
Any idea with the sql func docs search results returning broken links as below?

From: Felix Cheung
Sent: Sunday, February 18, 2018 10:05:22 AM
To: Sameer Agarwal; Sameer Agarwal
Cc: dev
Subject: Re: [VOTE] Spark 2.3.0 (RC4)

Quick questions:

Is the search link for sql functions quite right?
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/search.html?q=app

This file shouldn't be included?
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml

From: Sameer Agarwal
Sent: Saturday, February 17, 2018 1:43:39 PM
To: Sameer Agarwal
Cc: dev
Subject: Re: [VOTE] Spark 2.3.0 (RC4)

I'll start with a +1 once again. All blockers reported against RC3 have been resolved and the builds are healthy.

On 17 February 2018 at 13:41, Sameer Agarwal <samee...@apache.org> wrote:

Please vote on releasing the following candidate as Apache Spark version 2.3.0. The vote is open until Thursday, February 22, 2018 at 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.

[ ] +1 Release this package as Apache Spark 2.3.0
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.3.0-rc4:
https://github.com/apache/spark/tree/v2.3.0-rc4
(44095cb65500739695b0324c177c19dfa1471472)

List of JIRA tickets resolved in this release can be found here:
https://issues.apache.org/jira/projects/SPARK/versions/12339551

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/

Release artifacts are signed with the following key:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1265/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/index.html

FAQ

===
What are the unresolved issues targeted for 2.3.0?
===

Please see https://s.apache.org/oXKi. At the time of writing, there are currently no known release blockers.

=
How can I help test this release?
=

If you are a Spark user, you can help us test this release by taking an existing Spark workload and running it on this release candidate, then reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install the current RC, and see if anything important breaks; in Java/Scala, you can add the staging repository to your project's resolvers and test with the RC (make sure to clean up the artifact cache before/after so you don't end up building with an out-of-date RC going forward).

===
What should happen to JIRA tickets still targeting 2.3.0?
===

Committers should look at those and triage. Extremely important bug fixes, documentation, and API tweaks that impact compatibility should be worked on immediately. Everything else, please retarget to 2.3.1 or 2.4.0 as appropriate.

===
Why is my bug not fixed?
===

In order to make timely releases, we will typically not hold the release unless the bug in question is a regression from 2.2.0. That being said, if there is something which is a regression from 2.2.0 and has not been correctly targeted, please ping me or a committer to help target the issue (you can see the open issues listed as impacting Spark 2.3.0 at https://s.apache.org/WmoI).

--
Sameer Agarwal
Computer Science | UC Berkeley
http://cs.berkeley.edu/~sameerag
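Several of the +1 votes in this thread mention checking the signatures and checksums of the release files. Checking a digest amounts to hashing the downloaded artifact and comparing against the published value. A minimal sketch of the hashing side (the artifact file names in the comments are placeholders, not taken from this thread):

```python
import hashlib

def sha512_of(path, chunk=1 << 20):
    """Return the SHA-512 hex digest of a file, streaming in 1 MiB
    chunks so large release tarballs don't need to fit in memory."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

# Placeholder file names: compare against the published digest file
# that sits next to the artifact in the dist directory, e.g.:
#   expected = open("spark-2.3.0.tgz.sha512").read().split()[-1]
#   assert sha512_of("spark-2.3.0.tgz") == expected
```

For the signatures, the usual flow is to import the KEYS file linked above (`gpg --import KEYS`) and then run `gpg --verify <artifact>.asc <artifact>` for each downloaded file.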
Re: [VOTE] Spark 2.3.0 (RC4)
+1.

I tested RC4 on CentOS 7.4 / OpenJDK 1.8.0_161 with `-Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Psparkr`.

Bests,
Dongjoon.
Re: [VOTE] Spark 2.3.0 (RC4)
+1 (non-binding)

Built and tested on macOS and Ubuntu.
Re: [VOTE] Spark 2.3.0 (RC4)
+1 (non-binding)

Built and tested on macOS 10.12.6, Java 8 (build 1.8.0_111). No regressions detected so far.
Re: [VOTE] Spark 2.3.0 (RC4)
+1 from me as last time, same outcome.

I saw one test fail, but it passed on a second run, so it just seems flaky.

- subscribing topic by name from latest offsets (failOnDataLoss: true) *** FAILED ***
  Error while stopping stream:
  query.exception() is not empty after clean stop: org.apache.spark.sql.streaming.StreamingQueryException: Writing job failed.
  === Streaming Query ===
  Identifier: [id = cdd647ec-d7f0-437b-9950-ce9d79d691d1, runId = 3a7cf7ec-670a-48b6-8185-8b6cd7e27f96]
  Current Committed Offsets: {KafkaSource[Subscribe[topic-4]]: {"topic-4":{"2":1,"4":1,"1":0,"3":0,"0":2}}}
  Current Available Offsets: {}

  Current State: TERMINATED
  Thread State: RUNNABLE
Re: [VOTE] Spark 2.3.0 (RC4)
Quick questions:

Is the search link for sql functions quite right?
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/search.html?q=app

This file shouldn't be included?
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml
Re: [VOTE] Spark 2.3.0 (RC4)
I'll start with a +1 once again. All blockers reported against RC3 have been resolved and the builds are healthy.

--
Sameer Agarwal
Computer Science | UC Berkeley
http://cs.berkeley.edu/~sameerag