Yup, no worries. I set the delay to roughly one week with the official
release date in mind :D

On Mon, 9 Sep 2019, 09:45 Dongjoon Hyun, <dongjoon.h...@gmail.com> wrote:

> Thank you, Hyukjin.
>
> +1 for closing according to 2.3.x EOL.
>
> For the timing, please do that after the official 2.3.4 release
> announcement.
>
> Bests,
> Dongjoon.
>
> On Sun, Sep 8, 2019 at 16:27 Sean Owen <sro...@gmail.com> wrote:
>
>> I think simply closing old issues with no activity in a long time is
>> OK. The "Affected Version" is somewhat noisy, so not even particularly
>> important to also query, but yeah I see some value in trying to limit
>> the scope this way.
>>
>> On Sat, Sep 7, 2019 at 10:15 PM Hyukjin Kwon <gurwls...@gmail.com> wrote:
>> >
>> > Hi all,
>> >
>> > We previously resolved JIRAs that target EOL releases (up to Spark
>> 2.2.x) in order to keep the backlog at a manageable size.
>> > Since Spark 2.3.4 will be an EOL release, I plan to do this again in
>> roughly a week.
>> >
>> > JIRAs that have not been updated in the last year and whose affected
>> versions are all EOL releases will be:
>> >   - Resolved with 'Incomplete' status
>> >   - Given a 'bulk-closed' label.
>> >
>> > I plan to use this JQL:
>> >
>> > project = SPARK
>> >   AND status in (Open, "In Progress", Reopened)
>> >   AND (
>> >     affectedVersion = EMPTY OR
>> >     NOT (affectedVersion in versionMatch("^3.*")
>> >       OR affectedVersion in versionMatch("^2.4.*")
>> >     )
>> >   )
>> >   AND updated <= -52w
>> >
>> >
>> > You can click this link to check:
>> >
>> >
>> https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20(affectedVersion%20%3D%20EMPTY%20OR%20NOT%20(affectedVersion%20in%20versionMatch(%22%5E3.*%22)%20OR%20affectedVersion%20in%20versionMatch(%22%5E2.4.*%22)))%20AND%20updated%20%3C%3D%20-52w
>> >
>> > Please let me know if you have any concerns or opinions on this.
>> >
>> > Thanks.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>
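For reference, the long link in Hyukjin's mail is just the JQL query
URL-encoded as the `jql` parameter of JIRA's issue navigator. A minimal
sketch of how such a link can be built, assuming Python's standard library
(`urllib.parse.quote`):

```python
from urllib.parse import quote

# The JQL from the mail, reconstructed on one line.
jql = (
    'project = SPARK'
    ' AND status in (Open, "In Progress", Reopened)'
    ' AND (affectedVersion = EMPTY'
    ' OR NOT (affectedVersion in versionMatch("^3.*")'
    ' OR affectedVersion in versionMatch("^2.4.*")))'
    ' AND updated <= -52w'
)

# JIRA's issue navigator takes the query as a URL-encoded "jql" parameter.
# Leaving parentheses and '*' literal matches the link in the mail above.
url = ("https://issues.apache.org/jira/issues/?jql="
       + quote(jql, safe="()*"))
print(url)
```

Pasting the printed URL into a browser should show the same issue list as
the link quoted above.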
