OK, we can wait a tick to confirm there aren't strong objections.
I suppose I'd prefer someone who knows
https://issues.apache.org/jira/browse/SPARK-28344 to confirm it was
either erroneously targeted to 2.4, or is valid but not critical for
the RC. Hearing nothing else shortly, I'd untarget it.

SPARK-29578 is a tiny low-risk test change but probably worth picking
up to avoid failing on certain JDKs during testing. I'll make a
back-port, as this should be noncontroversial. (Not sure why I didn't
back-port it originally.)

On Wed, Jan 29, 2020 at 3:27 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
>
> Thanks, Sean.
>
> If there is no further objection to the mailing list,
> could you remove the `Target Version: 2.4.5` from the followings?
>
>     SPARK-28344 Fail the query if detect ambiguous self join
>     SPARK-29578 JDK 1.8.0_232 timezone updates cause "Kwajalein" test 
> failures again
>
> Then, after the regular RC preparation testing, including the manual
> integration tests, I can roll 2.4.5 RC2 next Monday (Feb. 3rd, PST), and
> any late blocker patches will block 2.4.6 instead of causing an RC
> failure.
>
> Bests,
> Dongjoon.
>
>
> On Wed, Jan 29, 2020 at 12:16 PM Sean Owen <sro...@gmail.com> wrote:
>>
>> OK, that's specific. It's always a judgment call whether to hold the release 
>> train for one more fix or not. Depends on how impactful it is (harm of 
>> releasing without it), and how big it is (harm of delaying release of other 
>> fixes further). I think we tend to weight regressions from a previous 2.4.x 
>> release more heavily; those are typically Blockers, otherwise not. Otherwise 
>> once RCs start, we're driving primarily to a no-Blocker release. The default 
>> should be to punt to 2.4.6 -- which can come relatively soon if one wants.
>>
>> SPARK-28125 is not even a bug, I'd argue, let alone a Blocker. Looks like
>> it was marked 'correctness' by the reporter. It's always been the case
>> since Spark 1.0 (i.e. not a regression) that RDDs need to be deterministic
>> for most of the semantics one expects to work out. If they aren't, many
>> bets are off. I get that this is a 'gotcha', but it isn't even about
>> randomSplit: if anything recomputes the RDD, the result could be different.
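(Aside: the recomputation hazard discussed above can be sketched outside
Spark. The following is a plain-Python model, not Spark's actual
randomSplit implementation; `split_once` and `rows_fn` are illustrative
names, but the mechanism, each split independently re-evaluating its
parent and filtering by a seeded per-row draw, matches the behavior being
described.)

```python
import random

def split_once(rows_fn, seed=42):
    # Mimic randomSplit's behavior: each of the two splits independently
    # "recomputes" the parent by calling rows_fn(), then keeps the rows
    # whose seeded per-row draw falls in its half of [0, 1).
    def one(lower, upper):
        rng = random.Random(seed)
        return {r for r in rows_fn() if lower <= rng.random() < upper}
    return one(0.0, 0.5), one(0.5, 1.0)

# Deterministic parent: every recomputation returns the same rows, so the
# shared seed guarantees the two splits partition the data exactly.
fixed = list(range(100))
a, b = split_once(lambda: fixed)
assert a & b == set() and a | b == set(fixed)

# Non-deterministic parent: each recomputation yields different rows, so
# the same row can land in both splits, or in neither.
a, b = split_once(lambda: random.sample(range(200), 100))
print(len(a & b), "rows appear in both splits")
```

With a deterministic parent the splits are provably disjoint; as soon as
the parent changes between recomputations, that guarantee silently
disappears, which is the gotcha reported in SPARK-28125.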
>>
>> SPARK-28067 I don't know anything about, but it is also reported as not
>> being a 2.4.x regression, and I don't see anyone working on it. For that
>> reason, I'm not sure it's a Blocker for 2.4.x.
>>
>> SPARK-30310 is not a 2.4.x regression either, nor particularly critical
>> IMHO. That doesn't mean we can't back-port it to 2.4, though, and it's
>> 'done' (in master).
>>
>> Anything else? Not according to JIRA, at least.
>>
>> I think it's valid to continue with RC2, assuming none of these are
>> necessary for 2.4.5. It's not wrong to wait if there are strong feelings
>> about something, but if we can't see a reason to expect the situation to
>> change in a week or two, then why? Releasing 2.4.5 now doesn't push the
>> release of said fix much further away -- it lands in 2.4.6.
>>
>> On Wed, Jan 29, 2020 at 1:28 PM Dongjoon Hyun <dongjoon.h...@gmail.com> 
>> wrote:
>>>
>>>     > SPARK-28125 dataframes created by randomSplit have overlapping rows
>>>     >     Seems like something we should fix
>>>     > SPARK-28067 Incorrect results in decimal aggregation with whole-stage 
>>> code gen enabled
>>>     >     Seems like we should fix
>>>
>>> Here, I'm trying to narrow our focus down to the issues with an explicit
>>> Target Version and continue the release. In other words, as a release
>>> manager, I hope I can officially set aside the other correctness issues
>>> which are not explicitly targeted to 2.4.5.
>>>
>>> Most correctness issues are long-standing and cause behavior changes.
>>> During a maintenance RC vote, for those kinds of issues, I hope we set the
>>> Target Version to `2.4.6` instead of vetoing the RC. It's the same policy
>>> as with Fix Version: during the RC vote period, Fix Version is set to the
>>> next version, `2.4.6`, instead of the current RC, `2.4.5`. Since
>>> maintenance releases happen more frequently, I believe that's okay.
