In that case, could you mark that bug as a blocker and set the target version?

On Thu, Jun 28, 2018 at 8:46 AM, Felix Cheung <felixcheun...@hotmail.com>
wrote:

> +1
>
> I’d like to fix SPARK-24535 first though
>
> ------------------------------
> *From:* Stavros Kontopoulos <stavros.kontopou...@lightbend.com>
> *Sent:* Thursday, June 28, 2018 3:50:34 AM
> *To:* Marco Gaido
> *Cc:* Takeshi Yamamuro; Xingbo Jiang; Wenchen Fan; Spark dev list; Saisai
> Shao; van...@cloudera.com.invalid
> *Subject:* Re: Time for 2.3.2?
>
> +1 makes sense.
>
> On Thu, Jun 28, 2018 at 12:07 PM, Marco Gaido <marcogaid...@gmail.com>
> wrote:
>
>> +1 too. I'd also consider including SPARK-24208 if we can solve it
>> in time...
>>
>> 2018-06-28 8:28 GMT+02:00 Takeshi Yamamuro <linguin....@gmail.com>:
>>
>>> +1, I heard some Spark users have skipped v2.3.1 because of these bugs.
>>>
>>> On Thu, Jun 28, 2018 at 3:09 PM Xingbo Jiang <jiangxb1...@gmail.com>
>>> wrote:
>>>
>>>> +1
>>>>
>>>> Wenchen Fan <cloud0...@gmail.com> wrote on Thursday, June 28, 2018 at 2:06 PM:
>>>>
>>>>> Hi Saisai, that's great! Please go ahead!
>>>>>
>>>>> On Thu, Jun 28, 2018 at 12:56 PM Saisai Shao <sai.sai.s...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> +1. As Marcelo mentioned, these issues seem quite severe.
>>>>>>
>>>>>> I can work on the release if we're short of hands :).
>>>>>>
>>>>>> Thanks
>>>>>> Jerry
>>>>>>
>>>>>>
>>>>>> Marcelo Vanzin <van...@cloudera.com.invalid> wrote on Thursday, June 28, 2018 at 11:40 AM:
>>>>>>
>>>>>>> +1. SPARK-24589 / SPARK-24552 are kinda nasty and we should get fixes
>>>>>>> for those out.
>>>>>>>
>>>>>>> (Those are what delayed 2.2.2 and 2.1.3 for those watching...)
>>>>>>>
>>>>>>> On Wed, Jun 27, 2018 at 7:59 PM, Wenchen Fan <cloud0...@gmail.com>
>>>>>>> wrote:
>>>>>>> > Hi all,
>>>>>>> >
>>>>>>> > Spark 2.3.1 was released just a while ago, but unfortunately we
>>>>>>> > discovered and fixed some critical issues afterward.
>>>>>>> >
>>>>>>> > SPARK-24495: SortMergeJoin may produce a wrong result.
>>>>>>> > This is a serious correctness bug, and it is easy to hit: have a
>>>>>>> > duplicated join key from the left table, e.g. `WHERE t1.a = t2.b AND
>>>>>>> > t1.a = t2.c`, and the join is a sort merge join. This bug is only
>>>>>>> > present in Spark 2.3.
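>>>>>>> >
>>>>>>> > To make the shape concrete, here is a minimal sketch of a query that
>>>>>>> > matches this pattern (the table names, data, and Scala wrapper are
>>>>>>> > made up for illustration, not taken from the JIRA):
>>>>>>> >
>>>>>>> > import org.apache.spark.sql.SparkSession
>>>>>>> >
>>>>>>> > val spark = SparkSession.builder().appName("spark-24495-shape").getOrCreate()
>>>>>>> > import spark.implicits._
>>>>>>> >
>>>>>>> > // Disable broadcast joins so this tiny example actually plans a
>>>>>>> > // sort merge join.
>>>>>>> > spark.conf.set("spark.sql.autoBroadcastJoinThreshold", "-1")
>>>>>>> >
>>>>>>> > Seq(1, 2, 3).toDF("a").createOrReplaceTempView("t1")
>>>>>>> > Seq((1, 1), (2, 2), (3, 4)).toDF("b", "c").createOrReplaceTempView("t2")
>>>>>>> >
>>>>>>> > // The same left-side key t1.a appears in both equality conditions.
>>>>>>> > spark.sql(
>>>>>>> >   "SELECT * FROM t1 JOIN t2 ON t1.a = t2.b AND t1.a = t2.c").show()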
>>>>>>> >
>>>>>>> > SPARK-24588: stream-stream join may produce a wrong result
>>>>>>> > This is a correctness bug in a new feature of Spark 2.3: the
>>>>>>> > stream-stream join. Users can hit this bug if one of the join sides
>>>>>>> > is partitioned by a subset of the join keys.
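>>>>>>> >
>>>>>>> > As a rough sketch only (the rate source, column names, and the
>>>>>>> > repartition call are assumptions for illustration, not taken from the
>>>>>>> > JIRA), the problematic shape is a stream-stream join where one input
>>>>>>> > is partitioned by only part of the join key:
>>>>>>> >
>>>>>>> > import org.apache.spark.sql.SparkSession
>>>>>>> > import org.apache.spark.sql.functions.col
>>>>>>> >
>>>>>>> > val spark = SparkSession.builder().appName("spark-24588-shape").getOrCreate()
>>>>>>> >
>>>>>>> > // Two toy streaming inputs that will be joined on (k1, k2).
>>>>>>> > val left = spark.readStream.format("rate").load()
>>>>>>> >   .select(col("value").as("k1"), (col("value") % 10).as("k2"))
>>>>>>> >   .repartition(col("k1"))  // partitioned by k1 only, a subset of the join keys
>>>>>>> > val right = spark.readStream.format("rate").load()
>>>>>>> >   .select(col("value").as("k1"), (col("value") % 10).as("k2"))
>>>>>>> >
>>>>>>> > // The join itself uses both keys (k1 and k2).
>>>>>>> > val joined = left.join(right, Seq("k1", "k2"))
>>>>>>> > // Starting the query (joined.writeStream...) is omitted in this sketch.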
>>>>>>> >
>>>>>>> > SPARK-24552: Task attempt numbers are reused when stages are retried
>>>>>>> > This is a long-standing bug in the output committer that may
>>>>>>> > introduce data corruption.
>>>>>>> >
>>>>>>> > SPARK-24542: UDFXPathXXXX allows users to pass carefully crafted XML
>>>>>>> > to access arbitrary files
>>>>>>> > This is a potential security issue if users build an access control
>>>>>>> > module on top of Spark.
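>>>>>>> >
>>>>>>> > For readers who have not used them, a benign sketch of these
>>>>>>> > Hive-compatible xpath UDFs in Spark SQL (the vulnerability is about
>>>>>>> > what carefully crafted XML input can make them read, which is not
>>>>>>> > shown here):
>>>>>>> >
>>>>>>> > // Assuming the `spark` session from the sketches above.
>>>>>>> > // Returns "bb": extracts the text of <b> from the inline XML.
>>>>>>> > spark.sql("SELECT xpath_string('<a><b>bb</b></a>', 'a/b')").show()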
>>>>>>> >
>>>>>>> > I think we need a Spark 2.3.2 to address these issues (especially
>>>>>>> > the correctness bugs) ASAP. Any thoughts?
>>>>>>> >
>>>>>>> > Thanks,
>>>>>>> > Wenchen
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Marcelo
>>>>>>>
>>>>>>>
>>>>>>>
>>>
>>> --
>>> ---
>>> Takeshi Yamamuro
>>>
>>
>>
>
>
> --
> Stavros Kontopoulos
>
> *Senior Software Engineer *
> *Lightbend, Inc. *
>
> *p: +30 6977967274*
> *e: stavros.kontopou...@lightbend.com*
>
>
>


-- 
Marcelo
