https://github.com/apache/spark/pull/22514 sounds like a regression that
affects the Hive CTAS write path (such queries are no longer converted to
Spark's internal data sources, hence a performance regression),
but I'm not sure we should block the release on this.

https://github.com/apache/spark/pull/22144 is still under discussion, if I am
not mistaken.

Thanks.

On Wed, Oct 24, 2018 at 12:27 AM, Xiao Li <gatorsm...@gmail.com> wrote:

> https://github.com/apache/spark/pull/22144 is also not a blocker of Spark
> 2.4 release, as discussed in the PR.
>
> Thanks,
>
> Xiao
>
> On Tue, Oct 23, 2018 at 9:20 AM, Xiao Li <gatorsm...@gmail.com> wrote:
>
>> Thanks for reporting this. https://github.com/apache/spark/pull/22514 is
>> not a blocker. We can fix it in the next minor release, if we are unable to
>> make it in this release.
>>
>> Thanks,
>>
>> Xiao
>>
>> On Tue, Oct 23, 2018 at 9:14 AM, Sean Owen <sro...@gmail.com> wrote:
>>
>>> (I should add, I only observed this with the Scala 2.12 build; it all
>>> seemed to work with 2.11, so I'm not too worried about it. I don't
>>> think it's a Scala version issue, but perhaps something is looking
>>> for a Spark 2.11 tarball and not finding it. See
>>> https://github.com/apache/spark/pull/22805#issuecomment-432304622 for
>>> a change that might address this kind of thing.)
>>>
>>> On Tue, Oct 23, 2018 at 11:05 AM Sean Owen <sro...@gmail.com> wrote:
>>> >
>>> > Yeah, that's maybe the issue here. This is a source release, not a git
>>> checkout, and it still needs to work in this context.
>>> >
>>> > I just added -Pkubernetes to my build and didn't do anything else. I
>>> think the ideal is that "mvn -P... -P... install" works from a source
>>> release; that's a good expectation and consistent with the docs.
>>> >
>>> > Maybe these tests simply don't need to run with the normal test
>>> suite, and can be treated as tests that developers run manually via these
>>> scripts? Basically, KubernetesSuite shouldn't run in a normal mvn install?
>>> >
>>> > I don't think this has to block the release even if so, just trying to
>>> get to the bottom of it.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>
>>>
