Spark 3.0 will still use the Hadoop 2.7 profile by default, I think. The
Hadoop 2.7 profile is much more stable than the Hadoop 3.2 profile.
On Thu, Oct 31, 2019 at 3:54 PM Sean Owen wrote:
> This isn't a big thing, but I see that the pyspark build includes
> Hadoop 2.7 rather than 3.2. Maybe later we ...
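For context, the Hadoop profile is a build-time switch. A sketch of the relevant commands, assuming a Spark source checkout; the profile names match the Spark 3.0 build, but the exact `make-distribution.sh` flags may vary by branch:

```shell
# Build Spark against the hadoop-3.2 profile instead of the default hadoop-2.7
# (-DskipTests just speeds up the build).
./build/mvn -Phadoop-3.2 -DskipTests clean package

# Sketch of producing a pip-installable pyspark distribution with Hadoop 3.2;
# the --name value here is arbitrary.
./dev/make-distribution.sh --name hadoop3.2 --pip --tgz -Phadoop-3.2
```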
This isn't a big thing, but I see that the pyspark build includes
Hadoop 2.7 rather than 3.2. Maybe later we can change the build to put
3.2 in by default.
Otherwise, the tests all seem to pass with JDK 8 / 11 with all
profiles enabled, so I'm +1 on it.
On Thu, Oct 31, 2019 at 1:00 AM Xingbo Jiang wrote: ...
On Thu, Oct 31, 2019 at 4:30 PM Sean Owen wrote:
>
> ... But it'd be cooler to call these major
> releases!
Maybe this is just semantics, but my point is that the Scala project
already does call 2.12 to 2.13 a major release;
e.g., from https://www.scala-lang.org/download/:
"Note that different *major* ..."
Yep, it's worse than that. Code compiled for 2.x is _not allowed_ to
work with 2.(x+1). I say this with all love for Scala, and total
respect for the fact that big improvements in what Scala does necessarily
mean bytecode-level incompatibility. But it'd be cooler to call these major
releases! Even in Java, ...
On Wed, Oct 30, 2019 at 5:57 PM Sean Owen wrote:
> Or, frankly, maybe Scala should reconsider the mutual incompatibility
> between minor releases. These are basically major releases, and
> indeed, it causes exactly this kind of headache.
>
Not saying binary incompatibility is fun, but 2.12 to ...
I'm currently testing PyPy3.6 v7.2.0 with this pull request:
https://github.com/apache/spark/pull/26330
On Wed, Oct 30, 2019 at 2:31 PM Maciej Szymkiewicz
wrote:
> Could we upgrade to PyPy3.6 v7.2.0?
> On 10/30/19 9:45 PM, Shane Knapp wrote:
>
> one quick thing: we currently test against ...
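A sketch of pointing the PySpark test runner at a PyPy interpreter; `python/run-tests` and its `--python-executables` flag exist in the Spark tree, but the `pypy3` executable name is an assumption about what is installed on the workers:

```shell
# Run the PySpark test suite under PyPy3 instead of CPython
# (assumes a `pypy3` binary on PATH; run from a Spark checkout).
./python/run-tests --python-executables=pypy3
```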
+1
On Thu, Oct 31, 2019 at 11:21 AM Bryan Cutler wrote:
> +1 for deprecating
>
> On Wed, Oct 30, 2019 at 2:46 PM Shane Knapp wrote:
>
>> sure. that shouldn't be too hard, but we've historically given very
>> little support to it.
>>
>> On Wed, Oct 30, 2019 at 2:31 PM Maciej Szymkiewicz wrote: ...
+1 for deprecating
On Wed, Oct 30, 2019 at 2:46 PM Shane Knapp wrote:
> sure. that shouldn't be too hard, but we've historically given very
> little support to it.
>
> On Wed, Oct 30, 2019 at 2:31 PM Maciej Szymkiewicz
> wrote:
>
>> Could we upgrade to PyPy3.6 v7.2.0?
>> On 10/30/19 9:45 PM, Shane Knapp wrote: ...
Please vote on releasing the following candidate as Apache Spark version
3.0.0-preview.
The vote is open until November 3 PST and passes if a majority of +1 PMC
votes are cast, with a minimum of 3 +1 votes.
[ ] +1 Release this package as Apache Spark 3.0.0-preview
[ ] -1 Do not release this package