In theory Spark 2.4 supports Python 3.4; would this mean it's now just tested against 3.6? That's not out of the question, but can the older branches continue to test on older Python versions, or is that super complex?
On Wed, Apr 10, 2019 at 1:37 PM shane knapp <skn...@berkeley.edu> wrote:
>
> details here (see most recent comments for current state of things):
> https://issues.apache.org/jira/browse/SPARK-25079
>
> my PR for these changes:
> https://github.com/apache/spark/pull/24266
>
> we're doing this because we need to support arrow 0.12.1:
> https://issues.apache.org/jira/browse/SPARK-27276
>
> a couple of things:
>
> * i won't be switching things over until april 11th or 12th at the earliest.
> * this change will require a short (~15-20min) downtime to switch over from
>   3.4 to 3.6.
> * updating python will impact all active branches, so i will need to
>   backport this pr: https://github.com/apache/spark/pull/24266
>
> question: which other branches should i be applying this change to?
> definitely master and 2.4... also 2.3? 2.2?
>
> thanks in advance,
>
> shane
> --
> Shane Knapp
> UC Berkeley EECS Research / RISELab Staff Technical Lead
> https://rise.cs.berkeley.edu

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
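(As an aside: the interpreter floor being raised here could be sketched as a minimal pre-test check. This script is purely illustrative and not part of the linked PRs or the Jenkins setup; only the 3.6 floor comes from the thread.)

```python
# Hypothetical sanity check a CI job might run before the PySpark test
# suite: confirm the interpreter meets the new 3.6 floor. Illustrative
# only -- not taken from the linked PRs.
import sys

MIN_PYTHON = (3, 6)


def check_python(version_info=sys.version_info, minimum=MIN_PYTHON):
    """Return True if the interpreter's (major, minor) meets the minimum."""
    return tuple(version_info[:2]) >= minimum


if not check_python():
    raise SystemExit("Python %d.%d+ required" % MIN_PYTHON)
print("Python version OK")
```

A check like this fails fast with a clear message instead of letting tests die later on a 3.6-only syntax error.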