On Mon, Feb 23, 2015 at 11:36 AM, Nicholas Chammas <
nicholas.cham...@gmail.com> wrote:

> The first concern for Spark will probably be to ensure that we still build
> and test against Python 2.6, since that's the minimum version of Python we
> support.
>
sounds good...  we can set up separate 2.6 builds on specific versions.
this could allow you to easily differentiate between "baseline" and
"latest and greatest" if you wanted.  it'll add a bit more
administrative overhead, since more jobs will need configs, but it offers
more flexibility.
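(a setup like the one above could be sketched with a tox config that keeps
a "baseline" 2.6 environment alongside the latest interpreter.  purely
illustrative -- the env names, deps, and test command here are assumptions,
not the actual Spark Jenkins job configs:)

```ini
# tox.ini -- illustrative sketch only, not Spark's real CI config
[tox]
envlist = py26, py27   ; "baseline" and "latest and greatest" jobs

[testenv]
deps = numpy           ; PySpark uses numpy, version unpinned
commands = python -m pytest tests/   ; hypothetical test entry point
```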

let me know what you think.


> Otherwise this seems OK. We use numpy and other Python packages in
> PySpark, but I don't think we're pinned to any particular version of those
> packages.
>
cool.  i'll start mucking about and let you guys know how it goes.

shane
