Hi Felix,

I agree that the hard requirement on Spark 2.1 is a bit restrictive. If
someone can validate that the 2.0.x releases (that is, versions >= 2.0.0 and
< 2.1) work, relaxing the check sounds like a great idea to me.
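For illustration, here is a minimal sketch (in Java) of what a relaxed
minimum-version check might look like. The class, method, and constant names
are hypothetical, not SystemML's actual MLContext API:

    import org.apache.spark.sql.SparkSession;

    public class SparkVersionCheck {

        // Minimum supported version under the proposed change (assumption).
        private static final String MIN_SPARK_VERSION = "2.0";

        /** Returns true if sparkVersion is at least MIN_SPARK_VERSION. */
        public static boolean isSparkVersionSupported(String sparkVersion) {
            int[] actual = parseMajorMinor(sparkVersion);
            int[] minimum = parseMajorMinor(MIN_SPARK_VERSION);
            return actual[0] > minimum[0]
                    || (actual[0] == minimum[0] && actual[1] >= minimum[1]);
        }

        /** Parses "major.minor[.patch]" into {major, minor}. */
        private static int[] parseMajorMinor(String version) {
            String[] parts = version.split("\\.");
            return new int[] {
                Integer.parseInt(parts[0]), Integer.parseInt(parts[1])
            };
        }

        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("version-check").master("local").getOrCreate();
            System.out.println("Spark " + spark.version() + " supported: "
                    + isSparkVersionSupported(spark.version()));
            spark.stop();
        }
    }

A comparison like this accepts any 2.0.x or later release instead of
failing on everything below 2.1.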

Deron


On Mon, Feb 20, 2017 at 1:43 PM, <fschue...@posteo.de> wrote:

> Hi,
>
> The current master and the 0.13 release have a hard requirement in
> MLContext for Spark 2.1. Is this really necessary, or could we set it to
> >= 2.0? Only supporting the latest Spark release seems a little
> restrictive to me.
>
>
> -Felix
>



-- 
Deron Eriksson
Spark Technology Center
http://www.spark.tc/
