That's a good catch, Felix! I would recommend casting this exception to a
warning and moving the check to a central place like SparkExecutionContext
to ensure consistency across all APIs and deployments.
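
Roughly, something like the following could work (just a sketch; the class,
method, and constant names below are placeholders, not our actual code, and
the exact minimum version would follow from the validation discussed below):

    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;
    import org.apache.spark.SparkContext;

    public class SparkVersionCheck {
      private static final Log LOG = LogFactory.getLog(SparkVersionCheck.class);
      // assumed minimum; subject to validation of the 2.0.x line
      private static final String MIN_SPARK_VERSION = "2.0.0";

      // centralized check: warn instead of throwing, so all APIs and
      // deployments behave consistently on older (but workable) versions
      public static void checkSparkVersion(SparkContext sc) {
        if (compareVersion(sc.version(), MIN_SPARK_VERSION) < 0)
          LOG.warn("Spark " + sc.version() + " is below the minimum supported "
            + "version " + MIN_SPARK_VERSION + "; some features may not work.");
      }

      // naive dotted-version comparison; ignores suffixes like -SNAPSHOT
      private static int compareVersion(String v1, String v2) {
        String[] a = v1.split("[.\\-]"), b = v2.split("[.\\-]");
        for (int i = 0; i < Math.max(a.length, b.length); i++) {
          int x = i < a.length && a[i].matches("\\d+") ? Integer.parseInt(a[i]) : 0;
          int y = i < b.length && b[i].matches("\\d+") ? Integer.parseInt(b[i]) : 0;
          if (x != y)
            return Integer.compare(x, y);
        }
        return 0;
      }
    }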

Regards,
Matthias




From:   Deron Eriksson <deroneriks...@gmail.com>
To:     dev@systemml.incubator.apache.org
Date:   02/20/2017 02:14 PM
Subject:        Re: Minimum required Spark version



Hi Felix,

I agree that the 2.1 hard requirement is a bit restrictive. If someone can
validate that Spark versions in the 2.0.x line (at least 2.0 but below 2.1)
work, this seems like a great idea to me.
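
For validation, a quick smoke test along these lines, run against a 2.0.x
installation, would probably suffice (a rough sketch; it assumes the new
MLContext API and a trivial DML script, so adjust as needed):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.sysml.api.mlcontext.MLContext;
    import org.apache.sysml.api.mlcontext.Script;
    import org.apache.sysml.api.mlcontext.ScriptFactory;

    public class SparkVersionSmokeTest {
      public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(
          new SparkConf().setAppName("SystemML smoke test").setMaster("local[*]"));
        System.out.println("Spark version: " + sc.version());
        // run a trivial DML script through MLContext to exercise the stack
        MLContext ml = new MLContext(sc);
        Script s = ScriptFactory.dml("x = sum(seq(1, 100)); print('sum: ' + x);");
        ml.execute(s);
        sc.stop();
      }
    }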

Deron


On Mon, Feb 20, 2017 at 1:43 PM, <fschue...@posteo.de> wrote:

> Hi,
>
> the current master and 0.13 release have a hard requirement in MLContext
> for Spark 2.1. Is this really necessary or could we set it to >= 2.0?
> Only supporting the latest Spark release seems a little restrictive to me.
>
>
> -Felix
>



--
Deron Eriksson
Spark Technology Center
http://www.spark.tc/

