Here's the list of packages I see breaking on
https://ci.bigtop.apache.org/job/Bigtop-trunk-packages/ due to the Spark 2
upgrade; these can probably be changed temporarily to depend upon spark1
instead of spark:

* crunch
* hive
* ignite-hadoop
* mahout
* phoenix
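
As a rough sketch of the temporary switch (the file path and dependency names below are assumptions for illustration, not verified against the actual Bigtop tree), the change for each broken package would look something like this in its deb control file:

```diff
# Hypothetical excerpt from bigtop-packages/src/deb/crunch/control
# (path and exact package names are assumptions, not verified)
 Package: crunch
-Depends: hadoop-client, spark-core
+Depends: hadoop-client, spark1-core
```

The equivalent `Requires:` line in each package's RPM spec would presumably need the same spark -> spark1 substitution.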

The spark package itself is also failing because I seem to have missed a
change regarding the removal of --pyspark-python from install_spark.sh in
my https://issues.apache.org/jira/browse/BIGTOP-2569 patch.

Lastly, zeppelin is failing because
https://issues.apache.org/jira/browse/BIGTOP-2514 has not been committed
yet.

Did I miss anything?

~ Jonathan

On Thu, Nov 17, 2016 at 9:43 AM Jonathan Kelly <[email protected]>
wrote:

> What are the build breaks? Are you referring to applications that depend
> upon Spark but don't work with Spark 2 yet? There are some patches that I
> can probably contribute for fixing these build breaks (your option #2,
> sorta).
>
> Btw, if there are problems that we can't quite fix very well with option
> #2, I'd prefer changing the broken applications to depend upon spark1
> rather than your option #1/3. We decided on
> https://issues.apache.org/jira/browse/BIGTOP-2569 that it's best not to
> have Spark 2 as "spark2", and I'd argue that it's best not to do that even
> temporarily (option #3).
>
>
> ~ Jonathan
>
> On Thu, Nov 17, 2016 at 8:30 AM MrAsanjar . <[email protected]> wrote:
>
> Guys, we have three choices:
> 1) Make Spark 1.6.2 the default spark component and create a new spark2
> component for Spark 2.0.1
> 2) Upgrade all components to versions that support Spark 2.
> 3) Option 1 as a short-term fix while gradually working on option 2
>
> Your thoughts ?
>
>