Could we upgrade to Spark 2.0.2? It looks like it was released a few days
ago... is it stable enough?

On Thu, Nov 17, 2016 at 7:02 PM, MrAsanjar . <[email protected]> wrote:

> Great, let's divide and conquer.
> I will do Phoenix and Hive... going to open JIRAs for five components.
>
> On Thu, Nov 17, 2016 at 6:59 PM, MrAsanjar . <[email protected]> wrote:
>
>> Hive, Phoenix, ignite-hadoop, crunch, and zeppelin.
>> Jonathan, PLEASE don't take this the wrong way; no one could have
>> predicted that. Personally, the only one I knew was Zeppelin :) Anyway, I
>> am starting with Phoenix. I believe if we get more community members
>> involved, we can upgrade all components in no time. We have to do it for
>> Bigtop Release v1.2, anyway.
>>
>> On Thu, Nov 17, 2016 at 6:43 PM, Jonathan Kelly <[email protected]>
>> wrote:
>>
>>> What are the build breaks? Are you referring to applications that depend
>>> upon Spark but don't work with Spark 2 yet? There are some patches that I
>>> can probably contribute for fixing these build breaks (your option #2,
>>> sorta).
>>>
>>> Btw, if there are problems that we can't quite fix very well with option
>>> #2, I'd prefer changing the broken applications to depend upon spark1
>>> rather than your option #1/3. We decided on
>>> https://issues.apache.org/jira/browse/BIGTOP-2569 that it's best not to
>>> have Spark 2 as "spark2", and I'd argue that it's best not to do that
>>> even
>>> temporarily (option #3).
>>>
>>> ~ Jonathan
>>>
>>> On Thu, Nov 17, 2016 at 8:30 AM MrAsanjar . <[email protected]> wrote:
>>>
>>> > Guys we have three choices:
>>> > 1) Make Spark 1.6.2 the default spark component and create a new spark2
>>> > component for Spark 2.0.1
>>> > 2) Upgrade all components to the versions that support Spark 2.
>>> > 3) Option 1 as a short-term fix while we gradually work on option 2.
>>> >
>>> > Your thoughts?
>>> >
>>>
>>
>>
>
