Hi Jonathan,

Regarding your hotfix:

Please do not modify or upload patches on resolved JIRAs.

Please open a new one, attach the patch there, and don't forget to use a
proper summary. This will increase your karma!

I am on the flight/rail back and will handle this tomorrow evening (my CET
time zone).

Olaf

> On 17.11.2016 at 19:11, Jonathan Kelly <[email protected]> wrote:
> 
> Not sure how you want to handle this, but I just uploaded another patch to
> https://issues.apache.org/jira/browse/BIGTOP-2569 that fixes the issue with
> the Spark RPM build. Sorry about missing this the first time.
> 
> As for dividing and conquering, unfortunately I can't really work on fixing
> the other apps right now. :(
> 
> As for Spark 2.0.2, yes, we can upgrade to that now. I've already tested
> it, and it's as simple as bumping the version in bigtop.bom.
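> 
> For illustration, the whole diff is roughly the one version line below (a
> sketch modeled on the existing 'spark' entry in bigtop.bom; the field
> names and tarball layout are assumed to stay as they are today):
> 
>     'spark' {
>       name     = 'spark'
>       pkg      = 'spark-core'
>       relNotes = 'Apache Spark'
>       // the only edit: bump the base version from 2.0.1 to 2.0.2
>       version { base = '2.0.2'; pkg = base; release = 1 }
>       tarball { destination = "$name-${version.base}.tar.gz"
>                 source      = "$name-${version.base}.tgz" }
>     }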
> 
> Thanks,
> Jonathan
> 
>> On Thu, Nov 17, 2016 at 10:06 AM MrAsanjar . <[email protected]> wrote:
>> 
>> Could we upgrade to Spark 2.0.2? It looks like it was released a few
>> days ago. Is it stable enough?
>> 
>>> On Thu, Nov 17, 2016 at 7:02 PM, MrAsanjar . <[email protected]> wrote:
>>> 
>>> Great, let's divide and conquer.
>>> I will do Phoenix and Hive. Going to open JIRAs for five components.
>>> 
>>>> On Thu, Nov 17, 2016 at 6:59 PM, MrAsanjar . <[email protected]> wrote:
>>>> 
>>>> Hive, Phoenix, ignite-hadoop, Crunch, and Zeppelin.
>>>> Jonathan, PLEASE don't take this the wrong way; no one could have
>>>> predicted that. Personally, the only one I knew about was Zeppelin :)
>>>> Anyway, I am starting with Phoenix. I believe if we get more community
>>>> members involved, we can upgrade all the components in no time. We have
>>>> to do it for Bigtop release v1.2 anyway.
>>>> 
>>>> On Thu, Nov 17, 2016 at 6:43 PM, Jonathan Kelly <[email protected]>
>>>> wrote:
>>>> 
>>>>> What are the build breaks? Are you referring to applications that
>>>>> depend upon Spark but don't work with Spark 2 yet? There are some
>>>>> patches that I can probably contribute for fixing these build breaks
>>>>> (your option #2, sorta).
>>>>> 
>>>>> Btw, if there are problems that we can't quite fix very well with
>>>>> option #2, I'd prefer changing the broken applications to depend upon
>>>>> spark1 rather than your option #1/3. We decided in
>>>>> https://issues.apache.org/jira/browse/BIGTOP-2569 that it's best not
>>>>> to have Spark 2 as "spark2", and I'd argue that it's best not to do
>>>>> that even temporarily (option #3).
>>>>> 
>>>>> ~ Jonathan
>>>>> 
>>>>> On Thu, Nov 17, 2016 at 8:30 AM MrAsanjar . <[email protected]> wrote:
>>>>> 
>>>>>> Guys, we have three choices:
>>>>>> 1) Make Spark 1.6.2 the default spark component and create a new
>>>>>> spark2 component for Spark 2.0.1 (sketched below).
>>>>>> 2) Upgrade all components to the versions that support Spark 2.
>>>>>> 3) Option 1 as a short-term fix while gradually working on option 2.
>>>>>> 
>>>>>> Your thoughts?
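>>>>>> 
>>>>>> To make option 1 concrete: it would roughly mean two entries in
>>>>>> bigtop.bom, sketched below (illustrative only; the exact fields would
>>>>>> mirror the current 'spark' entry, and the new component would also
>>>>>> need its own packaging code):
>>>>>> 
>>>>>>     'spark' {    // stays the default, pinned to the 1.x line
>>>>>>       name    = 'spark'
>>>>>>       pkg     = 'spark-core'
>>>>>>       version { base = '1.6.2'; pkg = base; release = 1 }
>>>>>>     }
>>>>>>     'spark2' {   // hypothetical new component for the 2.x line
>>>>>>       name    = 'spark2'
>>>>>>       pkg     = 'spark2-core'
>>>>>>       version { base = '2.0.1'; pkg = base; release = 1 }
>>>>>>     }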
>>>>>> 
