So this is a resounding: "Maybe"? :)

I think we should change the default. HBase should be built from source for a 
specific version of Hadoop anyway.
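For reference, building 0.94 against a particular Hadoop release is just a Maven property/profile switch. The exact property and profile names below are from memory, so double-check them against the 0.94 pom:

```shell
# Build HBase 0.94 against a specific Hadoop 1.x release by overriding
# the Hadoop version property (property name assumed from the 0.94 pom):
mvn clean install -DskipTests -Dhadoop.version=1.2.1

# For Hadoop 2.x, the 0.94 build is assumed to use a separate profile:
mvn clean install -DskipTests -Dhadoop.profile=2.0
```

Either way, the resulting jar is tied to the Hadoop it was compiled against, which is why dropping a default-built jar onto a different cluster can break.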

Ted said +1.

Any other opinions?


-- Lars



________________________________
 From: Jean-Marc Spaggiari <[email protected]>
To: dev <[email protected]>; lars hofhansl <[email protected]> 
Sent: Wednesday, October 9, 2013 5:13 PM
Subject: Re: Default hadoop version for 0.94?
 


That's on my roadmap, but I'm lacking time for it ;) I bought a new cluster 
where I will install Hadoop 1.2.1 or 2.x and replicate my current cluster to 
it, but I still need some time to do that :(

But the issue is not just me. I can deal with that myself, but others might 
be in the same situation?




2013/10/9 lars hofhansl <[email protected]>

That's what I was trying to find out :)
>You do not want to upgrade to Hadoop 1.2.1?
>
>
>-- Lars
>
>
>
>________________________________
> From: Jean-Marc Spaggiari <[email protected]>
>To: dev <[email protected]>; lars hofhansl <[email protected]>
>Sent: Wednesday, October 9, 2013 4:04 PM
>Subject: Re: Default hadoop version for 0.94?
>
>
>
Hum. I usually just deploy the .jar on my cluster (Hadoop 1.0.3) without
>rebuilding anything. Not sure if others are doing the same. That
>change would break compatibility with the previous version, no?
>
>
>
>2013/10/9 lars hofhansl <[email protected]>
>
>> Should we default the HBase 0.94 builds to Hadoop 1.2.x (1.2.1 currently)?
>> It's the current stable release of Hadoop. You can't even download 1.0.4
>> anymore unless you navigate to the archive section.
>>
>> This would just apply to the sample packages in the HBase download
>> section. Users should really build HBase against their own version of
>> Hadoop anyway.
>>
>> Comments?
>>
>>
>> -- Lars
>>
