Agreed. It's always best to build both Hadoop and HBase from source, specifying the right version on the Maven command line as you do, and then to take the extra step of (re)packaging with the native libs included.
On Thu, Mar 29, 2018 at 3:21 PM, rahul gidwani <[email protected]> wrote:

> Hi Andrew,
>
> I was thinking about doing the same thing as you are doing now. It seems
> like the safest approach; it will allow us to upgrade our Hadoop binaries
> independently, without having to worry about doing anything with HBase.
>
> I think this should be the recommended approach for building HBase as well.
> I can see operators forgetting to upgrade the Hadoop jars packaged in HBase
> when doing a minor Hadoop upgrade. API compatibility between Hadoop
> versions doesn't guarantee anything about the compatibility of the Hadoop
> jars and JNI code.
>
> Thanks,
> rahul
>
> On Thu, Mar 29, 2018 at 1:52 PM, Andrew Purtell <[email protected]> wrote:
>
> > FWIW, I build Hadoop and HBase binary tarballs. As a post-build step, I
> > extract lib/native/ from the Hadoop tarball, extract the HBase tarball,
> > and copy Hadoop's lib/native/ to HBase's lib/native/<platform>, e.g.
> >
> > cp -a hadoop-2.7.5/lib/native hbase-1.4.2/lib/native/Linux-amd64-64
> >
> > (you can learn your platform by untarring HBase and running ./bin/hbase
> > org.apache.hadoop.util.PlatformName)
> >
> > Then I tar up the modified HBase distribution with the native bits in
> > place and send the tarball on through to its destination.
> >
> > This is little different from our current advice to replace, prior to
> > production deploy, the Hadoop jars packaged in our convenience binary
> > distribution with the actual version of Hadoop you are running in
> > production.
> > On Tue, Mar 27, 2018 at 2:37 PM, rahul gidwani <[email protected]> wrote:
> >
> > > It seems like the recommended approach is to do this:
> > >
> > > Build your HBase against a particular version of Hadoop with Maven,
> > > using -Dhadoop-two.version=<some_hadoop_version>
> > >
> > > Then set HBASE_LIBRARY_PATH=$HADOOP_HOME/lib/native/Linux-amd64-64
> > >
> > > You could run into problems if you upgrade your Hadoop server
> > > binaries without updating your HBase client Hadoop dependencies: if
> > > there are any JNI changes in the Hadoop code, the Hadoop jars and the
> > > JNI code become incompatible.
> > >
> > > Or do you compile against whatever version, but ensure that whatever
> > > Hadoop client dependencies HBase picks up at runtime come from the
> > > $HADOOP_HOME directory?
> > >
> > > Is there a recommended approach? Because although there is no formal
> > > API guarantee between the jars and the JNI code, they are tightly
> > > coupled.
> > >
> > > Thanks
> > >
> > > rahul

-- 
Best regards,
Andrew

Words like orphans lost among the crosstalk, meaning torn from truth's
decrepit hands
   - A23, Crosstalk
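[Editor's note] Andrew's post-build repackaging step above can be sketched as a small shell script. This is a hedged illustration, not a verified build recipe: the version numbers (hadoop-2.7.5, hbase-1.4.2), the Linux-amd64-64 platform string, and the dummy libhadoop.so are placeholders, and the `mkdir`/`touch` lines merely simulate already-extracted tarballs so the sketch is self-contained. In a real run you would extract the actual tarballs and obtain the platform string from `./bin/hbase org.apache.hadoop.util.PlatformName`.

```shell
set -e

HADOOP_DIR=hadoop-2.7.5   # extracted Hadoop binary tarball (placeholder version)
HBASE_DIR=hbase-1.4.2     # extracted HBase binary tarball (placeholder version)

# Simulate the extracted tarball layout so this sketch runs standalone.
mkdir -p "$HADOOP_DIR/lib/native" "$HBASE_DIR/lib/native"
touch "$HADOOP_DIR/lib/native/libhadoop.so"   # stand-in for the real native libs

# Platform string HBase expects; in practice, discover it with:
#   PLATFORM=$(./bin/hbase org.apache.hadoop.util.PlatformName)
PLATFORM=Linux-amd64-64

# Copy Hadoop's native libraries into HBase's platform-specific directory.
cp -a "$HADOOP_DIR/lib/native" "$HBASE_DIR/lib/native/$PLATFORM"

# Re-tar the modified HBase distribution with the native bits in place.
tar czf "${HBASE_DIR}-with-native.tar.gz" "$HBASE_DIR"
```

After this, the repackaged tarball ships with Hadoop's native libraries already in the location HBase searches, so the deployed cluster does not depend on a separately installed Hadoop for its native code.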
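[Editor's note] For contrast, rahul's build-plus-runtime-path approach from the original message might look like the commands below. This is a sketch of the commands as described in the thread, not a tested recipe: the Hadoop version, the $HADOOP_HOME native-lib layout, and the Linux-amd64-64 platform suffix are all assumptions that depend on your installation.

```shell
# Build HBase against the specific Hadoop version you run in production,
# using the -Dhadoop-two.version property mentioned in the thread.
mvn clean install -DskipTests -Dhadoop-two.version=2.7.5

# At runtime, point HBase at the native libraries from the live Hadoop
# install, so the JNI code always matches the deployed Hadoop binaries.
export HBASE_LIBRARY_PATH="$HADOOP_HOME/lib/native/Linux-amd64-64"
```

The trade-off the thread identifies: this keeps jars and JNI code in sync only if the build-time Hadoop version and the $HADOOP_HOME install are upgraded together, whereas Andrew's repackaging approach bundles matching jars and native libs into a single artifact.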
