BTW: I am very likely to install Hive 0.12 on Hadoop 0.20.2 clusters. I have been running Hive since version 0.2 and Hadoop since version 0.17.2; after 0.17.2 I moved to 0.20.2. Since then Hadoop has seemingly had tens of releases: 0.21, 0.21-append (dead on arrival), Cloudera this, Cloudera that, the Yahoo Hadoop distribution (dead on arrival), 0.20.203, 0.20.205, 1.0? 2.0? 2.1? None of them has had much shelf life or a very clear upgrade path.
The only things that have remained constant in our environment are Hive and Hadoop 0.20.2. I have been happily just upgrading Hive on these clusters for years now. So, in a nutshell: I'm a long-time committer, I actively support and develop Hive on Hadoop 0.20.2 clusters, and I do not see supporting the shims as complicated or difficult.

On Wed, Sep 18, 2013 at 7:02 PM, Owen O'Malley <omal...@apache.org> wrote:

> On Wed, Sep 18, 2013 at 1:54 PM, Edward Capriolo <edlinuxg...@gmail.com>
> wrote:
>
> > I am not fine with dropping it. I still run it in several places.
>
> The question is not whether you run Hadoop 0.20.2, but whether you are
> likely to install Hive 0.12 on those very old clusters.
>
> > Believe it or not, many people still run 0.20.2. I believe (correct me
> > if I am wrong) Facebook is still running a heavily patched 0.20.2.
>
> It is more accurate to say that Facebook is running a fork of Hadoop where
> the last common point was Hadoop 0.20.1. I haven't heard anyone (other
> than you in this thread) say they are running 0.20.2 in years.
>
> > I could see dropping 0.20.2 if it was a huge burden, but I do not see it
> > that way: it works, it is reliable, and it is a known quantity.
>
> It is a large burden in that we have relatively complicated shims and a
> lack of testing. Unless you are signing up to test every release on
> 0.20.2, we don't have anyone doing the relevant testing.
>
> -- Owen