The JIRA is here: https://issues.apache.org/jira/browse/PHOENIX-848. There's
not much work required, as 0.98 is almost compatible with 0.96 wrt the APIs
that Phoenix uses.
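
For anyone who hits Mike's NoClassDefFoundError below once a compatible
Phoenix build exists: the fix Jeffrey describes is getting the matching
phoenix-core jar onto the classpath of every region server and master before
restarting. A minimal sketch of that staging step (the paths, jar name, and
restart commands are assumptions that vary by install; this demo copies a
stand-in jar into a scratch directory so it runs anywhere):

```shell
#!/bin/sh
# Stage the Phoenix server-side jar into HBase's lib directory on each
# region server and master, then restart HBase so the WAL codec class
# (e.g. IndexedWALEditCodec) becomes loadable.
# HBASE_LIB and the jar name below are placeholders for demonstration.
HBASE_LIB="${HBASE_LIB:-/tmp/demo-hbase-lib}"
mkdir -p "$HBASE_LIB"

# Stand-in for the real phoenix-core-*.jar from your Phoenix distribution:
: > /tmp/phoenix-core-demo.jar

cp /tmp/phoenix-core-*.jar "$HBASE_LIB"/
ls "$HBASE_LIB"

# On a real cluster, restart the daemons afterwards, e.g.:
#   sudo service hbase-regionserver restart
#   sudo service hbase-master restart
```

If the region server still aborts with NoClassDefFoundError after this, the
jar on the classpath doesn't match the HBase version, which is exactly the
0.96-vs-0.98 mismatch discussed in this thread.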

Thanks,
James

On Thursday, July 3, 2014, anil gupta <[email protected]> wrote:

> Hey Mike,
>
> Cloudera is bumping the version of HBase to 0.98+ in the cdh5.1 release,
> which is due out this month. So if you can wait 3-4 weeks, you should be
> able to use Phoenix out of the box without any porting.
> PS: I fail to understand why HBase had two major releases (0.96 and 0.98) in
> such a short span of time. This is troubling a lot of users.
>
> ~Anil Gupta
>
>
> On Wed, Jul 2, 2014 at 10:12 PM, Jesse Yates <[email protected]> wrote:
>
> > Well, there are a couple of options. Obviously, you could not use
> > phoenix... But that's probably not what you want to hear :)
> >
> > Since this is an open source project, if you have decided you want it in
> > production, then you could port the 4.x line to HBase 0.96. The
> > Hortonworks fellas did the work to port from 0.94 -> 0.98, but they
> > obviously don't have the same interest in supporting a Cloudera
> > distribution (just business, eh?). It's probably not a ton of work since
> > 0.96 and 0.98 are pretty close, and lots of people have been asking for
> > it, so you'd be the hero of the hour.
> >
> > I've heard talk that James might spend the time to do the port, but I
> > can't say when/if that will actually happen, especially as Salesforce is
> > moving towards 0.98.
> >
> > You could also convince the Cloudera fellas to do the work, since quite a
> > few of their customers have been interested in using Phoenix. But they
> > have other products to push in the SQL line, so maybe not as much luck
> > there as one would hope... Can't say, I don't work there, but that's just
> > how it seems sometimes. I also hear that if you pay them lots of money
> > they will support almost any setup you've got :)
> >
> > As I mentioned above, Hortonworks did port the code to 0.98, so they do
> > have a distribution that supports Phoenix. That's another option, but I
> > understand the vendor lock-in that can occur (see the Hortonworks fellas
> > for their pitch).
> >
> > So, in summary, your options, as I see them, are:
> > - don't use phoenix
> > - port it to 0.96 yourself
> > - hope someone else ports it
> > - pay someone else to port it
> > - switch distros and use 0.98
> >
> > James might be able to shed some light on whether he is going to do the
> > work (but no pressure, eh boss?); otherwise, that seems like it to me.
> >
> > - j
> >  On Jul 2, 2014 9:37 PM, "Mike Friedman" <[email protected]> wrote:
> >
> > > James,
> > >
> > > CDH 5 is the current version of CDH. Cloudera doesn't allow upgrading to
> > > HBase 0.98.1 on a CDH licensed cluster while keeping a supported
> > > configuration. This makes it hard to put Phoenix into use in a
> > > production environment. Please advise.
> > >
> > > Thanks.
> > >
> > >
> > > Mike
> > >
> > > > On Jul 2, 2014, at 12:57 PM, "James Taylor" <[email protected]> wrote:
> > > >
> > > > Phoenix doesn't currently support HBase 0.96; it requires 0.98.1+.
> > > >
> > > > Thanks,
> > > > James
> > > >
> > > >> On Wednesday, July 2, 2014, Jeffrey Zhong <[email protected]> wrote:
> > > >>
> > > >>
> > > >> Have you put phoenix-core-*.jar into your hbase region server &
> > > >> master classpath and restarted your region server & master?
> > > >>
> > > >> -Jeffrey
> > > >>
> > > >> On 7/2/14 8:05 AM, "Mike Friedman" <[email protected]> wrote:
> > > >>
> > > >>> Hi,
> > > >>>
> > > >>> We are running HBase version 0.96.1.1-cdh5.0.2 from CDH 5.0.2,
> > > >>> deployed using Cloudera Manager 5.0. Is there a version of Phoenix
> > > >>> that is compatible with this version of HBase? We have tried several
> > > >>> versions of Phoenix, old and new, and when starting HBase we get the
> > > >>> error:
> > > >>> 1:51:25.022 PM INFO org.apache.hadoop.hbase.regionserver.HRegionServer
> > > >>> STOPPED: Failed initialization
> > > >>> 1:51:25.026 PM ERROR org.apache.hadoop.hbase.regionserver.HRegionServer
> > > >>>
> > > >>> Failed init
> > > >>> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/regionserver/wal/WALEditCodec
> > > >>>     at java.lang.ClassLoader.defineClass1(Native Method)
> > > >>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> > > >>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> > > >>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> > > >>>     at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> > > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> > > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> > > >>>     at java.security.AccessController.doPrivileged(Native Method)
> > > >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> > > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> > > >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> > > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> > > >>>     at java.lang.Class.forName0(Native Method)
> > > >>>     at java.lang.Class.forName(Class.java:190)
> > > >>>     at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:31)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.WALCellCodec.create(WALCellCodec.java:79)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.ProtobufLogWriter.init(ProtobufLogWriter.java:72)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.HLogFactory.createWriter(HLogFactory.java:194)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.HLogFactory.createWALWriter(HLogFactory.java:177)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.createWriterInstance(FSHLog.java:588)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.rollWriter(FSHLog.java:519)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.rollWriter(FSHLog.java:476)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.<init>(FSHLog.java:383)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.<init>(FSHLog.java:295)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.HLogFactory.createHLog(HLogFactory.java:56)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.instantiateHLog(HRegionServer.java:1443)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.setupWALAndReplication(HRegionServer.java:1422)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.handleReportForDutyResponse(HRegionServer.java:1184)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:786)
> > > >>>     at java.lang.Thread.run(Thread.java:745)
> > > >>> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.regionserver.wal.WALEditCodec
> > > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> > > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> > > >>>     at java.security.AccessController.doPrivileged(Native Method)
> > > >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> > > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> > > >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> > > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> > > >>>     ... 30 more
> > > >>> 1:51:25.052 PM FATAL org.apache.hadoop.hbase.regionserver.HRegionServer
> > > >>>
> > > >>> ABORTING region server blvdevhdp05.ds-iq.corp,60020,1403902281899:
> > > >>> Unhandled: Region server startup failed
> > > >>> java.io.IOException: Region server startup failed
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:2644)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.handleReportForDutyResponse(HRegionServer.java:1199)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:786)
> > > >>>     at java.lang.Thread.run(Thread.java:745)
> > > >>> Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/regionserver/wal/WALEditCodec
> > > >>>     at java.lang.ClassLoader.defineClass1(Native Method)
> > > >>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> > > >>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> > > >>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> > > >>>     at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> > > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> > > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> > > >>>     at java.security.AccessController.doPrivileged(Native Method)
> > > >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> > > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> > > >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> > > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> > > >>>     at java.lang.Class.forName0(Native Method)
> > > >>>     at java.lang.Class.forName(Class.java:190)
> > > >>>     at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:31)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.WALCellCodec.create(WALCellCodec.java:79)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.ProtobufLogWriter.init(ProtobufLogWriter.java:72)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.HLogFactory.createWriter(HLogFactory.java:194)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.HLogFactory.createWALWriter(HLogFactory.java:177)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.createWriterInstance(FSHLog.java:588)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.rollWriter(FSHLog.java:519)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.rollWriter(FSHLog.java:476)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.<init>(FSHLog.java:383)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.FSHLog.<init>(FSHLog.java:295)
> > > >>>     at org.apache.hadoop.hbase.regionserver.wal.HLogFactory.createHLog(HLogFactory.java:56)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.instantiateHLog(HRegionServer.java:1443)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.setupWALAndReplication(HRegionServer.java:1422)
> > > >>>     at org.apache.hadoop.hbase.regionserver.HRegionServer.handleReportForDutyResponse(HRegionServer.java:1184)
> > > >>>     ... 2 more
> > > >>> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.regionserver.wal.WALEditCodec
> > > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> > > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> > > >>>     at java.security.AccessController.doPrivileged(Native Method)
> > > >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> > > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> > > >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> > > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> > > >>>     ... 30 more
> > > >>> 1:51:25.056 PM FATAL org.apache.hadoop.hbase.regionserver.HRegionServer
> > > >>>
> > > >>> RegionServer abort: loaded coprocessors are: []
> > > >>> 1:51:25.098 PM INFO org.apache.hadoop.hbase.regionserver.HRegionServer
> > > >>> STOPPED: Unhandled: Region server startup failed
> > > >>>
> > > >>> We are using the following config in the hbase-site.xml file:
> > > >>>
> > > >>> <property>
> > > >>>   <name>hbase.regionserver.wal.codec</name>
> > > >>>   <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
> > > >>> </property>
> > > >>>
> > > >>>
> > > >>>
> > > >>> Thanks.
> > > >>>
> > > >>> Mike
> > > >>
> > > >>
> > > >>
> > > >>
> > >
> >
>
>
>
> --
> Thanks & Regards,
> Anil Gupta
>
