Re: Problem starting region server with Hbase version hbase-2.0.0

2018-06-08 Thread Josh Elser
You shouldn't be putting the phoenix-client.jar on the HBase server 
classpath.


There is a separate phoenix-server.jar which is specifically built 
to be included in HBase (to avoid issues such as these).


Please remove all phoenix-client jars and provide the 
phoenix-5.0.0-server jar instead.
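
A rough sketch of that swap (the paths and jar names here are assumptions; the whole thing is simulated with dummy files in temp directories so it can be run safely rather than against a live $HBASE_HOME):

```shell
# Stand-in for $HBASE_HOME/lib with the offending client jar (dummy file, not a real jar).
HBASE_HOME=$(mktemp -d)
mkdir -p "$HBASE_HOME/lib"
touch "$HBASE_HOME/lib/phoenix-5.0.0-alpha-HBase-2.0-client.jar"

# Hypothetical location of the server jar from the Phoenix binary distribution.
PHOENIX_DIST=$(mktemp -d)
touch "$PHOENIX_DIST/phoenix-5.0.0-alpha-HBase-2.0-server.jar"

# Move any Phoenix client jars out of the HBase server classpath...
mkdir -p "$HBASE_HOME/lib.removed"
mv "$HBASE_HOME"/lib/phoenix-*-client.jar "$HBASE_HOME/lib.removed/"

# ...and put the server jar in their place.
cp "$PHOENIX_DIST"/phoenix-*-server.jar "$HBASE_HOME/lib/"

ls "$HBASE_HOME/lib"
```

On a real cluster the server jar comes from the Phoenix binary distribution, and the region servers need a restart after the swap.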


On 6/7/18 5:06 PM, Mich Talebzadeh wrote:

Thanks.

Under $HBASE_HOME/lib for version 2 I swapped the Phoenix client jar as
below (the 5.0.0 jar renamed aside, the 4.8.1 jar put in its place):

phoenix-5.0.0-alpha-HBase-2.0-client.jar_ori
phoenix-4.8.1-HBase-1.2-client.jar

I then started HBase 2, which worked fine.

For HBase clients, i.e. the HBase connections from edge nodes etc., I will
keep using HBase 1.2.6, which is the stable version and connects
successfully to HBase 2. This appears to be a working solution for now.

Regards

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 7 June 2018 at 21:03, Sean Busbey  wrote:


Your current problem is caused by this phoenix jar:



hduser@rhes75: /data6/hduser/hbase-2.0.0> find ./ -name '*.jar' -print
-exec jar tf {} \; | grep -E "\.jar$|StreamCapabilities" | grep -B 1
StreamCapabilities
./lib/phoenix-5.0.0-alpha-HBase-2.0-client.jar
org/apache/hadoop/hbase/util/CommonFSUtils$StreamCapabilities.class
org/apache/hadoop/fs/StreamCapabilities.class
org/apache/hadoop/fs/StreamCapabilities$StreamCapability.class


I don't know what version of Hadoop it's bundling or why, but it's one
that includes the StreamCapabilities interface, so HBase takes that to
mean it can check on capabilities. Since Hadoop 2.7 doesn't claim to
implement any, HBase throws its hands up.

I'd recommend you ask on the phoenix list how to properly install
phoenix such that you don't need to copy the jars into the HBase
installation. Hopefully the jar pointed out here is meant to be client
facing only and not installed into the HBase cluster.


On Thu, Jun 7, 2018 at 2:38 PM, Mich Talebzadeh
 wrote:

Hi,

Under Hbase Home directory I get

hduser@rhes75: /data6/hduser/hbase-2.0.0> find ./ -name '*.jar' -print
-exec jar tf {} \; | grep -E "\.jar$|StreamCapabilities" | grep -B 1
StreamCapabilities
./lib/phoenix-5.0.0-alpha-HBase-2.0-client.jar
org/apache/hadoop/hbase/util/CommonFSUtils$StreamCapabilities.class
org/apache/hadoop/fs/StreamCapabilities.class
org/apache/hadoop/fs/StreamCapabilities$StreamCapability.class
--
./lib/hbase-common-2.0.0.jar
org/apache/hadoop/hbase/util/CommonFSUtils$StreamCapabilities.class

For the Hadoop home directory I get nothing:

hduser@rhes75: /home/hduser/hadoop-2.7.3> find ./ -name '*.jar' -print
-exec jar tf {} \; | grep -E "\.jar$|StreamCapabilities" | grep -B 1
StreamCapabilities





On 7 June 2018 at 15:39, Sean Busbey  wrote:


Somehow, HBase is getting confused by your installation and thinks it
can check whether or not the underlying FileSystem implementation
(i.e. HDFS) provides hflush/hsync, even though that ability is not
present in Hadoop 2.7. Usually this means there's a mix of Hadoop
versions on the classpath. While you do have both Hadoop 2.7.3 and
2.7.4, that mix shouldn't cause this kind of failure[1].

Please run this command in both your HBase and Hadoop installation
directories and copy/paste the output:

find . -name '*.jar' -print -exec jar tf {} \; | grep -E
"\.jar$|StreamCapabilities" | grep -B 1 StreamCapabilities



[1]: As an aside, you should follow the guidance in our reference
guide from the section "Replace the Hadoop Bundled With HBase!" in the
Hadoop chapter: http://hbase.apache.org/book.html#hadoop
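
The guidance referenced in [1] amounts to removing the Hadoop jars that ship inside $HBASE_HOME/lib and copying in the jars from the cluster's actual Hadoop installation. A rough sketch of the idea (simulated with dummy files in temp directories; real jar names, versions, and the share/hadoop subdirectory layout will differ, and test/source jars should be skipped):

```shell
# Stand-ins for $HBASE_HOME and $HADOOP_HOME (dummy files, not real jars).
HBASE_HOME=$(mktemp -d)
HADOOP_HOME=$(mktemp -d)
mkdir -p "$HBASE_HOME/lib" "$HADOOP_HOME/share/hadoop/common"
touch "$HBASE_HOME/lib/hadoop-common-2.7.4.jar"                   # bundled with HBase
touch "$HADOOP_HOME/share/hadoop/common/hadoop-common-2.7.3.jar"  # cluster's version

# Remove the Hadoop jars bundled with HBase...
rm "$HBASE_HOME"/lib/hadoop-*.jar

# ...and copy in the jars matching the Hadoop version the cluster actually runs.
find "$HADOOP_HOME/share/hadoop" -name 'hadoop-*.jar' \
  -exec cp {} "$HBASE_HOME/lib/" \;

ls "$HBASE_HOME/lib"
```

The point is that every Hadoop class HBase sees on its classpath then comes from one consistent Hadoop version, which is what the StreamCapabilities check depends on.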

But as I mentioned, I don't think it's the underlying cause in this

case.


On Thu, Jun 7, 2018 at 8:41 AM, Mich Talebzadeh
 wrote:

Hi,

Please find below

*bin/hbase version*
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/data6/hduser/hbase-2.0.0/lib/phoenix-5.0.0-alpha-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in

Re: Odd cell result

2018-06-08 Thread Ted Yu
Which connector do you use for Spark 2.1.2 ?

Is there any code snippet which may reproduce what you experienced ?

Which hbase release are you using ?

Thanks

On Fri, Jun 8, 2018 at 1:50 AM, Kang Minwoo  wrote:

> Hello, Users
>
> I recently ran into an unusual situation:
> a cell result that does not contain its column family.
>
> I thought the cell was the smallest unit in which data could be transferred
> in HBase.
> But a cell that does not contain a column family would mean the cell is not
> the smallest unit.
> Am I wrong?
>
> It occurred in Spark 2.1.2 and did not occur in MapReduce.
> And it has not reappeared since.
>
> Best regards,
> Minwoo Kang
>


Odd cell result

2018-06-08 Thread Kang Minwoo
Hello, Users

I recently ran into an unusual situation:
a cell result that does not contain its column family.

I thought the cell was the smallest unit in which data could be transferred in
HBase.
But a cell that does not contain a column family would mean the cell is not the
smallest unit.
Am I wrong?

It occurred in Spark 2.1.2 and did not occur in MapReduce.
And it has not reappeared since.

Best regards,
Minwoo Kang