Hello,

I am trying to have a setup that is as simple as possible for demoing the Hue
SQL Editor with Phoenix.

https://phoenix.apache.org/kafka.html

I looked at demoing with live data from a Kafka topic being ingested live
into HBase. I am trying to run the PhoenixConsumerTool inside this Docker
image, which is simple:

https://hub.docker.com/r/boostport/hbase-phoenix-all-in-one
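
(For context, I start the container roughly like this; the exact invocation
is from memory and the port mapping is my assumption, 8765 being the Phoenix
Query Server port that the Hue SQL Editor connects to:)

# Assumed invocation, not copied verbatim from my shell history:
# expose the Phoenix Query Server (8765) so the Hue SQL Editor can reach it.
docker run -d --name hbase-phoenix -p 8765:8765 boostport/hbase-phoenix-all-in-one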

HBase is in non-distributed mode:

/opt/hbase/bin/hbase --config "$HBASE_CONF_DIR" \
  org.apache.hadoop.hbase.util.HBaseConfTool hbase.cluster.distributed | head -n 1
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/opt/hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/opt/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
false

But the main issues are:

   1. I can't find a ZooKeeper address
   2. All the kafka-consumer-json.properties config properties seem ignored
   by PhoenixConsumerTool


2020-11-14 15:50:37,863 INFO zookeeper.ClientCnxn: Socket error occurred:
localhost/127.0.0.1:2181: Connection refused
2020-11-14 15:50:37,964 WARN zookeeper.ReadOnlyZKClient: 0x1ecee32c to
localhost:2181 failed for get of /hbase/master, code = CONNECTIONLOSS,
retries = 10
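
The consumer seems to default to localhost:2181. I assume the same
HBaseConfTool command as above could show what the ZooKeeper quorum and
client port actually resolve to inside the container (just a sketch, I have
not verified it there):

# Sketch: print the resolved ZooKeeper settings from the HBase configuration.
/opt/hbase/bin/hbase --config "$HBASE_CONF_DIR" \
  org.apache.hadoop.hbase.util.HBaseConfTool hbase.zookeeper.quorum
/opt/hbase/bin/hbase --config "$HBASE_CONF_DIR" \
  org.apache.hadoop.hbase.util.HBaseConfTool hbase.zookeeper.property.clientPort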


2020-11-14 15:49:19,227 WARN consumer.ConsumerConfig: The configuration
zookeeperQuorum = 8370ebd5fde0:2181 was supplied but isn't a known config.
2020-11-14 15:49:19,227 WARN consumer.ConsumerConfig: The configuration
topics = topic1,topic2 was supplied but isn't a known config.
2020-11-14 15:49:19,227 WARN consumer.ConsumerConfig: The configuration
serializer.rowkeyType = uuid was supplied but isn't a known config.
2020-11-14 15:49:19,228 WARN consumer.ConsumerConfig: The configuration
serializer = json was supplied but isn't a known config.
2020-11-14 15:49:19,228 WARN consumer.ConsumerConfig: The configuration ddl
= CREATE TABLE IF NOT EXISTS SAMPLE2(uid VARCHAR NOT NULL,c1 VARCHAR,c2
VARCHAR,c3 VARCHAR CONSTRAINT pk PRIMARY KEY(uid)) was supplied but isn't a
known config.
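
For reference, the kafka-consumer-json.properties entries that trigger these
warnings are roughly the following (reconstructed from the log above; the
standard Kafka consumer settings such as bootstrap.servers do not appear in
the warnings, so they are left out here):

serializer=json
serializer.rowkeyType=uuid
zookeeperQuorum=8370ebd5fde0:2181
topics=topic1,topic2
ddl=CREATE TABLE IF NOT EXISTS SAMPLE2(uid VARCHAR NOT NULL,c1 VARCHAR,c2 VARCHAR,c3 VARCHAR CONSTRAINT pk PRIMARY KEY(uid))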

The command I use:
HADOOP_CLASSPATH=$(/opt/hbase/bin/hbase classpath):/opt/hbase/conf \
  hadoop-3.2.1/bin/hadoop jar phoenix-kafka-5.0.0-HBase-2.0-minimal.jar \
  org.apache.phoenix.kafka.consumer.PhoenixConsumerTool \
  -Dfs.defaultFS=file:/// --file kafka-consumer-json.properties

Would you have some tips on how to debug this better?
Are you aware of an even simpler solution for ingesting live data into
HBase?

Thanks!

Romain
