I am trying to write a Spark program that reads data from HBase and stores it in a DataFrame.
I can run it perfectly with hbase-site.xml in the $SPARK_HOME/conf
folder, but I am facing a few issues here.
The first issue is passing the hbase-site.xml location with the --files
parameter when submitting in client mode (it works in cluster mode).
When I remove hbase-site.xml from $SPARK_HOME/conf and instead pass it
with --files over YARN, as in the submit command below, I keep getting
the exception that follows it, which I think means the job is not picking
up the ZooKeeper configuration from hbase-site.xml and is falling back to
localhost:2181:
spark-submit \
--master yarn \
--deploy-mode client \
--files /home/siddesh/hbase-site.xml \
--class com.orzota.rs.json.HbaseConnector \
--packages com.hortonworks:shc:1.0.0-2.0-s_2.11 \
--repositories http://repo.hortonworks.com/content/groups/public/ \
18/02/22 01:43:09 INFO ClientCnxn: Opening socket connection to server
localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL
18/02/22 01:43:09 WARN ClientCnxn: Session 0x0 for server null, unexpected
error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
However, the same command works fine when I run it in cluster mode.
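My guess is that in client mode the locally running driver never gets hbase-site.xml
onto its classpath (--files appears to distribute the file only to the YARN
containers), so the HBase client falls back to the default localhost:2181. If that
is right, I assume a variation like the following, which puts the directory holding
hbase-site.xml on the driver classpath via the standard --driver-class-path option,
might be needed, but I would like to understand why --files alone is not enough:

spark-submit \
--master yarn \
--deploy-mode client \
--files /home/siddesh/hbase-site.xml \
--driver-class-path /home/siddesh \
--class com.orzota.rs.json.HbaseConnector \
--packages com.hortonworks:shc:1.0.0-2.0-s_2.11 \
--repositories http://repo.hortonworks.com/content/groups/public/ \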
The second issue is passing the HBase configuration details through the
SparkSession, which I cannot get to work in either client or cluster mode.
Instead of passing the entire hbase-site.xml, I am trying to add the
configuration directly in the code, as configuration parameters on the
SparkSession, and then read the table into a DataFrame, roughly as in the
sketch below.
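This is a minimal, self-contained sketch of that attempt; the ZooKeeper quorum
and the table, column family, and column names in the SHC catalog are
placeholders, not my real schema:

import org.apache.spark.sql.SparkSession

object HbaseConnector {
  def main(args: Array[String]): Unit = {
    // ZooKeeper settings that normally come from hbase-site.xml;
    // the quorum host here is a placeholder
    val spark = SparkSession.builder()
      .appName("HbaseConnector")
      .config("hbase.zookeeper.quorum", "zk-host.example.com")
      .config("hbase.zookeeper.property.clientPort", "2181")
      .getOrCreate()

    // SHC catalog describing the HBase table layout
    // (namespace, table, column family, and columns are placeholders)
    val catalog =
      """{
        |"table":{"namespace":"default", "name":"json_table"},
        |"rowkey":"key",
        |"columns":{
        |"id":{"cf":"rowkey", "col":"key", "type":"string"},
        |"json":{"cf":"data", "col":"json", "type":"string"}
        |}
        |}""".stripMargin

    // Read the HBase table through the SHC data source
    val json_df = spark.read
      .options(Map("catalog" -> catalog))
      .format("org.apache.spark.sql.execution.datasources.hbase")
      .load()

    json_df.show()
  }
}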
This is not working in cluster mode either.
Can anyone help me with a solution, or explain why this is happening? Are
there any workarounds?