You should put hive-site.xml in SPARK_CONF_DIR; the "cannot find file" error
is due to a Spark bug:
https://issues.apache.org/jira/browse/SPARK-18160
https://issues.cloudera.org/browse/LIVY-298
I have one workaround for you:
install Spark on all the nodes and put hive-site.xml in SPARK_CONF_DIR on each of them.
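A minimal sketch of that workaround, with placeholder paths for demonstration (on a real cluster substitute your actual SPARK_CONF_DIR, e.g. /etc/spark/conf, and your own node names):

```shell
# Placeholder paths for demonstration; override via environment variables
SPARK_CONF_DIR=${SPARK_CONF_DIR:-./spark-conf-demo}
HIVE_SITE=${HIVE_SITE:-./hive-site.xml}

# Create a stand-in hive-site.xml if one is not present (demo only)
mkdir -p "$SPARK_CONF_DIR"
[ -f "$HIVE_SITE" ] || echo '<configuration/>' > "$HIVE_SITE"

# Put hive-site.xml where Spark looks for its configuration
cp "$HIVE_SITE" "$SPARK_CONF_DIR/hive-site.xml"

# Repeat on every node of the cluster, e.g. (hypothetical hosts):
#   for node in node1 node2; do scp "$HIVE_SITE" "$node:$SPARK_CONF_DIR/"; done
echo "copied to $SPARK_CONF_DIR/hive-site.xml"
```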
I want to know if this is possible. It works great for a single user, but in a
multi-user environment we need more granular control over who can do what.
The readers permission is not useful, because such a user cannot execute a
paragraph or even change the display type.
Please share your experience with how you are handling this.
Hi All,
I have the following code.
val ds = sparkSession.readStream   // in Scala, readStream takes no parentheses
  .format("kafka")
  .option("kafka.bootstrap.servers", bootstrapServers)  // removed extra ")"
  .option("subscribe", topicName)
  .option("checkpointLocation", hdfsCheckPointDir)
  .load()
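One thing to note: in Structured Streaming, checkpointLocation takes effect as an option on the query (writeStream side), not on the Kafka source. A sketch of the full pipeline, assuming the spark-sql-kafka-0-10 package is on the classpath; the broker, topic, and checkpoint values below are placeholder assumptions standing in for your bootstrapServers, topicName, and hdfsCheckPointDir:

```scala
import org.apache.spark.sql.SparkSession

val sparkSession = SparkSession.builder()
  .appName("kafka-stream-sketch")
  .getOrCreate()

val bootstrapServers = "host1:9092"                     // assumption: your Kafka brokers
val topicName = "my-topic"                              // assumption: your topic
val hdfsCheckPointDir = "hdfs:///tmp/checkpoints/demo"  // assumption: your checkpoint dir

val ds = sparkSession.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", bootstrapServers)
  .option("subscribe", topicName)
  .load()

// checkpointLocation belongs on the sink/query, where offsets are tracked
val query = ds.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
  .writeStream
  .format("console")
  .option("checkpointLocation", hdfsCheckPointDir)
  .start()

query.awaitTermination()
```

If the option is left on readStream, it is silently ignored and the query will complain that no checkpoint location is set (unless spark.sql.streaming.checkpointLocation provides a default).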