Hi Dharmin
With the first approach, you will have to read the properties from the file
shipped via --files using:
SparkFiles.get('file.txt')
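For example, a minimal PySpark sketch of this approach (the file name and the key=value format are assumptions; hbase-site.xml itself is XML, so the parsing would differ for that file):

```python
# Sketch, assuming a simple key=value properties file was shipped with
# --files (e.g. spark-submit --files file.txt ...). SparkFiles.get resolves
# the local path of that file on the driver and executors.

def parse_properties(text):
    """Parse key=value lines, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# Inside a running Spark application:
#   from pyspark import SparkFiles
#   with open(SparkFiles.get("file.txt")) as f:
#       props = parse_properties(f.read())
```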
Alternatively, you can copy the file to HDFS, read it with sc.textFile, and
use the properties from it.
If you add files using --files, they get copied to each executor's working
directory.
Could it be that you are missing the HBASE_HOME variable?
Jorge Machado
> On 23 Feb 2018, at 04:55, Dharmin Siddesh J wrote:
>
> I am trying to write a Spark program that reads data from HBase and stores it
> in a DataFrame.
>
> I am able to run it perfectly with hbase-site.xml in the $SPARK_HOME/conf
I am trying to write a Spark program that reads data from HBase and stores
it in a DataFrame.
I am able to run it perfectly with hbase-site.xml in the $SPARK_HOME/conf
folder, but I am facing a few issues here.
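For reference, the kind of submission being described might look like this (the class name, jar name, and paths are placeholders, not taken from the original message):

```shell
# Hypothetical spark-submit invocation distributing hbase-site.xml
# to the executors via --files; all names and paths are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /etc/hbase/conf/hbase-site.xml \
  --class com.example.HBaseToDataFrame \
  hbase-reader.jar
```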
Issue 1
The first issue is passing hbase-site.xml location with the --files
parameter subm