Re: Clarification on Spark-Hadoop Configuration

2015-10-01 Thread Sabarish Sasidharan
You can point Spark to your custom HADOOP_CONF_DIR in your spark-env.sh.
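
For example, a minimal spark-env.sh entry might look like this (the directory path below is hypothetical — use wherever you keep your custom core-site.xml and hdfs-site.xml):

```shell
# In conf/spark-env.sh: point Spark at a directory containing your
# application-specific core-site.xml and hdfs-site.xml, without touching
# the cluster-wide Hadoop configuration.
export HADOOP_CONF_DIR=/opt/my-app/hadoop-conf   # hypothetical path
```

Spark picks this up at launch and loads the Hadoop configuration from that directory instead of the system default.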

Regards
Sab
On 01-Oct-2015 5:22 pm, "Vinoth Sankar"  wrote:

> Hi,
>
> I'm new to Spark. For my application I need to override the Hadoop
> configuration (I can't change the configuration in Hadoop itself, as that
> might affect my regular HDFS) so that the Namenode IPs get resolved
> automatically. What are the ways to do this? I tried setting
> "spark.hadoop.dfs.ha.namenodes.nn",
> "spark.hadoop.dfs.namenode.rpc-address.nn",
> "spark.hadoop.dfs.namenode.http-address.nn" and other core-site and
> hdfs-site properties on the SparkConf object, but I still get an
> UnknownHostException.
>
> Regards
> Vinoth Sankar
>


Clarification on Spark-Hadoop Configuration

2015-10-01 Thread Vinoth Sankar
Hi,

I'm new to Spark. For my application I need to override the Hadoop
configuration (I can't change the configuration in Hadoop itself, as that
might affect my regular HDFS) so that the Namenode IPs get resolved
automatically. What are the ways to do this? I tried setting
"spark.hadoop.dfs.ha.namenodes.nn",
"spark.hadoop.dfs.namenode.rpc-address.nn",
"spark.hadoop.dfs.namenode.http-address.nn" and other core-site and
hdfs-site properties on the SparkConf object, but I still get an
UnknownHostException.
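
For reference, HA namenode properties can also be passed per-job via spark-submit, which avoids touching the cluster configuration. This sketch assumes an HA nameservice called "nn" (matching the property names quoted above); all hostnames, ports, and the application jar are placeholders:

```shell
# Hypothetical spark-submit invocation overriding HDFS HA settings for a
# nameservice "nn" with two namenodes. Any Hadoop property prefixed with
# "spark.hadoop." is forwarded into the job's Hadoop Configuration.
# Replace hostnames, ports, class, and jar with your own values.
spark-submit \
  --conf spark.hadoop.dfs.nameservices=nn \
  --conf spark.hadoop.dfs.ha.namenodes.nn=nn1,nn2 \
  --conf spark.hadoop.dfs.namenode.rpc-address.nn.nn1=namenode1.example.com:8020 \
  --conf spark.hadoop.dfs.namenode.rpc-address.nn.nn2=namenode2.example.com:8020 \
  --conf spark.hadoop.dfs.client.failover.proxy.provider.nn=org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider \
  --class com.example.MyApp \
  my-app.jar
```

Note that an UnknownHostException on the nameservice logical name typically means dfs.nameservices or the failover proxy provider property was not set, so the HDFS client cannot resolve the logical name to the individual namenode addresses.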

Regards
Vinoth Sankar