If you place core-site.xml in $SPARK_HOME/conf, I'm pretty sure Spark
will pick it up, since that directory is on the application classpath
and Hadoop's Configuration loads core-site.xml from the classpath.
(Sounds like you're not running on YARN, which would require
HADOOP_CONF_DIR.)
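
Alternatively, Spark copies any property prefixed with "spark.hadoop."
into the Hadoop Configuration it builds, so you can set Hadoop keys in
spark-defaults.conf or on a SparkConf and skip core-site.xml entirely.
A rough sketch (the fs.s3a.* key names are the standard Hadoop ones;
the env variable names and bucket are placeholders):

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    object S3WithoutCoreSite {
      def main(args: Array[String]): Unit = {
        // "spark.hadoop.*" properties are forwarded into the Hadoop
        // Configuration, so nothing needs to live in core-site.xml.
        val conf = new SparkConf()
          .set("spark.hadoop.fs.s3a.access.key",
               sys.env("AWS_ACCESS_KEY_ID"))
          .set("spark.hadoop.fs.s3a.secret.key",
               sys.env("AWS_SECRET_ACCESS_KEY"))

        val spark = SparkSession.builder().config(conf).getOrCreate()
        spark.read.text("s3a://some-bucket/some-file.txt").show()
        spark.stop()
      }
    }

The same keys work as plain lines in conf/spark-defaults.conf
(e.g. "spark.hadoop.fs.s3a.access.key <value>") if you'd rather keep
them out of code.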

Also this is more of a user@ question.

On Fri, Feb 10, 2017 at 1:35 PM, Sam Elamin <hussam.ela...@gmail.com> wrote:
> Hi All,
>
>
> Really newbie question here, folks: I have properties like my AWS access
> and secret keys in Hadoop's core-site.xml, among other properties, but
> that's the only reason I have Hadoop installed, which seems a bit of
> overkill.
>
> Is there an equivalent of core-site.xml for Spark so I don't have to
> reference HADOOP_CONF_DIR in my spark-env.sh?
>
> I know I can export environment variables for the AWS credentials, but
> what about other properties that my application might want to use?
>
> Regards
> Sam



-- 
Marcelo

