Hi All,

Really newbie question here, folks: I have properties like my AWS access and
secret keys in core-site.xml in Hadoop, among other properties, but that's
the only reason I have Hadoop installed, which seems a bit of overkill.
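
For context, this is the kind of entry I mean (I'm using the s3a property
names as an example; the exact names depend on which S3 connector you use,
and the values are elided):

    <!-- core-site.xml: illustrative entries only -->
    <property>
      <name>fs.s3a.access.key</name>
      <value>...</value>
    </property>
    <property>
      <name>fs.s3a.secret.key</name>
      <value>...</value>
    </property>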

Is there an equivalent of core-site.xml for Spark, so I don't have to
reference HADOOP_CONF_DIR in my spark-env.sh?
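
Ideally I'd like to drop the Hadoop install and put something like this in
conf/spark-defaults.conf. I gather Spark copies any property prefixed with
spark.hadoop. into the Hadoop Configuration it creates, but I may be
misreading the docs:

    # Sketch of what I'm hoping works; assumes the spark.hadoop. prefix
    # is forwarded into the Hadoop Configuration
    spark.hadoop.fs.s3a.access.key    ...
    spark.hadoop.fs.s3a.secret.key    ...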

I know I can export environment variables for the AWS credentials, but what
about the other properties my application might want to use?
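
For reference, the variables I mean (which as far as I know the S3
connectors will pick up):

    # Standard AWS credential variables, values elided
    export AWS_ACCESS_KEY_ID=...
    export AWS_SECRET_ACCESS_KEY=...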

Regards
Sam
