Thanks, it worked!
On Tue, May 24, 2016 at 1:14 AM, Marcelo Vanzin wrote:
> On Mon, May 23, 2016 at 4:41 AM, Chandraprakash Bhagtani wrote:
> > I am passing hive-site.xml through --files option.
>
> You need hive-site.xml in Spark's classpath too. Easiest way is to
> copy / symlink hive-site.xml in your Spark's conf directory.
On Mon, May 23, 2016 at 4:41 AM, Chandraprakash Bhagtani wrote:
> I am passing hive-site.xml through --files option.
You need hive-site.xml in Spark's classpath too. The easiest way is to
copy / symlink hive-site.xml into your Spark conf directory.
--
Marcelo
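Marcelo's suggestion can be sketched as follows. The real paths are typically something like /etc/hive/conf and $SPARK_HOME/conf (those names are assumptions that vary by installation); the demo below uses a throwaway directory so it can run anywhere:

```shell
# Sketch of putting hive-site.xml on Spark's classpath by symlinking it
# into Spark's conf directory. Directory names here are placeholders --
# substitute your actual Hive conf dir and $SPARK_HOME/conf.
demo=$(mktemp -d)
mkdir -p "$demo/hive-conf" "$demo/spark/conf"
printf '<configuration/>\n' > "$demo/hive-conf/hive-site.xml"

# The actual step: symlink (or copy) hive-site.xml into Spark's conf dir
# so HiveContext picks it up from the classpath.
ln -sfn "$demo/hive-conf/hive-site.xml" "$demo/spark/conf/hive-site.xml"

# Verify the link resolves to the Hive copy.
readlink "$demo/spark/conf/hive-site.xml"
```

Copying instead of symlinking works equally well; the point is that --files ships the file to executors, but the driver-side classpath also needs to see it.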
--
Thanks, Doug.
I already have all four configs (mentioned by you) in my hive-site.xml. Do
I need to create a hive-site.xml in Spark's conf directory (it is not there
by default in 1.6.1)? Please suggest.
On Mon, May 23, 2016 at 9:53 PM, Doug Balog wrote:
> I have a custom hive-site.xml for Spark in Spark's conf directory.
I have a custom hive-site.xml for Spark in Spark's conf directory.
These properties are the minimal ones that you need for Spark, I believe.
hive.metastore.kerberos.principal = copy from your hive-site.xml, e.g.
"hive/_h...@foo.com"
hive.metastore.uris = copy from your hive-site.xml, e.g.
thr
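Assembled into a file, Doug's minimal configuration would look roughly like the sketch below. Only the two properties visible in this thread are shown (his message was truncated, and he mentions there were four in total); both values are placeholders and should be copied from your cluster's own hive-site.xml:

```xml
<?xml version="1.0"?>
<!-- Minimal hive-site.xml sketch for Spark's conf directory.
     Both values are placeholders: copy the real principal and thrift URI
     from your cluster's hive-site.xml. -->
<configuration>
  <property>
    <name>hive.metastore.kerberos.principal</name>
    <value>hive/_HOST@EXAMPLE.COM</value> <!-- placeholder -->
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value> <!-- placeholder; 9083 is the common default port -->
  </property>
</configuration>
```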
Can you describe the Kerberos issues in more detail?
Which release of YARN are you using?
Cheers
On Mon, May 23, 2016 at 4:41 AM, Chandraprakash Bhagtani <cpbhagt...@gmail.com> wrote:
> Hi,
>
> My Spark job is failing with Kerberos issues while creating a Hive context
> in yarn-cluster mode. However it is running with yarn-client mode.
Hi,
My Spark job is failing with Kerberos issues while creating a Hive context
in yarn-cluster mode. However, it runs fine in yarn-client mode. My Spark
version is 1.6.1.
I am passing hive-site.xml through --files option.
I tried searching online and found that the same issue is fixed with the
fo