Re: Phoenix + Spark + JDBC + Kerberos?

2016-09-19 Thread Jean-Marc Spaggiari
Thanks for the pointer to PHOENIX-3189 Josh. I don't think we are facing
that.

We will try to activate the debug mode on Kerberos and retry. Good idea!

I will keep this thread updated if we find something...

JMS

2016-09-15 17:39 GMT-04:00 Josh Elser :

> Cool, thanks for the info, JM. Thinking out loud..
>
> * Could be missing/inaccurate /etc/krb5.conf on the nodes running spark
> tasks
> * Could try setting the Java system property sun.security.krb5.debug=true
> in the Spark executors
> * Could try to set org.apache.hadoop.security=DEBUG in log4j config
>
> Hard to guess at the real issue without knowing more :). If there's any more
> context you can share, I'd be happy to try to help.
>
> (ps. obligatory warning about PHOENIX-3189 if you're using 4.8.0)
>
> Jean-Marc Spaggiari wrote:
>
>> Using the keytab in the JDBC URL. That's the way we use it locally, and we
>> also tried running command-line applications directly from the worker
>> nodes and it works, but inside the Spark executor it doesn't...
>>
>> 2016-09-15 13:07 GMT-04:00 Josh Elser:
>>
>> How do you expect Kerberos authentication for JDBC on Spark to work? Are
>> you using the principal+keytab options in the Phoenix JDBC URL, or is
>> Spark itself obtaining a ticket for you (via some "magic")?
>>
>>
>> Jean-Marc Spaggiari wrote:
>>
>> Hi,
>>
>> I tried to build a small app all under Kerberos.
>>
>> JDBC to Phoenix works
>> Client to HBase works
>> Client (puts) on Spark to HBase works.
>> But JDBC on Spark to HBase fails with a message like "GSSException: No
>> valid credentials provided (Mechanism level: Failed to find any
>> Kerberos tgt)]"
>>
>> Keytab is accessible on all the nodes.
>>
>> Keytab belongs to the user running the job, and the executors are running
>> under that user name. So this is fine.
>>
>> Any idea of what this might be?
>>
>> Thanks,
>>
>> JMS
>>
>>
>>
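[Editor's note: JMS mentions above that the keytab is carried in the JDBC URL. A minimal sketch of that style follows, using Phoenix's documented secure-cluster URL layout (jdbc:phoenix:<zk quorum>:<zk port>:<zk root node>:<principal>:<keytab path>). All host names, the principal, and the keytab path are hypothetical placeholders, and the actual DriverManager.getConnection call is omitted so the sketch runs without a cluster.]

```java
public class PhoenixKerberosUrl {

    // Assemble a Phoenix JDBC URL that embeds the Kerberos principal and
    // keytab, per Phoenix's documented URL syntax for secure clusters:
    //   jdbc:phoenix:<zk quorum>:<zk port>:<zk root node>:<principal>:<keytab>
    static String phoenixUrl(String quorum, int port, String znode,
                             String principal, String keytab) {
        return "jdbc:phoenix:" + quorum + ":" + port + ":" + znode
                + ":" + principal + ":" + keytab;
    }

    public static void main(String[] args) {
        // Hypothetical quorum, principal, and keytab path for illustration.
        String url = phoenixUrl("zk1.example.com,zk2.example.com", 2181,
                "/hbase-secure", "jms@EXAMPLE.COM",
                "/etc/security/keytabs/jms.keytab");
        System.out.println(url);
        // A real client would then call DriverManager.getConnection(url);
        // that is left out here so the sketch needs no Phoenix dependency.
    }
}
```

Note that for this to work inside Spark executors, the keytab path must resolve on every worker node, which is exactly the situation being debugged in this thread.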

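[Editor's note: Josh's debugging suggestions above (sun.security.krb5.debug in the executors, DEBUG logging for org.apache.hadoop.security) could be applied as sketched below. The helper name is hypothetical; the property and configuration names are the real JDK, Spark, and log4j ones.]

```java
public class KerberosDebugOpts {

    // Build the executor JVM option that turns on the JDK's Kerberos debug
    // output, as Josh suggests. On spark-submit this string would be passed as:
    //   --conf "spark.executor.extraJavaOptions=-Dsun.security.krb5.debug=true"
    static String executorDebugOpts() {
        return "-Dsun.security.krb5.debug=true";
    }

    public static void main(String[] args) {
        System.out.println(executorDebugOpts());
        // For the Hadoop security logging, the corresponding log4j.properties
        // line on the executor classpath would be:
        //   log4j.logger.org.apache.hadoop.security=DEBUG
    }
}
```

With both in place, the executor stderr logs should show the Kerberos ticket negotiation, which usually makes a "Failed to find any Kerberos tgt" failure much easier to localize.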
