Hi Xiaomeng,

I noticed that the key is available in the Hive trunk as well as in the
Apache Hive 1.1.0 release. I realized that HiveServer2 was picking up the
old Hive 0.13.1 jars, which was causing the problem. I have built
1.5.0-incubating-SNAPSHOT myself, the build succeeded, and I am now trying
to integrate it with the latest Apache Hive. As Prasad mentioned to me
yesterday, the latest release of Sentry and Apache Hive should work together.

If you could help, I have a few questions:

1. Can I configure any Apache Hive release, such as 0.13.1 or the latest
Hive trunk, to use the old file-based sentry-provider.ini (I think that was
the approach in Sentry 1.2) instead of the Sentry service in the latest
1.5.0 snapshot?
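
To make question 1 concrete, this is roughly the setup I have in mind. It is
only a sketch of my understanding of the file-based provider, not something I
have verified against 1.5.0; the paths and names below are placeholders, and
I am assuming the sentry.hive.* property names rather than the older
hive.sentry.* ones:

<property>
  <name>sentry.hive.provider.backend</name>
  <value>org.apache.sentry.provider.file.SimpleFileProviderBackend</value>
</property>
<property>
  <name>sentry.hive.provider</name>
  <value>org.apache.sentry.provider.file.LocalGroupResourceAuthorizationProvider</value>
</property>
<property>
  <name>sentry.hive.provider.resource</name>
  <value>file:///etc/sentry/conf/sentry-provider.ini</value>
</property>

with a sentry-provider.ini along these lines:

[users]
vivek = admins

[groups]
admins = admin_role

[roles]
admin_role = server=server1

Is that still a supported path, or does 1.5.0 require the Sentry service?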

2. I see these warnings from the metastore:

15/03/10 09:32:33 WARN conf.HiveConf: HiveConf of name hive.sentry.conf.url
does not exist

15/03/10 09:32:33 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist

These properties are present in hive-site.xml, so why are these warnings appearing?


3. This is the Sentry-related configuration in hive-site.xml:

<property>
  <name>hive.server2.enable.impersonation</name>
  <value>true</value>
  <description>Enable user impersonation for HiveServer2</description>
</property>

<property>
  <name>sentry.hive.provider.backend</name>
  <value>org.apache.sentry.provider.file.SimpleFileProviderBackend</value>
  <description>The privilege provider to be used (either file-based or db-based).</description>
</property>

<property>
  <name>hive.sentry.conf.url</name>
  <value>file:///etc/sentry/conf/sentry-site.xml</value>
</property>

<property>
  <name>hive.metastore.pre.event.listeners</name>
  <value>org.apache.sentry.binding.metastore.MetastoreAuthzBinding</value>
</property>

<property>
  <name>hive.metastore.event.listeners</name>
  <value>org.apache.sentry.binding.metastore.SentryMetastorePostEventListener</value>
</property>

<property>
  <name>hive.server2.session.hook</name>
  <value>org.apache.sentry.binding.hive.HiveAuthzBindingSessionHook</value>
</property>

<property>
  <name>hive.security.authorization.task.factory</name>
  <value>org.apache.sentry.binding.hive.SentryHiveAuthorizationTaskFactoryImpl</value>
</property>

<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>
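
For reference, here is a sketch of what I believe the sentry-site.xml that
hive.sentry.conf.url points to needs to contain for the DB-backed service.
The host, port, and principal below are placeholders, and the property names
are the ones I took from the Sentry service configuration page, so this is
only my understanding, not a verified setup:

<configuration>
  <property>
    <name>sentry.service.client.server.rpc-address</name>
    <!-- placeholder host -->
    <value>sentry-host.example.com</value>
  </property>
  <property>
    <name>sentry.service.client.server.rpc-port</name>
    <value>8038</value>
  </property>
  <property>
    <!-- only relevant if the Sentry service runs with Kerberos -->
    <name>sentry.service.security.mode</name>
    <value>kerberos</value>
  </property>
  <property>
    <!-- placeholder principal -->
    <name>sentry.service.server.principal</name>
    <value>sentry/[email protected]</value>
  </property>
  <property>
    <name>sentry.hive.provider.backend</name>
    <value>org.apache.sentry.provider.db.SimpleDBProviderBackend</value>
  </property>
  <property>
    <name>sentry.hive.server</name>
    <value>server1</value>
  </property>
</configuration>

If any of these keys are wrong for 1.5.0, that may well be where my setup is
going off the rails.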


4. I am getting the following error when running "show databases" (or any
other command) in Beeline:


2015-03-10 09:59:16,173 WARN  [HiveServer2-Handler-Pool: Thread-21]:
conf.HiveAuthzConf (HiveAuthzConf.java:get(206)) - Using the deprecated
config setting hive.sentry.server instead of sentry.hive.server

2015-03-10 09:59:16,174 WARN  [HiveServer2-Handler-Pool: Thread-21]:
conf.HiveAuthzConf (HiveAuthzConf.java:get(206)) - Using the deprecated
config setting hive.sentry.provider instead of sentry.provider

2015-03-10 09:59:16,229 ERROR [HiveServer2-Handler-Pool: Thread-21]:
ql.Driver (SessionState.java:printError(861)) - FAILED:
InvocationTargetException null

java.lang.reflect.InvocationTargetException

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)

        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

        at
org.apache.sentry.binding.hive.authz.HiveAuthzBinding.getAuthProvider(HiveAuthzBinding.java:205)

        at
org.apache.sentry.binding.hive.authz.HiveAuthzBinding.<init>(HiveAuthzBinding.java:87)

        at
org.apache.sentry.binding.hive.authz.HiveAuthzBinding.<init>(HiveAuthzBinding.java:79)

        at
org.apache.sentry.binding.hive.HiveAuthzBindingHook.<init>(HiveAuthzBindingHook.java:97)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)

        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

        at java.lang.Class.newInstance(Class.java:379)

        at
org.apache.hadoop.hive.ql.hooks.HookUtils.getHooks(HookUtils.java:60)

        at org.apache.hadoop.hive.ql.Driver.getHooks(Driver.java:1297)

        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:407)

        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:307)

        at
org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1112)

        at
org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1106)

        at
org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:101)

        at
org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:172)

        at
org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)

        at
org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:379)

        at
org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:366)

        at
org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:271)

        at
org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:415)

        at
org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)

        at
org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)

        at
org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)

        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)

        at
org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:692)

        at
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)

        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)

        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)

        at java.lang.Thread.run(Thread.java:745)

Caused by: java.lang.reflect.UndeclaredThrowableException

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1563)

        at
org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport.open(SentryPolicyServiceClientDefaultImpl.java:104)

        at
org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl.<init>(SentryPolicyServiceClientDefaultImpl.java:156)

        at
org.apache.sentry.service.thrift.SentryServiceClientFactory.create(SentryServiceClientFactory.java:42)

        at
org.apache.sentry.provider.db.SimpleDBProviderBackend.<init>(SimpleDBProviderBackend.java:53)

        at
org.apache.sentry.provider.db.SimpleDBProviderBackend.<init>(SimpleDBProviderBackend.java:49)

        ... 35 more

Caused by: org.apache.thrift.transport.TTransportException: Peer indicated
failure: Problem with callback handler

        at
org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)

        at
org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:307)

        at
org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)

        at
org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport.baseOpen(SentryPolicyServiceClientDefaultImpl.java:120)

        at
org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport.access$000(SentryPolicyServiceClientDefaultImpl.java:79)

        at
org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport$1.run(SentryPolicyServiceClientDefaultImpl.java:106)

        at
org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport$1.run(SentryPolicyServiceClientDefaultImpl.java:104)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:415)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)

        ... 40 more


2015-03-10 09:59:16,229 INFO  [HiveServer2-Handler-Pool: Thread-21]:
log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=compile
start=1425981555715 end=1425981556229 duration=514
from=org.apache.hadoop.hive.ql.Driver>

2015-03-10 09:59:16,229 INFO  [HiveServer2-Handler-Pool: Thread-21]:
log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>




On Tue, Mar 10, 2015 at 3:02 AM, Huang, Xiaomeng <[email protected]>
wrote:

> Hi Vivek,
> Did you set value "
> org.apache.sentry.binding.hive.SentryHiveAuthorizationTaskFactoryImpl " for
> the key "hive.security.authorization.task.factory" in hive conf?
> If not, you should set this conf to use sentry task.
> And what’s your version of sentry?
> As far as I know, there is no release version of sentry to support apache
> hive.
> Current code base 1.5.0-incubating-SNAPSHOT in trunk could support apache
> hive, but you may need to build it by yourself.
>
> Thanks,
> Xiaomeng
>
>
> -----Original Message-----
> From: Vivek Shrivastava [mailto:[email protected]]
> Sent: Tuesday, March 10, 2015 2:52 PM
> To: dev
> Subject: Re: Unable to run latest Sentry build with Hive 1.1.0
>
> Ah.. never mind. I see this is defined in the hive trunk.
>
> On Tue, Mar 10, 2015 at 2:33 AM, Vivek Shrivastava <
> [email protected]
> > wrote:
>
> > Hi,
> >
> > When I run the beeline command I get this error message. Do I have
> > pass the value through command line? if so then what is the value?
> >
> > Thanks,
> >
> > Vivek
> >
> > 2015-03-10 06:18:30,157 ERROR [pool-7-thread-1]:
> > thrift.ProcessFunction
> > (ProcessFunction.java:process(41)) - Internal error processing
> > OpenSession
> >
> > java.lang.NoSuchFieldError: HIVE_AUTHORIZATION_TASK_FACTORY
> >
> >         at
> > org.apache.sentry.binding.hive.HiveAuthzBindingSessionHook.<clinit>(Hi
> > veAuthzBindingSessionHook.java:46)
> >
> >         at java.lang.Class.forName0(Native Method)
> >
> >         at java.lang.Class.forName(Class.java:274)
> >
> >         at
> > org.apache.hadoop.hive.ql.hooks.HookUtils.getHooks(HookUtils.java:59)
> >
> >         at
> > org.apache.hive.service.cli.session.SessionManager.executeSessionHooks
> > (SessionManager.java:223)
> >
> >         at
> > org.apache.hive.service.cli.session.SessionManager.openSession(Session
> > Manager.java:136)
> >
> >         at
> > org.apache.hive.service.cli.CLIService.openSession(CLIService.java:153
> > )
> >
> >         at
> > org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(T
> > hriftCLIService.java:268)
> >
> >         at
> > org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(Thrift
> > CLIService.java:191)
> >
> >         at
> > org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.g
> > etResult(TCLIService.java:1253)
> >
> >         at
> > org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.g
> > etResult(TCLIService.java:1238)
> >
> >         at
> > org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> >
> >         at
> > org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> >
> >         at
> > org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge20S$Server$TUGIAss
> > umingProcessor.process(HadoopThriftAuthBridge20S.java:628)
> >
> >         at
> > org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPo
> > olServer.java:206)
> >
> >         at
> > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.j
> > ava:1145)
> >
> >         at
> > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.
> > java:615)
> >
> >         at java.lang.Thread.run(Thread.java:745)
> >
>
