Hi Tim,

Yes, I have read the thread on user impersonation from Tuesday, July 10, 2018 ( https://www.mail-archive.com/user@livy.incubator.apache.org/msg00294.html ).

Yes, I have the Hive context enabled. Please find below the configurations set in livy.conf:
livy.keystore = /path/to/server.jks
livy.keystore.password = password
livy.server.port = 8998
livy.spark.master = yarn
livy.spark.deploy-mode = cluster
livy.impersonation.enabled = true
livy.repl.enable-hive-context = true
livy.server.access-control.enabled = false
livy.server.auth.type = kerberos
livy.server.auth.kerberos.keytab = /path/to/http/keytab
livy.server.auth.kerberos.principal = HTTP/<principal>
livy.server.launch.kerberos.keytab = /path/to/my_proxy_user/keytab
livy.server.launch.kerberos.principal = my_proxy_user/<principal>
livy.superusers = my_proxy_user

Whenever I try to create a batch, the batch executes successfully; however, YARN diagnostics throws the YarnException quoted below. When I try to create a session, the session ends up in the dead state, with the same YarnException in the YARN diagnostics.

Thanks,
Vamsi

On Tue, 21 Aug 2018 at 00:49, Harsch, Tim <tim.har...@teradata.com> wrote:

> Vamsi,
>
> It may be helpful if you read the thread on user impersonation from
> Tuesday, July 10, 2018.
>
> Do you have Hive SQL enabled? What related configurations do you have set
> in livy.conf? Are you using Kerberos as well?
>
> *From: *srungarapu vamsi <srungarapu1...@gmail.com>
> *Reply-To: *"user@livy.incubator.apache.org" <user@livy.incubator.apache.org>
> *Date: *Monday, August 20, 2018 at 3:36 AM
> *To: *"user@livy.incubator.apache.org" <user@livy.incubator.apache.org>
> *Subject: *Impersonation Error
>
> [External Email]
> ------------------------------
>
> Hi,
>
> We have a secure Kerberized CDH cluster.
>
> I am trying to test impersonation through the Livy REST server.
>
> I have created a proxy user which can impersonate other users while
> running Spark jobs.
>
> Here is the core-site.xml fragment for the proxy user "my_proxy_user":
>
> <property>
>   <name>hadoop.proxyuser.my_proxy_user.hosts</name>
>   <value>*</value>
> </property>
> <property>
>   <name>hadoop.proxyuser.my_proxy_user.groups</name>
>   <value>my_proxy_user_grp</value>
> </property>
> <property>
>   <name>hadoop.proxyuser.my_proxy_user.users</name>
>   <value>*</value>
> </property>
> </configuration>
>
> With these configurations, whenever I try to create a session or batch, I
> get the following error:
>
> org.apache.hadoop.yarn.exceptions.YarnException: User my_proxy_user does not have privilage to see this attempt appattempt_1533784624146_0323_-000001
>     at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.getApplicationAttemptReport(ClientRMService.java:375)
>     at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.getApplicationAttemptReport(ApplicationClientProtocolPBServiceImpl.java:355)
>     at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:425)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2211)
>
> I am able to run Spark jobs by passing the --proxyUser flag.
>
> Also, I am able to run YARN jobs successfully with HADOOP_PROXY_USER.
>
> However, on trying to create a session or batch, nothing appears in the
> stderr of the session/batch logs; only "Yarn Diagnostics" shows the above
> error.
>
> Can you please help me solve this issue?
>
> --
> /Vamsi

--
/Vamsi
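[Editor's note] For readers reproducing this setup, the impersonation path under discussion is exercised by POSTing a batch to the Livy REST endpoint with a "proxyUser" field. Below is a minimal sketch; the host, jar path, main class, and user name are placeholder assumptions, not values from this thread, and the request is built but not actually sent (that would need a live, Kerberized Livy server).

```python
import json
from urllib import request

# Livy server endpoint (placeholder host/port; adjust to your deployment).
LIVY_URL = "http://livy-host:8998/batches"

# Batch payload. "proxyUser" asks Livy to run the job as the named user,
# which requires livy.impersonation.enabled = true in livy.conf and
# hadoop.proxyuser.<livy-launch-user>.* rules in core-site.xml.
payload = {
    "file": "hdfs:///path/to/app.jar",    # placeholder application jar
    "className": "com.example.SparkApp",  # placeholder main class
    "proxyUser": "end_user",              # placeholder user to impersonate
}

req = request.Request(
    LIVY_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# request.urlopen(req) would submit the batch; on a Kerberized cluster the
# request must additionally carry SPNEGO authentication, which plain urllib
# does not provide.
print(json.dumps(payload))
```

The equivalent command-line check is a curl POST of the same JSON body, which is often the quickest way to separate a Livy-side configuration problem from a Hadoop proxy-user one.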