Hi,

For your first code snippet, the following method can be used instead, by
providing the corresponding AuthMethod:

/**
 * Create a user from a login name. It is intended to be used for remote
 * users in RPC, since it won't have any credentials.
 * @param user the full user principal name, must not be empty or null
 * @return the UserGroupInformation for the remote user.
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public static UserGroupInformation createRemoteUser(String user,
    AuthMethod authMethod) {
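
A minimal sketch of how a call could look (the user name and KERBEROS value
are only illustrations; pick the AuthMethod matching your setup, and note
the enum lives in org.apache.hadoop.security.SaslRpcServer):

    import org.apache.hadoop.security.SaslRpcServer.AuthMethod;
    import org.apache.hadoop.security.UserGroupInformation;

    // Tags the UGI with the given authentication method. As the javadoc
    // above says, the UGI still carries no credentials.
    UserGroupInformation ugi =
        UserGroupInformation.createRemoteUser("hadoop", AuthMethod.KERBEROS);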


This was added recently to trunk and branch-2 as the fix for HADOOP-10683;
it is not yet available in any release.
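
Until a release with that fix is out, the usual way to authenticate
programmatically against a secured cluster is a keytab login. A minimal
sketch, with placeholder principal and keytab values to replace with your
own:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    Configuration conf = new Configuration();
    // Requires hadoop.security.authentication=kerberos in core-site.xml
    // (or set on conf) so that the login below takes effect.
    UserGroupInformation.setConfiguration(conf);
    // Placeholder principal and keytab path.
    UserGroupInformation.loginUserFromKeytab(
        "hdfs/[email protected]", "/etc/security/keytabs/hdfs.keytab");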

Regards,
Vinay


On Wed, Jun 25, 2014 at 3:50 AM, Liu, David <[email protected]> wrote:

> Hi Nauroth,
>
> In this case, do you have an example of how to use the Java API to read
> data from secured HDFS?
>
> Thanks
>
>
>
>
> On Jun 25, 2014, at 2:24 AM, Chris Nauroth <[email protected]>
> wrote:
>
> > Hi David,
> >
> > UserGroupInformation.createRemoteUser does not attach credentials to the
> > returned ugi.  I expect the server side is rejecting the connection due
> to
> > lack of credentials.  This is actually by design.  The
> > UserGroupInformation.createRemoteUser method is primarily intended for
> use
> > on the server side when it wants to run a piece of its code while
> > impersonating the client.
> >
> > I'd say that your second code sample is the correct one.  After running
> > kinit to get credentials, you can just run your code.  I expect Kerberos
> > authentication to work without taking any special measures to call
> > UserGroupInformation directly from your code.
> >
> > Hope this helps.
> >
> > Chris Nauroth
> > Hortonworks
> > http://hortonworks.com/
> >
> >
> >
> > On Tue, Jun 24, 2014 at 6:29 AM, Liu, David <[email protected]>
> wrote:
> >
> >> Hi experts,
> >>
> >> After running kinit hadoop, when I run this Java file on a secured
> >> Hadoop cluster, I get the following error:
> >> 14/06/24 16:53:41 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> >> 14/06/24 16:53:41 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> >> 14/06/24 16:53:41 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> >> 14/06/24 16:53:41 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "hdsh2-a161/10.62.66.161"; destination host is: "hdsh2-a161.lss.emc.com":8020;
> >> Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "hdsh2-a161/10.62.66.161"; destination host is: "hdsh2-a161.lss.emc.com":8020;
> >>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
> >>         at org.apache.hadoop.ipc.Client.call(Client.java:1351)
> >>         at org.apache.hadoop.ipc.Client.call(Client.java:1300)
> >>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> >>         at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
> >>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>         at java.lang.reflect.Method.invoke(Method.java:606)
> >>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
> >>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >>         at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
> >>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:191)
> >>         at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1067)
> >>         at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1057)
> >>         at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1047)
> >>         at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:235)
> >>         at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:202)
> >>         at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:195)
> >>         at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1215)
> >>         at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:290)
> >>         at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:286)
> >>         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> >>         at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:286)
> >>         at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:763)
> >>         at Testhdfs$1.run(Testhdfs.java:43)
> >>         at Testhdfs$1.run(Testhdfs.java:30)
> >>         at java.security.AccessController.doPrivileged(Native Method)
> >>         at javax.security.auth.Subject.doAs(Subject.java:415)
> >>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> >>         at Testhdfs.main(Testhdfs.java:30)
> >>
> >>
> >> Here is my code:
> >>
> >> UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hadoop");
> >> ugi.doAs(new PrivilegedExceptionAction<Void>() {
> >>     public Void run() throws Exception {
> >>         Configuration conf = new Configuration();
> >>         FileSystem fs = FileSystem.get(URI.create(uri), conf);
> >>         FSDataInputStream in = fs.open(new Path(uri));
> >>         IOUtils.copy(in, System.out, 4096);
> >>         return null;
> >>     }
> >> });
> >>
> >> But when I run it without UserGroupInformation, like this on the same
> >> cluster with the same user, the code works fine:
> >>
> >> Configuration conf = new Configuration();
> >> FileSystem fs = FileSystem.get(URI.create(uri), conf);
> >> FSDataInputStream in = fs.open(new Path(uri));
> >> IOUtils.copy(in, System.out, 4096);
> >>
> >> Could anyone help me?
> >>
> >> Thanks
> >
>
>
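
To illustrate Chris's point above: after a successful kinit, the login user
already carries the Kerberos credentials, so the doAs wrapper is not needed
at all. If you want an explicit UGI anyway, wrap getLoginUser() rather than
createRemoteUser(). A sketch, with a placeholder URI and class name:

    import java.net.URI;
    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;
    import org.apache.hadoop.security.UserGroupInformation;

    public class TesthdfsFixed {
        public static void main(String[] args) throws Exception {
            // Placeholder; substitute your NameNode host and file path.
            final String uri = "hdfs://namenode-host:8020/tmp/sample.txt";

            // getLoginUser() picks up the ticket obtained by kinit, so the
            // doAs block runs with real Kerberos credentials, whereas
            // createRemoteUser() returns a UGI with no credentials, which
            // is what triggers "Client cannot authenticate via:[TOKEN,
            // KERBEROS]".
            UserGroupInformation ugi = UserGroupInformation.getLoginUser();
            ugi.doAs(new PrivilegedExceptionAction<Void>() {
                public Void run() throws Exception {
                    Configuration conf = new Configuration();
                    FileSystem fs = FileSystem.get(URI.create(uri), conf);
                    FSDataInputStream in = fs.open(new Path(uri));
                    try {
                        // Hadoop's IOUtils; 'false' keeps System.out open.
                        IOUtils.copyBytes(in, System.out, 4096, false);
                    } finally {
                        in.close();
                    }
                    return null;
                }
            });
        }
    }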


