Hadoop command without URI

2016-03-19 Thread Vinodh Nagaraj
Hi All,

I configured Hadoop 2.7.1 on 32-bit Windows.

core-site.xml
-
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost/</value>
  </property>
</configuration>
When I execute the command below, it shows an error:

C:\hadoop-2.7.1\hadoop-2.7.1\sbin>hadoop fs -ls
ls: `.': No such file or directory

If I execute it as below, it shows output:

C:\hadoop-2.7.1\hadoop-2.7.1\sbin>hadoop fs -ls hdfs://localhost/


Do I have to change anything in the core-site.xml file?

Any help, please.

Thanks & Regards,
Vinodh.N


Re: Hadoop command without URI

2016-03-19 Thread Namikaze Minato
I don't think you need to change anything, no.

If you run: hadoop fs -ls /

you should get the same output as from: hadoop fs -ls hdfs://localhost/
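The `ls: '.': No such file or directory` message usually means the current user's HDFS home directory does not exist yet, since a bare "hadoop fs -ls" lists the relative path "." (i.e. /user/<username>). A hedged sketch of a fix; the home-directory path and the %USERNAME% variable are assumptions for this Windows setup:

```shell
# Create the user's HDFS home directory so relative paths resolve
# (on a fresh cluster /user/<username> does not exist yet):
hadoop fs -mkdir -p /user/%USERNAME%

# Now the bare form should work, listing /user/<username>:
hadoop fs -ls

# Absolute paths work regardless of whether the home directory exists:
hadoop fs -ls /
```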

Regards,
LLoyd

On 18 March 2016 at 11:41, Vinodh Nagaraj  wrote:

> Hi All,
>
> I configured Hadoop  2.7.1 on Windows 32 bit.
>
> Core-site.xml
> -
> 
> 
> fs.defaultFS
> hdfs://localhost/
> 
>
> when  i execute the below command,it shows error.
>
> C:\hadoop-2.7.1\hadoop-2.7.1\sbin>*hadoop fs -ls*
> ls: `.': No such file or directory
>
> If i execute like below,it shows output.
>
> C:\hadoop-2.7.1\hadoop-2.7.1\sbin>*hadoop fs -ls hdfs://localhost/*
>
>
> *Do i have to change anything in core-site.xml file ?*
>
>
> *Any help please.*
>
> Thanks & Regards,
> Vinodh.N
>
>


best way to start and build hard core hadoop systems

2016-03-19 Thread prasad gadiraju
Hi all, can you please let us know the best way to start and build hard-core
Hadoop systems?

How does one master the Hadoop stack and get to an expert level in it?

regards
gadiraju


Hadoop Kerberos - Authentication issue IPC Server/Client

2016-03-19 Thread K. N. Ramachandran
Hi,

I have a Kerberos setup with Hadoop (single node cluster) in an Ubuntu
environment (VirtualBox setup).

We are using a variant of a Yarn application and the Client.java in this
variant opens a socket for communicating to the ApplicationMaster and
receiving messages.

Without Kerberos, this works fine. I am currently investigating whether the
entire structure will work with Kerberos too and what code changes would be
necessary. With Kerberos, a problem occurs at the socket connection part
and simply fails with errors outlined in the attached file
(kerbFailure.txt), a snippet of the errors is:
16/03/18 17:18:28 WARN ipc.Client: Exception encountered while connecting
to the server : org.apache.hadoop.security.AccessControlException: Client
cannot authenticate via:[KERBEROS]

Now I have enabled Kerberos authentication on the Hadoop cluster by
following the instructions at:
http://www.cloudera.com/documentation/archive/cdh/4-x/4-3-0/CDH4-Security-Guide/cdh4sg_topic_3.html

Since the stacktrace has references to SASL connection methods, should I
explicitly enable SASL authentication, following the instructions at:
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-common/SecureMode.html
?

My impression was that SASL data transfer is optional (only needed if I want
to start nodes as non-root), and I currently start the secure DataNode as
root with JSVC_HOME set, using the scripts in the sbin folder.

I can also verify that both client and server processes return the correct
Kerberos principal when I do:

UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
LOG.info("UGI: " + ugi + ", hasKerb: " + ugi.hasKerberosCredentials());
// outputs: UGI: ram@RAM-VIRTUALBOX (auth:KERBEROS), hasKerb: true
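One thing worth checking: if the custom Client.java opens its socket/RPC connection outside the logged-in user's security context, the SASL layer may not find the Kerberos credentials even though getCurrentUser() reports them. A minimal Java sketch of wrapping the connection in UserGroupInformation.doAs (the connectToApplicationMaster() call is a hypothetical placeholder for your own connection code):

```java
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.security.UserGroupInformation;

// Sketch: run the AM connection inside the Kerberos-authenticated UGI,
// so the IPC/SASL client can pick up this user's TGT during negotiation.
UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
ugi.doAs(new PrivilegedExceptionAction<Void>() {
  @Override
  public Void run() throws Exception {
    connectToApplicationMaster();  // placeholder for your socket/RPC setup
    return null;
  }
});
```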

I have hdfs and yarn as separate users. Both have their own Kerberos
principals and are authenticated through keytabs. My username is added as a
principal too and is authenticated with a password. So system startup and Yarn
job submission work fine, but I encounter the error at the socket connection
as described above.
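For reference, the client-side settings that enable Kerberos for Hadoop RPC live in core-site.xml; a minimal sketch of the relevant properties (whether your custom client actually loads this configuration file is an assumption worth verifying):

```xml
<!-- core-site.xml on the client side; without these the IPC client
     falls back to SIMPLE authentication and cannot negotiate
     SASL/Kerberos, producing "Client cannot authenticate via:[KERBEROS]". -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```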

Hope this overview helps. Please let me know if you might need more
information.

Thanking You,
K.N.Ramachandran
16/03/18 17:18:28 INFO util.Utilities: Connecting to ApplicationMaster at 
ram-VirtualBox/127.0.1.1:34718
16/03/18 17:18:28 INFO ipc.Client: Connecting to ram-VirtualBox/127.0.1.1:34718
16/03/18 17:18:28 INFO security.SaslRpcClient: Checking SaslClient, 
isClientNull: true
16/03/18 17:18:28 WARN ipc.Client: Exception encountered while connecting to 
the server : org.apache.hadoop.security.AccessControlException: Client cannot 
authenticate via:[KERBEROS]

Caused by: java.io.IOException: Failed on local exception: java.io.IOException: 
org.apache.hadoop.security.AccessControlException: Client cannot authenticate 
via:[KERBEROS]; Host Details : local host is: "ram-VirtualBox/127.0.1.1"; 
destination host is: "ram-VirtualBox":34718; 
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:773)
at org.apache.hadoop.ipc.Client.call(Client.java:1432)
at org.apache.hadoop.ipc.Client.call(Client.java:1359)
at 
org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:242)
... 10 more
Caused by: java.io.IOException: 
org.apache.hadoop.security.AccessControlException: Client cannot authenticate 
via:[KERBEROS]
at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:685)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at 
org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:648)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:736)
at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:373)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1494)
at org.apache.hadoop.ipc.Client.call(Client.java:1398)
... 12 more
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot 
authenticate via:[KERBEROS]
at 
org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)
at 
org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:404)
at 
org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:558)
at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:373)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:728)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:724)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:723)

How to use keystone v1.0 authentication of Openstack Swift(Oracle storage cloud)?

2016-03-19 Thread 황보동규
Hi there! I am writing this mail to ask for your advice.

As you know, the current version of Hadoop supports OpenStack Swift. Oracle 
Storage Cloud uses a REST API that is compatible with OpenStack Swift, so I 
tried to connect to Oracle Storage Cloud with the configuration from this 
page (https://hadoop.apache.org/docs/stable/hadoop-openstack/index.html), but 
I cannot connect and get this error message:

Invalid Response: Method POST on 
http://tajotest.storage.oraclecloud.com/auth/v1.0 failed, status code: 302, 
status line: HTTP/1.0 302 Found  POST 
http://tajotest.storage.oraclecloud.com/auth/v1.0 => 302
at 
org.apache.hadoop.fs.swift.http.SwiftRestClient.buildException(SwiftRestClient.java:1501)
at 
org.apache.hadoop.fs.swift.http.SwiftRestClient.perform(SwiftRestClient.java:1402)
at 
org.apache.hadoop.fs.swift.http.SwiftRestClient.authenticate(SwiftRestClient.java:1079)
at 
org.apache.hadoop.fs.swift.http.SwiftRestClient.authIfNeeded(SwiftRestClient.java:1298)
at 
org.apache.hadoop.fs.swift.http.SwiftRestClient.preRemoteCommand(SwiftRestClient.java:1314)
at 
org.apache.hadoop.fs.swift.http.SwiftRestClient.headRequest(SwiftRestClient.java:1015)
at 
org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystemStore.stat(SwiftNativeFileSystemStore.java:258)
at 
org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystemStore.getObjectMetadata(SwiftNativeFileSystemStore.java:213)
at 
org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystemStore.getObjectMetadata(SwiftNativeFileSystemStore.java:182)
at 
org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem.getFileStatus(SwiftNativeFileSystem.java:173)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)


I compared the code of Cyberduck (a GUI client for various cloud storage 
services such as S3, Google Cloud, and Oracle Storage Cloud) with Hadoop, and 
found that Cyberduck includes a separate method for each Keystone version, 
while Hadoop only uses AuthPostMethod, which is only compatible with Keystone 
v2.0. But Oracle Storage Cloud uses Keystone v1.0, so, if my understanding is 
correct, I have no idea how to connect from Hadoop.

Please check my understanding; if it is wrong, please correct me. 
If there is another method in Hadoop, can you explain how to connect to 
Oracle Storage Cloud?
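For reference, the hadoop-openstack connector is configured per service in core-site.xml; a sketch following the page linked above (the service name "oracle" and the credential values are placeholders, and, as you observe, the connector's authentication POST targets Keystone v2.0, so this may still fail against a v1.0 endpoint):

```xml
<!-- Swift filesystem configuration for a service named "oracle" (placeholder).
     The fs.swift.service.<name>.* property names come from the
     hadoop-openstack documentation. -->
<property>
  <name>fs.swift.service.oracle.auth.url</name>
  <value>http://tajotest.storage.oraclecloud.com/auth/v1.0</value>
</property>
<property>
  <name>fs.swift.service.oracle.username</name>
  <value>USERNAME</value>
</property>
<property>
  <name>fs.swift.service.oracle.password</name>
  <value>PASSWORD</value>
</property>
<property>
  <name>fs.swift.service.oracle.public</name>
  <value>true</value>
</property>
```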

Re: unsubscribe

2016-03-19 Thread Corey Nolet
Gerald,

In order to unsubscribe from this list, you need to send an email to
user-unsubscr...@hadoop.apache.org.

On Wed, Mar 16, 2016 at 4:39 AM, Gerald-G  wrote:
