Hi, HDFS isn't a "replacement" for databases; it is a file system (the Hadoop Distributed File System).
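For a concrete sense of what that means, here is a minimal sketch of talking to HDFS through the Java FileSystem API. It is not from this thread; the NameNode address and paths are placeholders, so substitute whatever your cluster's fs.default.name actually is:

import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsFileSystemDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder NameNode address -- use your own fs.default.name here.
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://namenode:54310");

        FileSystem fs = FileSystem.get(conf);

        // Ordinary file-system operations: write a file, then list its directory.
        Path file = new Path("/user/demo/hello.txt");
        OutputStream out = fs.create(file, true);
        try {
            out.write("HDFS stores files, not tables.\n".getBytes("UTF-8"));
        } finally {
            out.close();
        }

        for (FileStatus status : fs.listStatus(new Path("/user/demo"))) {
            System.out.println(status.getPath() + "\t" + status.getLen() + " bytes");
        }

        fs.close();
    }
}

That is all HDFS itself gives you: directories, files, and streams of bytes. Anything database-like has to be layered on top of it.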
I suggest reading up on what Hadoop is first. Both the book "Hadoop: The Definitive Guide" by Tom White (O'Reilly) and the mildly dated, developer-focused article at http://developer.yahoo.com/hadoop/tutorial/ are great resources for understanding and getting started with Apache Hadoop, and for understanding how it can help in your work.

On Sat, Feb 18, 2012 at 11:35 AM, Ajit Deshpande <ajit.deshpa...@usain.com> wrote:

> Hi Steve,
>
> I am in the process of learning Hadoop.
>
> I have installed HDFS on SUSE 10.
>
> But I am confused about what to do next. I searched on Google but could not find much about it.
>
> So can anyone suggest what I need to do if I have to implement HDFS for different databases?
>
> Ajit Deshpande
> United Software Associates Pvt. Ltd.
> Pune, India
> www.usain.com
>
> *From:* Stuti Awasthi [mailto:stutiawas...@hcl.com]
> *Sent:* Friday, February 17, 2012 10:22 AM
> *To:* hdfs-user@hadoop.apache.org; Steve Lewis
> *Subject:* RE: facing issues in HDFSProxy
>
> After I commented out the line
> sslConf.set("proxy.http.test.listener.addr", conf.get("proxy.http.test.listener.addr"));
> in HdfsProxy.java, I am getting the error below:
>
> 2012-02-16 14:59:25,156 WARN org.mortbay.log: java.lang.NullPointerException
> 2012-02-16 14:59:25,156 INFO org.apache.hadoop.http.HttpServer: HttpServer.start() threw a non Bind IOException
> 2012-02-16 14:59:25,162 ERROR org.apache.hadoop.hdfsproxy.HdfsProxy: java.io.IOException: !JsseListener: java.lang.NullPointerException
>         at org.mortbay.jetty.security.SslSocketConnector.newServerSocket(SslSocketConnector.java:516)
>         at org.mortbay.jetty.bio.SocketConnector.open(SocketConnector.java:73)
>         at org.apache.hadoop.http.HttpServer.start(HttpServer.java:581)
>         at org.apache.hadoop.hdfsproxy.HdfsProxy.start(HdfsProxy.java:84)
>         at org.apache.hadoop.hdfsproxy.HdfsProxy.createHdfsProxy(HdfsProxy.java:135)
>         at org.apache.hadoop.hdfsproxy.HdfsProxy.main(HdfsProxy.java:141)
>
> Any ideas how to fix it?
>
> *From:* Denny Ye [mailto:denny...@gmail.com]
> *Sent:* Friday, February 17, 2012 7:40 AM
> *To:* hdfs-user@hadoop.apache.org; Steve Lewis
> *Subject:* Re: facing issues in HDFSProxy
>
> Sure, it's temporary code; you can delete that line.
>
> -Regards
> Denny Ye
>
> 2012/2/16 Stuti Awasthi <stutiawas...@hcl.com>
>
> Thanks Denny,
>
> Will commenting out the line below in HdfsProxy.java and rebuilding help?
>
> sslConf.set("proxy.http.test.listener.addr", conf.get("proxy.http.test.listener.addr"));
>
> Thanks
>
> *From:* Denny Ye [mailto:denny...@gmail.com]
> *Sent:* Thursday, February 16, 2012 2:27 PM
> *To:* hdfs-user@hadoop.apache.org
> *Subject:* Re: facing issues in HDFSProxy
>
> In my local Hadoop version, I saw the temporary code with a nonexistent property name.
> Hashtable does not accept 'null' as a normal value.
> It is a mistake left over from unit testing.
>
> -Regards
> Denny Ye
>
> 2012/2/16 Stuti Awasthi <stutiawas...@hcl.com>
>
> Hi all,
> Any pointers for this?
>
> *From:* Stuti Awasthi
> *Sent:* Wednesday, February 15, 2012 5:53 PM
> *To:* hdfs-user@hadoop.apache.org
> *Subject:* facing issues in HDFSProxy
>
> Hi,
> I am using Hadoop 1.0.0 and want to try HDFSProxy.
> I was following:
> http://hadoop.apache.org/hdfs/docs/r0.21.0/hdfsproxy.html#Jetty-based+Installation+and+Configuration
>
> When I run the start-hdfsproxy.sh script, I get the following error in the logs:
>
> 2012-02-15 17:32:02,676 INFO org.apache.hadoop.hdfsproxy.HdfsProxy: HDFS NameNode is at: namenode:54310
> 2012-02-15 17:32:02,677 ERROR org.apache.hadoop.hdfsproxy.HdfsProxy: java.lang.NullPointerException
>         at java.util.Hashtable.put(Hashtable.java:411)
>         at java.util.Properties.setProperty(Properties.java:160)
>         at org.apache.hadoop.conf.Configuration.set(Configuration.java:437)
>         at org.apache.hadoop.hdfsproxy.HdfsProxy.initialize(HdfsProxy.java:63)
>         at org.apache.hadoop.hdfsproxy.HdfsProxy.<init>(HdfsProxy.java:44)
>         at org.apache.hadoop.hdfsproxy.HdfsProxy.createHdfsProxy(HdfsProxy.java:135)
>         at org.apache.hadoop.hdfsproxy.HdfsProxy.main(HdfsProxy.java:142)
>
> After googling around, I found this JIRA:
> https://issues.apache.org/jira/browse/HADOOP-5432
> Should I apply this patch to fix the issue?
>
> Please suggest.
>
> Regards,
> Stuti Awasthi
> HCL Comnet Systems and Services Ltd
> F-8/9 Basement, Sec-3, Noida.

--
Harsh J
Customer Ops. Engineer
Cloudera | http://tiny.cloudera.com/about
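A note on the NullPointerException discussed in the quoted thread: the stack trace shows Configuration.set() failing inside Hashtable.put(), because the backing java.util.Properties rejects null values, and conf.get("proxy.http.test.listener.addr") returns null when that property is not defined anywhere. The sketch below only illustrates that failure path and a null-guard as one alternative to deleting the temporary line; it is not a patch against HdfsProxy itself:

import org.apache.hadoop.conf.Configuration;

public class ProxyNpeSketch {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        Configuration sslConf = new Configuration(false);

        // The property is never defined, so get() returns null.
        String testAddr = conf.get("proxy.http.test.listener.addr");

        // Configuration.set() stores values in a java.util.Properties (a Hashtable),
        // and Hashtable.put() throws NullPointerException for a null value -- the
        // same trace seen at Configuration.set(Configuration.java:437) above.
        // sslConf.set("proxy.http.test.listener.addr", testAddr);  // NPE when testAddr is null

        // One defensive alternative to deleting the line outright:
        if (testAddr != null) {
            sslConf.set("proxy.http.test.listener.addr", testAddr);
        } else {
            System.out.println("proxy.http.test.listener.addr not set; skipping.");
        }
    }
}

Guarding the get() result avoids the NPE while keeping the line in place for environments where the property is actually defined.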