Please add this entry to /etc/hosts:

127.0.0.1 localhost

Then restart the namenode (or reboot the machine and restart the namenode afterwards). If that doesn't work, try formatting the namenode using the command "hdfs namenode -format".
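On a CDH 5 pseudo-distributed node, the steps above would look roughly like this (just a sketch; the service name hadoop-hdfs-namenode assumes the CDH packaged init scripts are in use):

    # check that the loopback entry is present in /etc/hosts
    grep localhost /etc/hosts

    # restart the namenode service
    sudo service hadoop-hdfs-namenode restart

    # last resort: reformat the namenode (this erases existing HDFS metadata)
    sudo -u hdfs hdfs namenode -format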
On Thu, Sep 18, 2014 at 8:25 PM, Vandana kumari <[email protected]> wrote:
> No, it's not working, Ravindra; it is showing the following error:
>
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
> ls: Call From belief/127.0.0.1 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>
> On Thu, Sep 18, 2014 at 4:50 PM, Ravindra <[email protected]> wrote:
>> A quick fix is to run this command:
>> ln -s /etc/hadoop/conf.pseudo /etc/hadoop/conf
>>
>> On Thu, Sep 18, 2014 at 3:46 PM, Vandana kumari <[email protected]> wrote:
>>> /etc/hadoop/conf/ is not present. Earlier I tried installing Hadoop from Apache Hadoop and from CDH4 as well, but I uninstalled both and am still unable to figure out the error.
>>>
>>> On Thu, Sep 18, 2014 at 3:41 PM, Ravindra <[email protected]> wrote:
>>>> Please check whether /etc/hadoop/conf/ is present and whether it is a symbolic link to some other directory that doesn't exist.
>>>> Ideally this should be taken care of by the installer. Are you sure you didn't have an already existing Hadoop setup on that machine?
>>>>
>>>> Regards,
>>>> Ravindra
>>>>
>>>> On Thu, Sep 18, 2014 at 3:12 PM, Vandana kumari <[email protected]> wrote:
>>>>> During the installation process the /etc/hadoop/conf.pseudo/ directory was created, which contains all the HDFS configuration files: core-site.xml, hdfs-site.xml, mapred-site.xml and yarn-site.xml.
>>>>>
>>>>> On Thu, Sep 18, 2014 at 2:53 PM, Ravindra <[email protected]> wrote:
>>>>>> Please check whether /etc/hadoop/conf/ exists.
>>>>>> If it exists, then export the environment variable HADOOP_CONF_DIR set to this path.
>>>>>>
>>>>>> On Thu, Sep 18, 2014 at 2:46 PM, Vandana kumari <[email protected]> wrote:
>>>>>>> Hello all,
>>>>>>> I am manually installing CDH 5 with YARN on a single Linux node in pseudo-distributed mode on CentOS 6 64-bit. Whenever I run any hdfs command I get the error: "core-site.xml" not found.
>>>>>>> I am using a proxy server. Please help me solve this problem. The error file is attached herewith.
>>>>>>>
>>>>>>> --
>>>>>>> Thanks and regards
>>>>>>> Vandana kumari
>
> --
> Thanks and regards
> Vandana kumari
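For reference, the configuration checks suggested in the thread above can be run roughly as follows (a sketch, assuming the CDH pseudo-distributed package layout under /etc/hadoop; the netstat check is only a suggested way to confirm the namenode is listening on port 8020):

    # check whether /etc/hadoop/conf exists and whether it is a dangling symlink
    ls -l /etc/hadoop/conf

    # recreate the symlink to the pseudo-distributed configuration if it is missing
    sudo ln -s /etc/hadoop/conf.pseudo /etc/hadoop/conf

    # or point Hadoop at the configuration directory explicitly
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    # confirm something is listening on the namenode port before retrying hdfs commands
    netstat -tln | grep 8020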
