I am a newbie, but analyzing large data sets is my use case, and I'm trying to figure out the best way to do it right now.
Quick question which I could not find in the FAQ or guide: if I make config changes (under /usr/local/hadoop/conf) on a Hadoop node (say the master), how do I restart my Hadoop services? I've put a rough sketch of what I'm guessing at the bottom of this message, below the quoted thread.

On Tue, May 8, 2012 at 3:21 PM, Andrei Savu <[email protected]> wrote:
> Awesome!
>
> How are you using the Hadoop clusters? Are you planning to add custom
> services to Whirr?
>
> -- Andrei
>
>
> On Tue, May 8, 2012 at 10:16 PM, Light Reader <[email protected]> wrote:
>
>> Slapping forehead.
>>
>> All good now. Thanks!!
>>
>>
>> On Tue, May 8, 2012 at 3:13 PM, Andrei Savu <[email protected]> wrote:
>>
>>> I suspect a Hadoop version mismatch. You need to set
>>> whirr.hadoop.tarball.url=...hadoop-1.0.1.tar.gz
>>>
>>> Can you try to login to the namenode and run hadoop fs -ls / ? That
>>> should work.
>>>
>>> -- Andrei Savu
>>>
>>>
>>> On Tue, May 8, 2012 at 8:46 PM, Light Reader <[email protected]> wrote:
>>>
>>>> Hi,
>>>> Looks like I am past some of my older issues. When I do a simple hadoop
>>>> fs -ls on my new cluster I get the following error:
>>>>
>>>> [root@ip-10-118-190-121 WCluster1]# ~/hadoop-1.0.1/bin/hadoop fs -ls /
>>>> 12/05/08 17:33:03 WARN conf.Configuration: DEPRECATED: hadoop-site.xml
>>>> found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use
>>>> core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of
>>>> core-default.xml, mapred-default.xml and hdfs-default.xml respectively
>>>> Bad connection to FS. command aborted. exception: Call to
>>>> ec2-23-22-79-169.compute-1.amazonaws.com/10.122.25.228:8020 failed on
>>>> local exception: java.net.SocketException: Broken pipe
>>>>
>>>> I am guessing this is more of a Hadoop issue but dunno which setting is
>>>> causing this to happen.
>>>>
>>>> Thanks,
>>>> Arni
>>>>
>>>
>>>
>>
>
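To make the restart question concrete, here is my rough guess, assuming the stock control scripts from the Hadoop 1.0.1 tarball are present under /usr/local/hadoop/bin on the Whirr-built cluster (that path is just my assumption, based on where the conf dir lives):

  # on the master, after editing files under /usr/local/hadoop/conf
  /usr/local/hadoop/bin/stop-all.sh     # stop the HDFS and MapReduce daemons
  /usr/local/hadoop/bin/start-all.sh    # start them again so they re-read conf/

  # or bounce a single daemon, e.g. only the namenode:
  /usr/local/hadoop/bin/hadoop-daemon.sh stop namenode
  /usr/local/hadoop/bin/hadoop-daemon.sh start namenode

Is that the right approach, or does Whirr manage the daemons some other way?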
