Anand,
My guess is that your alternatives setup isn't complete. At a prompt, as su, run the command 'alternatives --config java'. Make sure that the Oracle version is listed and is marked as the active one. If it is not, go through the steps to make sure it is (see the sketch appended at the end of this thread).

- rd

From: Anand Murali [mailto:[email protected]]
Sent: Wednesday, April 01, 2015 5:42 AM
To: [email protected]
Subject: Re: Hadoop 2.6 issue

I continue to get the same error. I export JAVA_HOME=/home/anand_vihar/jdk1.0.7_u75 (in hadoop-env.sh). When I echo $JAVA_HOME it shows me the above path, but when I run java -version, it reports the OpenJDK version. start-dfs.sh still errors out saying JAVA_HOME is not set, even though echo shows JAVA_HOME. Strange!

Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)

On Wednesday, April 1, 2015 2:22 PM, Anand Murali <[email protected]> wrote:

Ok, thanks. Shall do.

Sent from my iPhone

On 01-Apr-2015, at 2:19 pm, Ram Kumar <[email protected]> wrote:

Anand,

Try Oracle JDK instead of OpenJDK.

Regards,
Ramkumar Bashyam

On Wed, Apr 1, 2015 at 1:25 PM, Anand Murali <[email protected]> wrote:

Tried the export in hadoop-env.sh. It does not work either.

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)

On Wednesday, April 1, 2015 1:03 PM, Jianfeng (Jeff) Zhang <[email protected]> wrote:

Try to export JAVA_HOME in hadoop-env.sh.

Best Regards,
Jeff Zhang

From: Anand Murali <[email protected]>
Reply-To: "[email protected]" <[email protected]>, Anand Murali <[email protected]>
Date: Wednesday, April 1, 2015 at 2:28 PM
To: "[email protected]" <[email protected]>
Subject: Hadoop 2.6 issue

Dear All:

I am unable to start Hadoop even after setting HADOOP_INSTALL, JAVA_HOME and JAVA_PATH. Please find the error message below:

anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh --config /home/anand_vihar/hadoop-2.6.0/conf
Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
cat: /home/anand_vihar/hadoop-2.6.0/conf/slaves: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $JAVA_HOME
/usr/lib/jvm/java-1.7.0-openjdk-amd64
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $HADOOP_INSTALL
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $PATH
:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/home/anand_vihar/hadoop-2.6.0/bin:/home/anand_vihar/hadoop-2.6.0/sbin:/usr/lib/jvm/java-1.7.0-openjdk-amd64:/usr/lib/jvm/java-1.7.0-openjdk-amd64
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$

I have made no changes in hadoop-env.sh and have run it successfully.

core-site.xml

<?xml version="1.0"?>
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost/</value>
  </property>
</configuration>

hdfs-site.xml

<?xml version="1.0"?>
<!-- hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

mapred-site.xml

<?xml version="1.0"?>
<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>

I shall be thankful if somebody can advise.

Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)
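
For reference, a minimal sketch of the alternatives fix suggested at the top of this thread, assuming a Debian/Ubuntu system (where the command is update-alternatives rather than alternatives, which matches the /usr/lib/jvm/...-amd64 paths above) and assuming the Oracle JDK was unpacked to /home/anand_vihar/jdk1.7.0_75; that path and the priority value are hypothetical placeholders, not taken from the thread.

  # Register the Oracle java binary as an alternative (path and priority are placeholders).
  sudo update-alternatives --install /usr/bin/java java /home/anand_vihar/jdk1.7.0_75/bin/java 2000

  # Pick the Oracle entry interactively; on RHEL-style systems the equivalent
  # is `alternatives --config java`, run as root.
  sudo update-alternatives --config java

  # Confirm that the active JVM is now the Oracle JDK rather than OpenJDK.
  java -version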

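Likewise, a sketch of the hadoop-env.sh change discussed in the thread, under the same hypothetical JDK path. start-dfs.sh sources hadoop-env.sh from the directory passed via --config (HADOOP_CONF_DIR), and the "cat: .../conf/slaves: No such file or directory" message suggests that directory is missing the stock configuration files, so the export needs to be in the directory actually being used.

  # In the hadoop-env.sh inside the --config directory (or in etc/hadoop/hadoop-env.sh
  # when --config is not passed), point JAVA_HOME at the Oracle JDK.
  # The path below is a hypothetical placeholder.
  export JAVA_HOME=/home/anand_vihar/jdk1.7.0_75

  # One way to make the custom conf directory complete is to start from the
  # stock files shipped under etc/hadoop (verify before overwriting anything):
  cp /home/anand_vihar/hadoop-2.6.0/etc/hadoop/* /home/anand_vihar/hadoop-2.6.0/conf/

  # Then retry:
  start-dfs.sh --config /home/anand_vihar/hadoop-2.6.0/conf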