From the Hadoop docs, only Linux and Windows are supported platforms. Is it possible to run Hadoop on Solaris? Is Hadoop implemented in pure Java? What kinds of problems would there be in porting it to Solaris? Thanks in advance.

hi,

no one seems to have replied to the previous "hadoop on Solaris" thread.

I just tried running Hadoop on Solaris 5.10 and got the error messages below. If you can give any advice, I would appreciate it. (Standalone operation seems to work.)


I tried pseudo-distributed operation.

First, I had to add /usr/ucb to the PATH so that "whoami" can be found.

Then, I set JAVA_HOME in conf/hadoop-env.sh.

I set conf/hadoop-site.xml as below
*****************************************
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/hadoop/hadoop-dist/filesystem/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/hadoop/hadoop-dist/filesystem/data</value>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <value>/home/hadoop/hadoop-dist/filesystem/mapred/system</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/home/hadoop/hadoop-dist/filesystem/mapred/local</value>
  </property>
</configuration>
****************************************************
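(Incidentally, the deprecation warnings in the job output below suggest that fs.default.name can also be written with an explicit URI scheme. A possible variant of that property, following the form the warning itself recommends:)

```xml
<property>
  <name>fs.default.name</name>
  <!-- explicit hdfs:// scheme, as suggested by the fs.FileSystem warning -->
  <value>hdfs://localhost:9000/</value>
</property>
```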

I think the configuration is conventional.

Then I started the daemons, and they all seemed to start without problems.
I can also copy the example "conf" directory into HDFS.

I tried the "grep" example program, but got the errors below.

*****************errors*******************
> ./bin/hadoop jar hadoop-0.17.0-examples.jar grep test_input test_output 'dfs[a-z.]+'
08/06/17 14:51:06 WARN fs.FileSystem: "localhost:9000" is a deprecated filesystem name. Use "hdfs://localhost:9000/" instead.
08/06/17 14:51:07 WARN fs.FileSystem: "localhost:9000" is a deprecated filesystem name. Use "hdfs://localhost:9000/" instead.
08/06/17 14:51:09 WARN fs.FileSystem: "localhost:9000" is a deprecated filesystem name. Use "hdfs://localhost:9000/" instead.
08/06/17 14:51:09 INFO mapred.FileInputFormat: Total input paths to process : 10
08/06/17 14:51:11 INFO mapred.JobClient: Running job: job_200806171448_0001
08/06/17 14:51:12 INFO mapred.JobClient:  map 0% reduce 0%
08/06/17 14:51:17 INFO mapred.JobClient: Task Id : task_200806171448_0001_m_000000_0, Status : FAILED
Error initializing task_200806171448_0001_m_000000_0:
java.io.IOException
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:175)
        at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:68)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1280)
        at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1291)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:108)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:632)
        at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1274)
        at org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:915)
        at org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:1310)
        at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2251)
Caused by: javax.security.auth.login.LoginException: Login failed: whoami: not found
        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:173)
        ... 11 more

08/06/17 14:51:17 WARN mapred.JobClient: Error reading task outputhttp://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_0&filter=stdout
08/06/17 14:51:17 WARN mapred.JobClient: Error reading task outputhttp://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_0&filter=stderr
08/06/17 14:51:22 INFO mapred.JobClient: Task Id : task_200806171448_0001_m_000000_1, Status : FAILED
Error initializing task_200806171448_0001_m_000000_1:
java.io.IOException
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:175)
        at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:68)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1280)
        at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1291)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:108)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:632)
        at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1274)
        at org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:915)
        at org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:1310)
        at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2251)
Caused by: javax.security.auth.login.LoginException: Login failed: whoami: not found
        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:173)
        ... 11 more

08/06/17 14:51:22 WARN mapred.JobClient: Error reading task outputhttp://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_1&filter=stdout
08/06/17 14:51:22 WARN mapred.JobClient: Error reading task outputhttp://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_1&filter=stderr
08/06/17 14:51:22 INFO mapred.JobClient: Task Id : task_200806171448_0001_m_000000_2, Status : FAILED
Error initializing task_200806171448_0001_m_000000_2:
java.io.IOException
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:175)
        at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:68)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1280)
        at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1291)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:108)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:632)
        at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1274)
        at org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:915)
        at org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:1310)
        at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2251)
Caused by: javax.security.auth.login.LoginException: Login failed: whoami: not found
        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:173)
        ... 11 more

08/06/17 14:51:22 WARN mapred.JobClient: Error reading task outputhttp://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_2&filter=stdout
08/06/17 14:51:22 WARN mapred.JobClient: Error reading task outputhttp://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_2&filter=stderr
08/06/17 14:51:27 INFO mapred.JobClient:  map 100% reduce 100%
08/06/17 14:51:28 WARN fs.FileSystem: "localhost:9000" is a deprecated filesystem name. Use "hdfs://localhost:9000/" instead.
java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1062)
        at org.apache.hadoop.examples.Grep.run(Grep.java:69)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.Grep.main(Grep.java:93)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:53)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
*****************errors*******************
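The root cause in the traces is "Login failed: whoami: not found": the PATH set in the interactive shell apparently does not reach the TaskTracker when it initializes tasks. A possible workaround (untested on Solaris, inferred only from the error above) is to export the PATH in conf/hadoop-env.sh, which the daemon start-up scripts source, so every daemon and spawned task inherits it:

```shell
# conf/hadoop-env.sh -- sourced by bin/hadoop and the start-*.sh scripts,
# so the NameNode/DataNode/JobTracker/TaskTracker all pick this up.
# On Solaris, whoami lives in /usr/ucb rather than /usr/bin.
export PATH=/usr/ucb:$PATH
```

After adding this, the daemons would need to be restarted (stop-all.sh / start-all.sh) for the new environment to take effect.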

I appreciate your help.

satoshi


--------------------------------------------------------------------------------
Satoshi YAMADA <[EMAIL PROTECTED]>
Department of Computer Science and Communication
Engineering, Graduate School of Information Science and
Electrical Engineering, Kyushu University
