I tried running the command below, but got the following error. I have not put the input into HDFS, since Lustre is what I am trying to use in place of HDFS.
[code]
#bin/hadoop jar hadoop-examples-1.1.1.jar wordcount /user/hadoop/hadoop /user/hadoop-output
13/02/17 17:02:50 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/02/17 17:02:50 INFO input.FileInputFormat: Total input paths to process : 1
13/02/17 17:02:50 WARN snappy.LoadSnappy: Snappy native library not loaded
13/02/17 17:02:50 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-hadoop/mapred/staging/root/.staging/job_201302161113_0004
13/02/17 17:02:50 ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:org.apache.hadoop.ipc.RemoteException: java.io.IOException: java.io.FileNotFoundException: File file:/tmp/hadoop-hadoop/mapred/staging/root/.staging/job_201302161113_0004/job.xml does not exist.
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3731)
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3695)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)
Caused by: java.io.FileNotFoundException: File file:/tmp/hadoop-hadoop/mapred/staging/root/.staging/job_201302161113_0004/job.xml does not exist.
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:397)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
        at org.apache.hadoop.mapred.JobInProgress.<init>(JobInProgress.java:406)
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3729)
        ... 12 more
org.apache.hadoop.ipc.RemoteException: java.io.IOException: java.io.FileNotFoundException: File file:/tmp/hadoop-hadoop/mapred/staging/root/.staging/job_201302161113_0004/job.xml does not exist.
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3731)
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3695)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)
Caused by: java.io.FileNotFoundException: File file:/tmp/hadoop-hadoop/mapred/staging/root/.staging/job_201302161113_0004/job.xml does not exist.
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:397)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
        at org.apache.hadoop.mapred.JobInProgress.<init>(JobInProgress.java:406)
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3729)
        ... 12 more
        at org.apache.hadoop.ipc.Client.call(Client.java:1107)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
        at org.apache.hadoop.mapred.$Proxy1.submitJob(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
        at org.apache.hadoop.mapred.$Proxy1.submitJob(Unknown Source)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
        at org.apache.hadoop.examples.WordCount.main(WordCount.java:67)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
[root@alpha hadoop]#
[/code]

On 2/17/13, linux freaker <linuxfrea...@gmail.com> wrote:
> Hello,
>
> I have 4 machines - 1 MDS, 1 OSS, 2 Linux clients. I need to run Hadoop
> over Lustre, replacing HDFS. I have put the setup details under
> http://paste.ubuntu.com/1661235/
>
> All I need to know is what I really need for Hadoop, and what
> configuration changes are needed?
> Please suggest.

_______________________________________________
Lustre-discuss mailing list
Lustre-discuss@lists.lustre.org
http://lists.lustre.org/mailman/listinfo/lustre-discuss