Does it depend on shared libraries that are not accessible at
 runtime?
Are you passing all dependencies using the -file option?
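
If not, -file will ship the binary with the job so each task node gets
its own copy in the working directory. A sketch of what that looks
like (paths and names follow the example further down and may need
adjusting for your setup):

```shell
# -file copies ./mycat into every task's working directory,
# so "./mycat" resolves on the remote nodes as well.
hadoop jar $HADOOP_HOME/hadoop-streaming.jar \
  -file ./mycat \
  -mapper "./mycat" \
  -input cinput -output tmp_out \
  -reducer NONE
```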

Also, try running it on a one-node cluster and debug it as described
 here: http://wiki.apache.org/lucene-hadoop/HowToDebugMapReducePrograms

Here is a simple/dirty C program that behaves like cat:

/* Simple program that reads stdin and copies it to stdout, like cat */
#include <stdio.h>

int main(void) {
  char buffer[256];
  /* fgets reads at most sizeof(buffer) - 1 chars per call */
  while (fgets(buffer, sizeof(buffer), stdin)) {
    fputs(buffer, stdout);
  }
  return 0;
}

hadoop jar $HADOOP_HOME/hadoop-streaming.jar -mapper "./mycat" \
  -input cinput -output tmp_out -reducer NONE


----- Original Message ----
From: Christian Kremnitzer <[EMAIL PROTECTED]>
To: [email protected]
Sent: Tuesday, October 30, 2007 11:34:07 PM
Subject: Re: Hadoop-Streaming with C


my stderr log file tells me the following:
/tmp/hadoop-kremnitzer/mapred/local/taskTracker/jobcache/job_200710301154_0053/work/mapper.out:
 
/tmp/hadoop-kremnitzer/mapred/local/taskTracker/jobcache/job_200710301154_0053/work/mapper.out:
 
cannot execute binary file

I only have a /tmp/hadoop-kremnitzer/mapred/local/*jobTracker* folder

Christian




