Hello,

I am trying to write a C++ pipe application.
My mapper looks like this:

void MyMap::map(HadoopPipes::MapContext& context) {
    assert(pInterface != NULL);    
    //cerr<<"Map - now emitting..."<<endl;
    
    string inputValue = context.getInputValue();
    
    std::map<string, vector<string> > m =
pInterface->getKeyValuePairs(inputValue);
    
    for (std::map<string, vector<string> >::iterator i; i != m.end(); i++) {
        for (int j=0; j<i->second.size(); j++) {
            context.emit(i->first, i->second[j]);
            //cerr<<"Map - now emitting... "<<i->first<<"
"<<i->second[j]<<endl;
        }
    }
    
    //cerr<<"Done emitting"<<endl;
}

and the reducer looks like this:

void MyReduce::reduce(HadoopPipes::ReduceContext& context) {    
    vector<string> values;
    
    while (context.nextValue()) {
        values.push_back(context.getInputValue());
    }
    
    string s = pInterface->reduceMappedValues(context.getInputKey(),
values);
    
    //cerr<<"Reduce - now emitting..."<<endl;
    
    context.emit(context.getInputKey(), s);
}

I know for sure that pInterface->getKeyValuePairs(inputValue) ends
gracefully and returns the needed map
in the map() function - however, the loop that does context.emit in map() is
never called (at least the cerr's that were uncommented at first were never
displayed). I thought at first that I may be interferring with the pipe with
the cerr, so I removed the cerr's. Still, I get the same error:

07/11/26 22:21:01 INFO mapred.JobClient:  map 50% reduce 0%
07/11/26 22:21:02 INFO mapred.JobClient:  map 75% reduce 0%
07/11/26 22:21:02 INFO mapred.JobClient: Task Id :
task_200711161538_5478_m_000000_0, Status : FAILED
java.io.IOException: pipe child exception
        at
org.apache.hadoop.mapred.pipes.Application.abort(Application.java:134)       
at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:83)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
        at
org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1760)Caused
by: java.io.EOFException
        at java.io.DataInputStream.readByte(DataInputStream.java:243)
        at
org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:313)
        at
org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:335)
        at
org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:112)

07/11/26 22:21:07 INFO mapred.JobClient: Task Id :
task_200711161538_5478_m_000000_1, Status : FAILED

I am not even sure what this error means.

Any ideas what could that mean (the whole Java error thing)? Also, am I
allowed to use cerr in map/reduce? (I can see the displayed messages, but I
am not sure it interferes with the pipe).

Thanks,

Jerr.

-- 
View this message in context: 
http://www.nabble.com/C%2B%2B-pipe-application-error-tf4879558.html#a13964042
Sent from the Hadoop Users mailing list archive at Nabble.com.

Reply via email to