Hello,

I'm trying to put the contents of a regular Java Map into a
Hadoop MapFile but get a java.io.EOFException.

The source map has String document ids as keys and double
scores as values; the destination MapFile should have Text
keys and FloatWritable values.
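
In other words, each entry should end up written roughly like
this (just a sketch of the intent; the "/tmp/scores" path is
made up, and conf/fs stand in for the Configuration and
FileSystem set up in the real code below):

   MapFile.Writer w=new MapFile.Writer(conf,fs,"/tmp/scores",
      Text.class,FloatWritable.class);
   // MapFile expects keys to be appended in ascending order
   w.append(new Text("id1"),new FloatWritable(1.0358972f));
   w.close();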

Stack trace output, printed by the catch block in the code below:

%java maptest
error in stack: 0
class: java.io.DataInputStream
method: readFully
line: 178
error in stack: 1
class: java.io.DataInputStream
method: readFully
line: 152
error in stack: 2
class: org.apache.hadoop.io.SequenceFile$Reader
method: init
line: 1176
error in stack: 3
class: org.apache.hadoop.io.SequenceFile$Reader
method: <init>
line: 1161
error in stack: 4
class: org.apache.hadoop.io.SequenceFile$Reader
method: <init>
line: 1152
error in stack: 5
class: org.apache.hadoop.io.MapFile$Reader
method: <init>
line: 230
error in stack: 6
class: org.apache.hadoop.io.MapFile$Reader
method: <init>
line: 218
error in stack: 7
class: org.apache.hadoop.mapred.MapFileOutputFormat
method: getReaders
line: 75
error in stack: 8
class: maptest
method: main
line: 42

Any idea what's missing? From the trace, the EOFException is
thrown while the MapFileOutputFormat.getReaders call (line 42
of my main) tries to open readers on /tmp.

Thanks!

Peter W.


code:

import java.util.*;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.MapFile;
import org.apache.hadoop.io.FloatWritable;
import org.apache.hadoop.mapred.MapFileOutputFormat;

public class maptest
   { // Regular map to HDFS map
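   // writer for the destination MapFile (on disk, a MapFile is a
   // directory holding a data file and an index file)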
   private static MapFile.Writer mfw;

   private static final String[] doc_ids=
      {"id1","id2","id3","idn"};

   private static final String[] doc_scrs=
      {"1.0358971666032004","0.7257738095238095",
       "0.5772310234222495","0.5228012820512821"};

   public static void main(String[] args)
      {
      // build the regular in-memory map of doc id -> score
      Map<String,Double> m=
         Collections.synchronizedMap(new HashMap<String,Double>());
      for(int i=0;i<doc_ids.length;i++)
         {
         m.put(doc_ids[i],Double.parseDouble(doc_scrs[i]));
         }

      try
         {
         // init
         Configuration c=new Configuration();
         FileSystem f=FileSystem.get(c);
         Path p=new Path("/tmp");

         // open readers on /tmp to discover the key/value classes
         // (this is the getReaders call at "line: 42" in the trace)
         MapFile.Reader[] rdrs=MapFileOutputFormat.getReaders(f,p,c);
         Class kc=rdrs[0].getKeyClass();
         Class vc=rdrs[0].getValueClass();

         mfw=new MapFile.Writer(c,f,"/tmp",kc,vc);

         // copy entries into the MapFile as Text -> FloatWritable;
         // MapFile requires keys appended in ascending order
         List<String> keys=new ArrayList<String>(m.keySet());
         Collections.sort(keys);
         for(String key:keys)
            {
            String tmp_id=key.trim();
            float score=m.get(key).floatValue();
            mfw.append(new Text(tmp_id),new FloatWritable(score));
            }

         mfw.close();
         }
      catch(Throwable e)
         {
         // print each stack frame the same way the output above shows
         StackTraceElement[] ste=e.getStackTrace();
         for(int i=0;i<ste.length;i++)
            {
            System.out.println("error in stack: "+i);
            System.out.println("class: "+ste[i].getClassName());
            System.out.println("method: "+ste[i].getMethodName());
            System.out.println("line: "+ste[i].getLineNumber());
            } // for
         }
      } // main

   } // maptest
