DefaultCodec works. Is there an existing bug report about the problem with
GzipCodec? If not, I'll create one to track the problem.
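
For reference, here is the writer creation with DefaultCodec swapped in,
plus a quick read-back check (a minimal sketch based on the snippet
quoted below; ULTRecordJT and outP come from the original code, and the
appended key/value pair is just placeholder data):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.SequenceFile.CompressionType;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.compress.DefaultCodec;

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.getNamed("local", conf);

    // Write: the same createWriter call as before, with DefaultCodec
    // (zlib, no gzip headers) in place of GzipCodec.
    SequenceFile.Writer writer =
        SequenceFile.createWriter(fs, conf, new Path(outP),
            Text.class, ULTRecordJT.class,
            CompressionType.BLOCK, new DefaultCodec());
    writer.append(new Text("key"), new ULTRecordJT());
    writer.close();

    // Read back to verify the block-compressed file is readable.
    SequenceFile.Reader reader =
        new SequenceFile.Reader(fs, new Path(outP), conf);
    Text key = new Text();
    ULTRecordJT value = new ULTRecordJT();
    while (reader.next(key, value)) {
        // process key/value here
    }
    reader.close();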


> -----Original Message-----
> From: Owen O'Malley [mailto:[EMAIL PROTECTED]
> Sent: Monday, October 23, 2006 11:24 AM
> To: [email protected]
> Subject: Re: Error in reading block compressed sequence file
> 
> Arun's attempt to send this bounced, so I'll try...
> 
>    -- Owen
> 
> > From: Arun C Murthy <[EMAIL PROTECTED]>
> > Date: Mon, 23 Oct 2006 23:41:30 +0530
> > Subject: Re: Error in reading block compressed sequence file
> > To: [email protected]
> >
> > On Sun, Oct 22, 2006 at 04:31:23PM -0700, Runping Qi wrote:
> >
> >>
> >>
> >> I got an error when I tried to read a block-compressed sequence
> >> file I just created. Can anybody offer any clues?
> >>
> >> The file was created through a SequenceFile.Writer object,
> >> constructed as follows:
> >>
> >>     Configuration conf = new Configuration();
> >>     FileSystem fs = FileSystem.getNamed("local", conf);
> >>     SequenceFile.Writer theWriter =
> >>         SequenceFile.createWriter(fs, conf, new Path(outP),
> >>             Text.class, ULTRecordJT.class,
> >>             CompressionType.BLOCK, new GzipCodec());
> >
> >                                      ^^^^^^^^^^^^^^^^
> >
> >   You are using the GzipCodec, which for now won't work with
> > SequenceFiles (this should change with HADOOP-538). Can you replace
> > it with org.apache.hadoop.io.compress.DefaultCodec and give it a whirl?
> >
> >   Gzip is basically zlib plus headers, while the DefaultCodec is
> > zlib only... so the compression should be identical except for the
> > headers.
> >
> > thanks,
> > Arun
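
Regarding Arun's note that gzip is essentially zlib plus headers: a
quick way to see this with the JDK alone, outside Hadoop entirely, is
to compress the same bytes both ways and compare sizes (a standalone
sketch, not from the thread; gzip output is the same deflate payload
wrapped in a 10-byte header and an 8-byte CRC/length trailer):

    import java.io.ByteArrayOutputStream;
    import java.util.zip.DeflaterOutputStream;
    import java.util.zip.GZIPOutputStream;

    public class HeaderDemo {
        public static void main(String[] args) throws Exception {
            byte[] data = "hello hello hello hello".getBytes("UTF-8");

            // zlib-format stream, as DefaultCodec writes.
            ByteArrayOutputStream zlibBuf = new ByteArrayOutputStream();
            DeflaterOutputStream zlibOut = new DeflaterOutputStream(zlibBuf);
            zlibOut.write(data);
            zlibOut.close();

            // gzip-format stream, as GzipCodec writes: the same deflate
            // payload plus the gzip header and trailer.
            ByteArrayOutputStream gzipBuf = new ByteArrayOutputStream();
            GZIPOutputStream gzipOut = new GZIPOutputStream(gzipBuf);
            gzipOut.write(data);
            gzipOut.close();

            System.out.println("zlib bytes: " + zlibBuf.size());
            System.out.println("gzip bytes: " + gzipBuf.size());
        }
    }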

