Dear Mr Jonathan,

With the larger sstable, I don't have any problem, so I think the error
is not related to the heap size. My data model does not use
SuperColumns, so I don't think the number of columns per row is the
problem.

I have tried to delete the corrupt row and accept the data loss.
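For reference, the `UTFDataFormatException` quoted below is what `DataInputStream.readUTF` throws when the bytes after the 2-byte length prefix are not valid modified UTF-8, which is what happens when a serialized row key is corrupt on disk. A minimal, self-contained sketch of that failure mode (the class and helper names here are illustrative, not Cassandra code):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.UTFDataFormatException;

public class CorruptKeyDemo {
    // Returns true if the bytes fail to decode as a readUTF payload --
    // the same failure mode compaction and sstable2json hit on a corrupt key.
    static boolean isCorruptKey(byte[] serialized) {
        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(serialized))) {
            in.readUTF();
            return false;
        } catch (UTFDataFormatException e) {
            return true;  // e.g. "malformed input around byte 13"
        } catch (IOException e) {
            return true;  // truncated data
        }
    }

    public static void main(String[] args) {
        // Valid payload: 2-byte big-endian length (3), then the string bytes.
        byte[] good = {0, 3, 'k', 'e', 'y'};
        // Corrupt payload: 0xC0 opens a two-byte UTF-8 sequence that is
        // never completed, so decoding fails partway through the key.
        byte[] bad = {0, 3, 'k', (byte) 0xC0, 'x'};
        System.out.println(isCorruptKey(good)); // false
        System.out.println(isCorruptKey(bad));  // true
    }
}
```

This is why the exception surfaces in `KeyScanningIterator.next` rather than earlier: the corruption is only detected when the key bytes are actually decoded.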

On Fri, Jan 8, 2010 at 11:08 AM, Jonathan Ellis <jbel...@gmail.com> wrote:

> How many columns do you have in your rows?  How big a heap are you
> giving to sstable2json?
>
> On Thu, Jan 7, 2010 at 9:37 PM, JKnight JKnight <beukni...@gmail.com>
> wrote:
> > Yes. The error is ERROR: Out of memory deserializing row 2829049.
> > I've tried to delete this row immediately, but Cassandra does not
> > currently support immediate deletion. Maybe I have to code it myself.
> >
> > On Thu, Jan 7, 2010 at 12:28 PM, Jonathan Ellis <jbel...@gmail.com>
> wrote:
> >>
> >> do you get any errors when running sstable2json on the files being
> >> compacted?
> >>
> >> On Thu, Jan 7, 2010 at 3:49 AM, JKnight JKnight <beukni...@gmail.com>
> >> wrote:
> >> > Dear all,
> >> >
> >> > In compact step, I found the error in file SSTableScanner.java at the
> >> > following method
> >> >         public IteratingRow next()
> >> >         {
> >> >             try
> >> >             {
> >> >                 if (row != null)
> >> >                     row.skipRemaining();
> >> >                 assert !file.isEOF();
> >> >                 return row = new IteratingRow(file, sstable);
> >> >             }
> >> >             catch (IOException e)
> >> >             {
> >> >                 logger.debug("IteratingRow next Exception " + sstable.getFilename());
> >> >                 throw new RuntimeException(e);
> >> >             }
> >> >         }
> >> > The error is: Caused by: java.lang.RuntimeException:
> >> > java.io.UTFDataFormatException: malformed input around byte 13
> >> >     at
> >> >
> >> >
> org.apache.cassandra.io.SSTableScanner$KeyScanningIterator.next(SSTableScanner.java:120)
> >> >
> >> > Can I fix the corrupt file, and how can I do it?
> >> > I have tried almost every Cassandra version and the error still occurs.
> >> >
> >> > Thanks a lot for your support.
> >> >
> >> > --
> >> > Best regards,
> >> > JKnight
> >> >
> >
> >
> >
> > --
> > Best regards,
> > JKnight
> >
>



-- 
Best regards,
JKnight
