Dear Mr Jonathan,
I've patched the code with 720.patch, run SSTableExport, and got this error:
java.lang.OutOfMemoryError: Java heap space
        at org.apache.cassandra.db.ColumnSerializer.deserialize(ColumnSerializer.java:84)
It turns out that it's not just a corrupt row -- the second half of
the Data file is overwritten with index entries instead of actual
data.
I'll track progress in https://issues.apache.org/jira/browse/CASSANDRA-720.
-Jonathan
On Sun, Jan 17, 2010 at 10:30 PM, Jonathan Ellis jbel...@gmail.com wrote:
The row size data is incorrect, so there's no way to recover using
just the data file. It can be done by using the redundant information
in the index, though. Should get that done tomorrow.
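The index-based recovery described above can be roughly sketched as follows. This is a minimal sketch, not Cassandra's actual code: it assumes the 0.5-era index layout of a length-prefixed key followed by a long offset into the data file, and `IndexScan`/`readIndex` are illustrative names. Consecutive offsets bound each row, which is what makes recovery possible when the row sizes stored in the data file are wrong.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of index-driven recovery: each -Index.db entry is assumed to be a
// length-prefixed key followed by the row's offset in the -Data.db file.
// The offsets give row boundaries independently of the (corrupt) row sizes.
public class IndexScan {
    static Map<String, Long> readIndex(DataInputStream in) throws IOException {
        Map<String, Long> offsets = new LinkedHashMap<>();
        while (in.available() > 0) {
            String key = in.readUTF();     // 2-byte length + key bytes
            long position = in.readLong(); // offset of the row in Data.db
            offsets.put(key, position);
        }
        return offsets;
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny fake index in memory so the sketch runs end to end.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeUTF("row1"); out.writeLong(0L);
        out.writeUTF("row2"); out.writeLong(512L);
        Map<String, Long> offsets = readIndex(
            new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
        // Consecutive offsets bound each row: row1 spans [0, 512) in Data.db.
        System.out.println(offsets); // prints {row1=0, row2=512}
    }
}
```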
On Thu, Jan 14, 2010 at 9:35 PM, Jonathan Ellis jbel...@gmail.com wrote:
I am working on a patch for you.
On Thu, Jan 14, 2010 at 9:21 PM, JKnight JKnight beukni...@gmail.com wrote:
Dear all,
This is my data model:

<Keyspace Name="FeedUsers">
  <ColumnFamily CompareWith="BytesType" Name="FeedUsersHome" />
</Keyspace>
Could you help me detect the problem?
Thanks a lot.
What is your CF definition in your config file?
On Sun, Jan 10, 2010 at 7:59 PM, JKnight JKnight beukni...@gmail.com wrote:
The attachment contains the data that raises the error in the compaction step.
Could you help me detect the problem?
Dear Mr Jonathan,
Did you get the attachment?
On Sun, Jan 10, 2010 at 8:59 PM, JKnight JKnight beukni...@gmail.com wrote:
The attachment contains the data that raises the error in the compaction step.
Could you help me detect the problem?
On Fri, Jan 8, 2010 at 3:09 PM, Jonathan Ellis jbel...@gmail.com wrote:
Dear Mr Jonathan,
With the larger sstable, I don't have any problem. So I think the error
is not related to the heap size. And my data model does not use
SuperColumn, so I think the number of columns in a row is not the
problem.
I have tried to delete the error row and accept the data loss.
Can you gzip the sstable that OOMs and send it to me off-list?
On Fri, Jan 8, 2010 at 11:26 AM, JKnight JKnight beukni...@gmail.com wrote:
Dear Mr Jonathan,
With the larger sstable, I don't have any problem. So I think the error
is not related to the heap size. And my data model does
Dear all,
In the compaction step, I found the error in SSTableScanner.java, in the
following method:
public IteratingRow next()
{
    try
    {
        if (row != null)
            row.skipRemaining();
        assert !file.isEOF();
Do you get any errors when running sstable2json on the files being compacted?
On Thu, Jan 7, 2010 at 3:49 AM, JKnight JKnight beukni...@gmail.com wrote:
Dear all,
In the compaction step, I found the error in SSTableScanner.java, in the
following method:
public IteratingRow next()
Yes. The error is "ERROR: Out of memory deserializing row 2829049."
I've tried to delete this row immediately, but Cassandra does not
currently support immediate deletion. Maybe I will have to code it myself.
On Thu, Jan 7, 2010 at 12:28 PM, Jonathan Ellis jbel...@gmail.com wrote:
do you get any errors when
How many columns do you have in your rows? How big a heap are you
giving to sstable2json?
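For anyone hitting the same OOM: the heap given to sstable2json can be raised before retrying the dump. A minimal sketch, assuming the bin/sstable2json wrapper passes JVM_OPTS through to java the way the other 0.5-era Cassandra scripts do (the data-file path in the comment is illustrative):

```shell
# Assumption: bin/sstable2json honors $JVM_OPTS like bin/cassandra does;
# if your copy hard-codes the heap, raise the -Xmx value inside the script.
export JVM_OPTS="-Xmx2G"
# bin/sstable2json /var/lib/cassandra/data/FeedUsers/FeedUsersHome-1-Data.db > dump.json
echo "JVM_OPTS=$JVM_OPTS"
```

If the tool still runs out of memory with a much larger heap, the problem is more likely a corrupt row size than a genuinely huge row, which is what the rest of this thread converges on.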
On Thu, Jan 7, 2010 at 9:37 PM, JKnight JKnight beukni...@gmail.com wrote:
Yes. The error is ERROR: Out of memory deserializing row 2829049.
I've tried to delete this row immediately. But now Cassandra