strange json2sstable cast exception

2011-08-06 Thread Dan Kuebrich
Having run into a recurring compaction problem due to a corrupt sstable
(the perceived row size was 13 petabytes or something), I dumped the table
with sstable2json -x to exclude the bad key, and am now trying to re-import
the sstable without it.  However, I'm running into the following exception:

Importing 2882 keys...
java.lang.ClassCastException: org.apache.cassandra.db.ExpiringColumn cannot be cast to org.apache.cassandra.db.SuperColumn
    at org.apache.cassandra.db.SuperColumnSerializer.serialize(SuperColumn.java:363)
    at org.apache.cassandra.db.SuperColumnSerializer.serialize(SuperColumn.java:347)
    at org.apache.cassandra.db.ColumnFamilySerializer.serializeForSSTable(ColumnFamilySerializer.java:88)
    at org.apache.cassandra.db.ColumnFamilySerializer.serializeWithIndexes(ColumnFamilySerializer.java:107)
    at org.apache.cassandra.io.sstable.SSTableWriter.append(SSTableWriter.java:147)
    at org.apache.cassandra.tools.SSTableImport.importUnsorted(SSTableImport.java:290)
    at org.apache.cassandra.tools.SSTableImport.importJson(SSTableImport.java:252)
    at org.apache.cassandra.tools.SSTableImport.main(SSTableImport.java:476)
ERROR: org.apache.cassandra.db.ExpiringColumn cannot be cast to org.apache.cassandra.db.SuperColumn
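
For context, the round trip I'm attempting looks roughly like the following,
driven from Java here just to keep it reproducible. The paths, keyspace/CF
names, and row key are placeholders, and the flags (-x to exclude a key on
export; -K/-c for keyspace and column family on import) are as I understand
the 0.8-era tools; check the usage text on your version.

import java.io.File;

public class ReimportSketch {
    public static void main(String[] args) throws Exception {
        // 1. Dump the sstable to JSON, excluding the corrupt row key.
        //    sstable2json writes to stdout, so redirect it into a file.
        Process dump = new ProcessBuilder(
                "bin/sstable2json",
                "/var/lib/cassandra/data/Keyspace1/MyCF-g-123-Data.db",
                "-x", "DEADBEEF")  // bad row key, hex-encoded (placeholder)
            .redirectOutput(new File("dump.json"))
            .redirectError(ProcessBuilder.Redirect.INHERIT)
            .start();
        if (dump.waitFor() != 0)
            throw new RuntimeException("sstable2json failed");

        // 2. Rebuild a fresh sstable from the JSON dump.
        Process load = new ProcessBuilder(
                "bin/json2sstable",
                "-K", "Keyspace1", "-c", "MyCF",
                "dump.json",
                "/var/lib/cassandra/data/Keyspace1/MyCF-g-124-Data.db")
            .inheritIO()
            .start();
        if (load.waitFor() != 0)
            throw new RuntimeException("json2sstable failed");
    }
}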

The CF is a SuperColumnFamily, if that's relevant.
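
If I'm reading the trace right, the super-column serializer assumes every
top-level column it gets is a SuperColumn, while the importer handed it an
ExpiringColumn (a TTL'd standard column). A minimal sketch of that failure
shape (not the actual Cassandra source):

import java.util.Arrays;
import java.util.List;

public class CastSketch {
    interface IColumn {}                               // shared column interface
    static class ExpiringColumn implements IColumn {}  // standard column carrying a TTL
    static class SuperColumn implements IColumn {}     // container of subcolumns

    // Stand-in for SuperColumnSerializer.serialize(): it trusts that the
    // top level of a super column family only ever holds SuperColumns.
    static void serializeSuper(List<IColumn> topLevel) {
        for (IColumn c : topLevel) {
            SuperColumn sc = (SuperColumn) c;  // throws when c is an ExpiringColumn
            // ... would go on to write sc's subcolumns ...
        }
    }

    public static void main(String[] args) {
        // Reproduces the same ClassCastException as the import run above.
        serializeSuper(Arrays.<IColumn>asList(new ExpiringColumn()));
    }
}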

1. What should I do about this problem?

2. (somewhat unrelated) Our usage of this SCF has moved away from requiring
super-ness.  Aside from missing out on the potential for future secondary
indexes, are we suffering any sort of operational/performance hit from this
classification?


Re: strange json2sstable cast exception

2011-08-06 Thread Jonathan Ellis
You should probably upgrade; it looks like you have a version that
doesn't support sstable2json with expiring columns.

-- 
Jonathan Ellis
Project Chair, Apache Cassandra
co-founder of DataStax, the source for professional Cassandra support
http://www.datastax.com


Re: strange json2sstable cast exception

2011-08-06 Thread Dan Kuebrich
Forgot to mention: the node is a new install of 0.8.2, though the data was
streamed over from nodes that have been upgraded over time from 0.7.