I have a nine-node cluster that I had shut down (Cassandra stopped on all
nodes, then all nodes powered off) and have just tried to start back up. I have
done this successfully several times. However, on this attempt one of the nodes
failed to join the cluster. Upon inspecting /var/log/cassandra/system.log,

I am bulk importing a large number of SSTables that I pre-generated using the
bulk load process outlined at
https://github.com/yukim/cassandra-bulkload-example
I am using the 'sstableloader' utility to import them into a nine-node
Cassandra cluster.
During the sstableloader execution, I
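For reference, the invocation I'm using is along these lines; the host address and the SSTable directory path are placeholders, not my actual values:

```shell
# Stream the pre-generated SSTables into the cluster. The -d flag takes one or
# more initial contact nodes; the directory must end in <keyspace>/<table> so
# sstableloader can infer where the data belongs. 192.0.2.10, "mykeyspace",
# and "mytable" are placeholders.
sstableloader -d 192.0.2.10 data/mykeyspace/mytable
```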

I'm following the example here for doing a bulk import into Cassandra:
https://github.com/yukim/cassandra-bulkload-example
Is there a way to get the number of rows written to an SSTable set created via
CQLSSTableWriter, without importing the SSTable set into Cassandra?
I'd like to do some QA on
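Since CQLSSTableWriter does not appear to expose a row count, the workaround I'm considering is to count rows myself as they are handed to the writer. A minimal sketch (the keyspace, table, columns, and output directory are hypothetical):

```java
import java.io.File;
import org.apache.cassandra.io.sstable.CQLSSTableWriter;

public class CountingBulkWriter {
    public static void main(String[] args) throws Exception {
        // Hypothetical schema and insert statement for illustration only.
        String schema = "CREATE TABLE ks.events (id int PRIMARY KEY, payload text)";
        String insert = "INSERT INTO ks.events (id, payload) VALUES (?, ?)";

        CQLSSTableWriter writer = CQLSSTableWriter.builder()
                .inDirectory(new File("/tmp/ks/events"))
                .forTable(schema)
                .using(insert)
                .build();

        long rowsWritten = 0;                 // our own counter
        for (int i = 0; i < 1000; i++) {
            writer.addRow(i, "payload-" + i);
            rowsWritten++;                    // count every row handed to the writer
        }
        writer.close();

        // The count can then be checked against the source data as a QA step.
        System.out.println("rows written: " + rowsWritten);
    }
}
```

This only verifies what was handed to the writer, not what landed on disk, but it is enough to cross-check the generated SSTable set against the source data before loading.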

I'm following the Cassandra bulk import example here:
https://github.com/yukim/cassandra-bulkload-example
Are the Cassandra data types inet, smallint, and tinyint supported by the
bulk-import CQLSSTableWriter?
I can't seem to get them to work...
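For context, here is a minimal sketch of what I'm attempting (the table and values are hypothetical). One thing worth noting: smallint and tinyint only exist from Cassandra 2.2 onward, so a CQLSSTableWriter from an older cassandra-all jar on the classpath would reject them:

```java
import java.io.File;
import java.net.InetAddress;
import org.apache.cassandra.io.sstable.CQLSSTableWriter;

public class TypeCheckWriter {
    public static void main(String[] args) throws Exception {
        // Hypothetical table exercising the three types in question.
        String schema = "CREATE TABLE ks.flows ("
                + "id int PRIMARY KEY, addr inet, port smallint, proto tinyint)";
        String insert = "INSERT INTO ks.flows (id, addr, port, proto) VALUES (?, ?, ?, ?)";

        CQLSSTableWriter writer = CQLSSTableWriter.builder()
                .inDirectory(new File("/tmp/ks/flows"))
                .forTable(schema)
                .using(insert)
                .build();

        // inet maps to java.net.InetAddress, smallint to Short, tinyint to Byte;
        // passing an Integer where a Short/Byte is expected fails.
        writer.addRow(1, InetAddress.getByName("192.0.2.1"), (short) 443, (byte) 6);
        writer.close();
    }
}
```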

I received a grant to do some analysis on netflow data (local IP address, local
port, remote IP address, remote port, time, number of packets, etc.) using
Cassandra and Spark. The de-normalized data set is about 13 TB out the door. I
plan on using nine Cassandra nodes (replication factor = 3) to store the
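Purely as a sketch, a de-normalized table for these fields might look like the following; the partition and clustering key choices here are my assumptions, not a settled design:

```sql
-- Hypothetical layout: partition by local address and day so partitions stay
-- bounded; cluster by time so Spark range scans read flows in time order.
CREATE TABLE netflow.flows_by_local_ip (
    local_ip    inet,
    day         date,
    flow_time   timestamp,
    local_port  smallint,
    remote_ip   inet,
    remote_port smallint,
    packets     bigint,
    PRIMARY KEY ((local_ip, day), flow_time, local_port, remote_ip, remote_port)
) WITH CLUSTERING ORDER BY (flow_time ASC);
```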