I'm using level compaction and I have about 200GB compressed in my
largest CFs. The disks are getting full. This is time-series data so I
want to drop data that is a couple of months old. It's pretty easy for
me to iterate through the relevant keys and delete the rows. But will
that do anything?
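
For concreteness, a minimal sketch of the kind of "iterate the keys and
delete the rows" loop described above. The "metrics" table, its
(source_id, day) key, and the hard-coded key list are assumptions for
illustration only, and the DataStax Python driver shown here postdates
this 2012 thread, whose setup would have gone through Thrift or an
early CQL client:

from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])            # hypothetical contact point
session = cluster.connect("my_keyspace")    # hypothetical keyspace

# Stand-in for however the application enumerates partition keys that
# are more than a couple of months old.
old_keys = [("host-001", "2011-11-01"), ("host-001", "2011-11-02")]

for source_id, day in old_keys:
    # Each DELETE only writes a tombstone; the space on disk comes back
    # only after gc_grace_seconds has passed and compaction rewrites the
    # SSTables that still contain the old rows.
    session.execute(
        "DELETE FROM metrics WHERE source_id = %s AND day = %s",
        (source_id, day),
    )
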
On 1/28/2012 9:34 AM, Peter Schuller wrote:
>> I'm using level compaction and I have about 200GB compressed in my
>> largest CFs. The disks are getting full. This is time-series data so I
>> want to drop data that is a couple of months old. It's pretty easy for
>> me to iterate through the relevant keys
I'm at 80%, so not quite panic yet ;-)
I'm wondering, in the steady state, how much of the space used will
contain deleted data.
That depends entirely on your workload, including (rough numbers are
sketched after this list):
* How big the data that you are deleting is in relation to the size of
tombstones
* How long the average piece
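
One rough, assumption-laden way to put numbers on those factors: if each
piece of data lives about L days before it is deleted, and the deleted
data then lingers roughly G more days (gc_grace_seconds plus however long
it takes compaction to rewrite the SSTables that still hold it) before
the space is reclaimed, then at a steady write rate the dead fraction of
the on-disk data is about G / (L + G). The figures below are made up for
illustration:

live_days = 60        # L: data is kept ~2 months before being deleted
gc_grace_days = 10    # the default gc_grace_seconds (864000 s) in days
compaction_lag = 7    # guess at how long until compaction drops the data

dead_days = gc_grace_days + compaction_lag            # G = 17
dead_fraction = dead_days / (live_days + dead_days)   # 17/77 ~= 0.22
print(dead_fraction)  # i.e. roughly a fifth of the space is dead data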