Hi
 
I have a question regarding the use of Storm with a cache. Hopefully someone 
can help me.
 
I have a Kafka spout that reads JSON messages from a Kafka topic. Each message 
contains an ID, a timestamp, and either "start" or "end".
The next step is a bolt that stores the "start" messages in Redis, and another 
bolt that removes the corresponding entries from Redis when an "end" message 
arrives. Redis is meant to act as a cache, so that I always have an overview of 
which programs are currently running.
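
To make this more concrete, here is a very simplified sketch of the bolt that 
handles the "start" messages. I use Jedis and a single Redis hash called 
"running-programs" here purely for illustration; the field names are 
placeholders for what my JSON parsing actually produces:

import java.util.Map;

import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.tuple.Tuple;

import redis.clients.jedis.Jedis;

// Simplified version of the bolt that stores "start" messages in Redis.
public class StartBolt extends BaseRichBolt {
    private OutputCollector collector;
    private Jedis jedis;

    @Override
    public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
        this.jedis = new Jedis("localhost", 6379); // connection details are placeholders
    }

    @Override
    public void execute(Tuple tuple) {
        // "id" and "timestamp" stand for the fields extracted from the JSON message
        String id = tuple.getStringByField("id");
        String timestamp = tuple.getStringByField("timestamp");
        // all currently running programs end up in one Redis hash
        jedis.hset("running-programs", id, timestamp);
        collector.ack(tuple);
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // nothing is emitted downstream in this simplified version
    }
}

The "end" bolt looks the same, except that it calls jedis.hdel("running-programs", id).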
 
My problem is the following:
I would like to pull all the stored objects from Redis every minute, or 
whenever something has changed. I am using Redis as a cache to keep an overview 
of all my started (and not yet ended) programs. I then have to join these 
objects with further data and do an aggregation. It is very important that this 
is repeated every minute, or as soon as a value in Redis changes.
How can I do that within Storm?
Do I need an additional spout that pulls the data from Redis? If so, how can I 
ensure that the pull is repeated on every change as well as every minute?
Or do I need a windowed bolt or a tick tuple for this (see my sketch below)?
Is this even possible with Redis, or would you recommend other tools?
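
To clarify what I mean by the tick tuple idea, this is roughly what I imagine 
such a bolt would look like. It is untested, and refreshFromRedis() is only a 
placeholder for my join/aggregation logic:

import java.util.Map;

import org.apache.storm.Config;
import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.utils.TupleUtils;

// Sketch of a bolt that receives a tick tuple every 60 seconds and then reads from Redis.
public class AggregationBolt extends BaseRichBolt {
    private OutputCollector collector;

    @Override
    public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
    }

    @Override
    public Map<String, Object> getComponentConfiguration() {
        // ask Storm to send this bolt a tick tuple once per minute
        Config conf = new Config();
        conf.put(Config.TOPOLOGY_TICK_TUPLE_FREQ_SECS, 60);
        return conf;
    }

    @Override
    public void execute(Tuple tuple) {
        if (TupleUtils.isTick(tuple)) {
            // once a minute: read everything from Redis, join with the other data, aggregate
            refreshFromRedis();
        } else {
            // normal tuples forwarded from the start/end bolts could also call
            // refreshFromRedis() here, to cover the "as soon as something changes" case
        }
        collector.ack(tuple);
    }

    private void refreshFromRedis() {
        // placeholder for reading the hash from Redis and doing the join/aggregation
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // aggregated results would be declared and emitted here
    }
}

Is something like this the right approach, or am I on the wrong track?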
 
Thank you so much in advance!
 
Regards,
Daniela
