I'm not a developer of Storm, but I asked this question on [email protected], and no one seemed to know the answer. (Or if they did, they did not see fit to respond.) So I'm going to try here instead. Hopefully you will forgive the intrusion.
I want to build a Storm topology that handles DRPC requests, and I want the requests to be handled concurrently. That is, if a DRPC request that takes a long time to complete (say 100 seconds) arrives at time T0, I don't want it to block a request that arrives at T0 + 1 sec and takes only 1 second to complete; I'd like to get the result of the second request at around T0 + 2 sec or T0 + 3 sec, not at T0 + 101 sec.

In my experiments so far, I have not been able to get this to work. In fact, if I send a request to a Storm DRPC topology while it is still working on a previous request, the second request is not even buffered; I just receive an exception when I send it.

Is there a way to do what I want, or is Storm DRPC inherently sequential?

Thanks for your help,
|>oug
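For concreteness, here is a plain-Java sketch of the timing behavior I'm after. It does not use Storm at all; `handle` is just a hypothetical stand-in for a DRPC handler, and the thread pool plays the role of whatever concurrency the topology would provide:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ConcurrentRequestSketch {
    // Hypothetical stand-in for a DRPC handler: sleeps for the given
    // duration to simulate work, then echoes its argument as the "result".
    static String handle(String arg, long workMillis) throws InterruptedException {
        Thread.sleep(workMillis);
        return arg;
    }

    // Returns true if the short request completes while the long request
    // is still running -- the behavior I'm hoping Storm DRPC can provide.
    static boolean shortRequestFinishesFirst() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        try {
            // Long request submitted first, short request one step later.
            Future<String> slow = pool.submit(() -> handle("slow", 500));
            Future<String> fast = pool.submit(() -> handle("fast", 50));
            long start = System.nanoTime();
            fast.get();                 // short request returns quickly...
            long fastWaitMs = (System.nanoTime() - start) / 1_000_000;
            slow.get();                 // ...while the long one is still in flight
            return fastWaitMs < 500;    // short result arrived before the slow one could finish
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("short request finished first: " + shortRequestFinishesFirst());
    }
}
```

With Storm DRPC, my observed behavior is the opposite of this sketch: the second request fails outright instead of completing independently.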
