Hi there Amit.

Have you looked at the ResultSender.sendResult() method on the function? You can call sendResult() as often as you like to send chunks of 1000 results; you just have to make sure you "close" the ResultSender by calling lastResult() for the final chunk.
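
A rough sketch of what that could look like (the class name and chunk size are mine, not from any existing example, and it assumes the function is executed onRegion(...) against a partitioned region):

import java.util.ArrayList;
import java.util.List;

import org.apache.geode.cache.Region;
import org.apache.geode.cache.execute.Function;
import org.apache.geode.cache.execute.FunctionContext;
import org.apache.geode.cache.execute.RegionFunctionContext;
import org.apache.geode.cache.execute.ResultSender;
import org.apache.geode.cache.partition.PartitionRegionHelper;

// Hypothetical function that streams the locally hosted values of a partitioned
// region back to the caller in chunks of 1000 instead of one big result.
public class ChunkedResultFunction implements Function {

  private static final int CHUNK_SIZE = 1000;

  @Override
  public void execute(FunctionContext context) {
    // Assumes execution via onRegion(...) on a partitioned region.
    RegionFunctionContext rfc = (RegionFunctionContext) context;
    Region<Object, Object> localData = PartitionRegionHelper.getLocalDataForContext(rfc);

    ResultSender<List<Object>> sender = context.getResultSender();
    List<Object> chunk = new ArrayList<>(CHUNK_SIZE);

    for (Object value : localData.values()) {
      chunk.add(value);
      if (chunk.size() == CHUNK_SIZE) {
        // Ship this chunk immediately; the caller's ResultCollector sees it right away.
        sender.sendResult(new ArrayList<>(chunk));
        chunk.clear();
      }
    }
    // lastResult() "closes" the sender and carries whatever is left in the final chunk.
    sender.lastResult(chunk);
  }

  @Override
  public String getId() {
    return ChunkedResultFunction.class.getSimpleName();
  }

  @Override
  public boolean hasResult() {
    return true;
  }

  @Override
  public boolean optimizeForWrite() {
    return false;
  }

  @Override
  public boolean isHA() {
    // With HA enabled Geode may re-execute the function after a failure,
    // so the caller must tolerate duplicate chunks (see the caveat below).
    return true;
  }
}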

As for the streaming result collector... Geode does not have a streaming interface, but you can implement a custom ResultCollector and put your chunk processing in addResult(). That way you can process data as soon as the collector receives it.
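
A rough sketch of such a collector (the StreamingResultCollector name and the chunk-callback design are just an illustration, not an existing Geode or Spring Data Geode class):

import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

import org.apache.geode.cache.execute.FunctionException;
import org.apache.geode.cache.execute.ResultCollector;
import org.apache.geode.distributed.DistributedMember;

// Hypothetical "streaming" collector: every chunk is handed to a callback as it
// arrives instead of being accumulated into one big collection.
public class StreamingResultCollector implements ResultCollector<List<Object>, Void> {

  private final Consumer<List<Object>> chunkProcessor;
  private final CountDownLatch done = new CountDownLatch(1);

  public StreamingResultCollector(Consumer<List<Object>> chunkProcessor) {
    this.chunkProcessor = chunkProcessor;
  }

  @Override
  public void addResult(DistributedMember member, List<Object> chunk) {
    // Called once per sendResult()/lastResult() from each member; process immediately.
    if (chunk != null && !chunk.isEmpty()) {
      chunkProcessor.accept(chunk);
    }
  }

  @Override
  public void endResults() {
    // Called after the last member has sent its lastResult().
    done.countDown();
  }

  @Override
  public Void getResult() throws FunctionException {
    try {
      done.await();
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
      throw new FunctionException(e);
    }
    return null;
  }

  @Override
  public Void getResult(long timeout, TimeUnit unit) throws FunctionException, InterruptedException {
    done.await(timeout, unit);
    return null;
  }

  @Override
  public void clearResults() {
    // Invoked before a re-execution (e.g. an HA retry); nothing is buffered here, so nothing to clear.
  }
}

You would then wire it in with something like FunctionService.onRegion(region).withCollector(new StreamingResultCollector(chunk -> process(chunk))).execute("ChunkedResultFunction") and call getResult() on the returned collector just to block until endResults() fires.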

The one caveat here is that you have to deal with failures and possible duplicates when the function is marked as HA, because Geode may retry/restart the function when it detects a failure.

--Udo


On 8/14/17 00:14, Amit Pandey wrote:
Also, in Spring Data Geode, is it possible to send data as soon as I have a chunk of, say, 1000? I know I can specify a batch size, but I don't see how I can do it in a streaming fashion.

On Sun, Aug 13, 2017 at 3:08 PM, Amit Pandey <[email protected]> wrote:

    Hi All,

    I have a function which can potentially return very large data sets.

    I want to stream data via the functions. The default result
    collector of Geode collects all the data in one large chunk, which
    might result in very slow operation times. How can I use a
    streaming result collector? Is there an example of this anywhere?

    I am using spring-data-geode, so if there is something available
    there, that would be great too.

    Regards


