I feel silly; I found the memory_limit setting, and that fixed my issue.
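For anyone who hits the same wall: raising memory_limit in the decoder's TOML section is what did it. A sketch of what that looks like (the filename and the 8 MB value here are illustrative, not my exact config):

    [ESClusterStatusDecoder]
    type = "SandboxDecoder"
    filename = "lua_decoders/es_cluster_status.lua"
    # Bytes of Lua memory the sandbox may use; the default was too small
    # to hold the ~2 MB payload plus cjson's decoded table.
    memory_limit = 8000000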

---------- Forwarded message ----------
From: M GS <[email protected]>
Date: Wed, Apr 8, 2015 at 3:07 PM
Subject: Sandbox Decoder, not enough memory
To: [email protected]


I am trying to generate stats from an ~2 MB JSON response from
Elasticsearch. Our goal is to be able to graph and alert on shard state
changes.

I have an HTTPInput that queries /_cluster/state, but when I use my
decoder to extract the state of each shard, the plugin hits a fatal
error:

Decoder 'ESClusterStatusInput-ESClusterStatusDecoder' error: FATAL:
process_message() not enough memory

Below is my code:

    local resp = read_message("Payload")
    local data = cjson.decode(resp)  -- this appears to be where the error is thrown

    for _, index in pairs(data.routing_table.indices) do
        for _, shards in pairs(index.shards) do
            for _, shard in pairs(shards) do
                -- default to 0 the first time a state is seen
                msg.Fields[shard.state] = (msg.Fields[shard.state] or 0) + 1
            end
        end
    end
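Wrapped into a complete decoder, that logic would look something like the
sketch below; it assumes Heka's standard sandbox API (read_message,
inject_message) with cjson enabled, and the message Type is just a name I
made up:

    require "cjson"

    function process_message()
        -- pcall guards against malformed JSON aborting the sandbox
        local ok, data = pcall(cjson.decode, read_message("Payload"))
        if not ok then return -1 end

        -- tally shards by state (e.g. STARTED, INITIALIZING, UNASSIGNED)
        local counts = {}
        for _, index in pairs(data.routing_table.indices) do
            for _, shards in pairs(index.shards) do
                for _, shard in pairs(shards) do
                    counts[shard.state] = (counts[shard.state] or 0) + 1
                end
            end
        end

        inject_message({Type = "es.cluster.shard_states", Fields = counts})
        return 0
    end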

I tried the same data outside of Heka and it works, so I am assuming
it's a limit imposed on SandboxDecoders. Since what I'm doing is
probably pretty inefficient, is it even a good idea to do this kind of
thing in a sandbox at all? Should I be doing this in Go instead?


Thanks!
_______________________________________________
Heka mailing list
[email protected]
https://mail.mozilla.org/listinfo/heka
