Perfect. Thanks, Taylor. That explains the basics.

So now I'm taking the string and parsing it as JSON. What's the best
practice for doing that directly in a Scheme?
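For what it's worth, one common approach is to do the JSON parsing inside the Scheme's deserialize() itself, so bolts receive already-split, typed values instead of a raw string. Below is a minimal, self-contained sketch of that idea: the nested Scheme interface is a hypothetical stand-in for Storm's real backtype.storm.spout.Scheme (so the snippet compiles on its own), and the regex-based field extraction is a toy stand-in for a proper JSON library such as Jackson.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class JsonScheme {

    // Hypothetical stand-in for backtype.storm.spout.Scheme,
    // just so this sketch is compilable in isolation.
    interface Scheme {
        List<Object> deserialize(byte[] bytes);
        List<String> getOutputFields();
    }

    // Parses the raw Kafka payload as JSON and emits one tuple value per field.
    static class JsonPairScheme implements Scheme {
        // Toy extractor for string-valued "val1"/"val2" fields.
        // In a real topology, use a real JSON parser (e.g. Jackson's ObjectMapper).
        private static final Pattern FIELD =
            Pattern.compile("\"(val1|val2)\"\\s*:\\s*\"([^\"]*)\"");

        @Override
        public List<Object> deserialize(byte[] bytes) {
            String json = new String(bytes, StandardCharsets.UTF_8);
            String v1 = null, v2 = null;
            Matcher m = FIELD.matcher(json);
            while (m.find()) {
                if ("val1".equals(m.group(1))) v1 = m.group(2);
                else v2 = m.group(2);
            }
            return Arrays.asList((Object) v1, v2);
        }

        @Override
        public List<String> getOutputFields() {
            return Arrays.asList("val1", "val2");
        }
    }

    public static void main(String[] args) {
        byte[] payload = "{\"val1\":\"a\",\"val2\":\"b\"}"
                .getBytes(StandardCharsets.UTF_8);
        List<Object> tuple = new JsonPairScheme().deserialize(payload);
        System.out.println(tuple); // [a, b]
    }
}
```

With a scheme like this wired into the KafkaSpout config, the downstream bolt's input.getString(0) and input.getString(1) would both be populated from a single JSON message.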

Cheers,

On Tue, May 26, 2015 at 6:12 PM, P. Taylor Goetz <[email protected]> wrote:

> The data coming from Kafka to the Kafka spout is just a byte array
> containing the raw data. To consume it, you need to define a `Scheme`
> implementation that knows how to parse the byte array to produce tuples.
>
> For example, the `StringScheme` class included in storm-kafka just
> converts the byte array to a string and puts that value in the tuple with
> the key “str”:
>
>
> https://github.com/apache/storm/blob/master/external/storm-kafka/src/jvm/storm/kafka/StringScheme.java
>
> -Taylor
>
> On May 22, 2015, at 11:51 AM, Sergio Fernández <[email protected]> wrote:
>
> Hi,
>
> I'm experimenting with feeding the KafkaSpout from a language other than
> Java, but I guess I have a conceptual error...
>
> From Python I'm sending two values:
>
> producer.send_messages("test", "val1", "val2")
>
> But when from a Java bolt I try to handle it:
>
> public void execute(Tuple input) {
>   String val1 = input.getString(0);
>   String val2 = input.getString(1);
>   ...
> }
>
> I'm getting an IndexOutOfBoundsException: Index: 1, Size: 1.
>
> I'd appreciate any advice on how to correctly send tuples.
>
> Thanks!
>
>
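For reference, the StringScheme Taylor links to essentially does the following (a minimal stdlib-only reconstruction; the real class lives in storm-kafka). Each Kafka message becomes exactly one tuple with a single "str" field, so if the producer sends "val1" and "val2" as two separate Kafka messages (which, as I understand it, is what kafka-python's send_messages does with multiple arguments), each Storm tuple carries only one value and input.getString(1) throws the IndexOutOfBoundsException above.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.List;

public class StringSchemeSketch {

    // Mirrors what storm-kafka's StringScheme does: the whole Kafka message
    // becomes one String value under the single output field "str".
    static List<Object> deserialize(byte[] bytes) {
        return Arrays.asList((Object) new String(bytes, StandardCharsets.UTF_8));
    }

    static List<String> getOutputFields() {
        return Arrays.asList("str");
    }

    public static void main(String[] args) {
        List<Object> tuple = deserialize("val1".getBytes(StandardCharsets.UTF_8));
        // One field per tuple: index 0 exists, index 1 does not.
        System.out.println(tuple.size()); // 1
        System.out.println(tuple.get(0)); // val1
    }
}
```

So the fix is to pack both values into one message (e.g. as JSON or with a delimiter) and split them in a custom Scheme, rather than sending them as two messages.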


-- 
Sergio Fernández
Partner Technology Manager
Redlink GmbH
m: +43 6602747925
e: [email protected]
w: http://redlink.co
