Thanks for filing that. Piping tail into the producer shell will also
work, although the producer shell only works on a per-broker basis. It
would be good to have a high-level producer shell.
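
As a rough sketch, something along these lines should do it. The file path,
broker host, topic name, and the --server/--topic flags below are just
placeholders of mine; check bin/kafka-producer-shell.sh --help for the options
your build actually accepts:

  # placeholder host/topic; adjust the flags to whatever your
  # kafka-producer-shell.sh accepts
  tail -F /var/log/myapp/access.log | \
    bin/kafka-producer-shell.sh --server kafka://broker1:9092 --topic access_logs

tail -F (rather than -f) re-opens the file after rotation, which is usually
what you want when tailing log files.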

Joel

On Thu, Sep 1, 2011 at 1:22 PM, Felix Giguere Villegas
<felix.gigu...@mate1inc.com> wrote:
> Hi :)
>
> Thanks for the reply :). I have created
> KAFKA-130 (https://issues.apache.org/jira/browse/KAFKA-130).
>
> I'm not sure I have followed the proper conventions for JIRA issues. Please
> let me know if I haven't.
>
> --
> Felix
>
>
>
> On Thu, Sep 1, 2011 at 4:04 PM, Jun Rao <jun...@gmail.com> wrote:
>
>> Hi, Felix,
>>
>> Currently, we don't have a utility to pipe data from a file to a producer. I
>> agree that it would be a very convenient tool to have. Could you open a JIRA
>> for that?
>>
>> Thanks,
>>
>> Jun
>>
>> On Thu, Sep 1, 2011 at 12:20 PM, Felix Giguere Villegas <
>> felix.gigu...@mate1inc.com> wrote:
>>
>> > Hi,
>> >
>> > We are currently evaluating Kafka, so I'm trying to get one of our
>> > simple use cases working. I have read the design paper and most of
>> > everything else I could find about Kafka, but I'm not sure how to set
>> > up a producer that tails files or that reads from STDIN.
>> >
>> > Do I have to write custom code just for that? It seems like a pretty
>> > simple/common use case, so I would think it already exists out of the
>> > box.
>> >
>> > I guess I could pipe a tail command into bin/kafka-producer-shell.sh ??
>> >
>> > This kind of stuff is relatively easy to set up with Flume, but we
>> > like the persistence and replay capabilities of Kafka, which is why
>> > we are trying it out.
>> >
>> > Thanks in advance for helping out a newb ;) !
>> >
>> > --
>> > Felix
>> >
>>
>
