I'm using kafkacat https://github.com/edenhill/kafkacat to send files (and streaming TCP packets) from a separate server to the Kafka server, and then Storm picks it up. The data cleaning is done on the command line (grep/cut/tr). Lines that fail cleaning get logged to a text file.
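A minimal sketch of that kind of pipeline. Everything here is illustrative: the input layout (timestamped, comma-separated lines), the filter pattern, the file names, and the broker/topic values are assumptions, not what I actually run. Only the kafkacat flags (`-P` produce, `-b` broker, `-t` topic) are from kafkacat itself.

```shell
#!/bin/sh
# Hypothetical names -- adjust for your own data and cluster.
INPUT=./sample.log          # file fetched from the FTP server
BROKER=localhost:9092       # Kafka broker (assumed address)
TOPIC=raw-events            # topic that Storm subscribes to (assumed name)
BAD=./failed_cleaning.log   # rejected lines end up here

# Fake input for the sketch: two valid lines, one bad one.
printf '%s\n' \
  '2015-03-21 ok,VALUE1' \
  'garbage line' \
  '2015-03-21 ok,VALUE2' > "$INPUT"

# Split valid from invalid: lines must start with a date stamp.
grep -E  '^[0-9]{4}-[0-9]{2}-[0-9]{2}' "$INPUT" >  clean.tmp
grep -vE '^[0-9]{4}-[0-9]{2}-[0-9]{2}' "$INPUT" >> "$BAD"

# Normalise the survivors: keep field 2, uppercase it.
cut -d',' -f2 clean.tmp | tr '[:lower:]' '[:upper:]' > cleaned.txt

# In the real pipeline the cleaned stream is piped straight into
# kafkacat running as a producer, one message per line:
#   ... | kafkacat -P -b "$BROKER" -t "$TOPIC"
cat cleaned.txt
```

With the sample input above, `cleaned.txt` holds the two normalised values and `failed_cleaning.log` holds the rejected line, so nothing is silently dropped.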
I tried to see if I could install Kafka on the FTP server and then let the main Kafka subscribe to the FTP one. I didn't follow through because there were too many moving parts & kafkacat did the job.

On Sat, Mar 21, 2015 at 2:47 PM, <[email protected]> wrote:
> Hello,
> is there a tool which allows you to feed data contained in files on an ftp
> server, to Storm (and handles bad connections)?
> Cheers,
> Philippe

--
Mithun Kalan
[email protected]
