I'm interested in the same topic (similar use case).
What I think would be nice too (and this has been discussed a bit in the
past on this list) would be to have SSL support within the Kafka protocol.
ZooKeeper also doesn't support SSL, but at least now, in 0.8, producing
clients no longer really need to talk to ZooKeeper directly.
Wouldn't it make more sense to do something like an encrypted tunnel
between your core routers in each facility? Like IPsec over a GRE tunnel, or
something.
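The tunnel idea is roughly this (a minimal sketch; interface names and addresses are invented, and GRE itself carries traffic in the clear, so you'd still layer IPsec on top for the actual encryption):

```shell
# Build a plain GRE tunnel between two routers (addresses are examples).
# Run on the facility-A router; swap local/remote on the facility-B side.
ip tunnel add gre1 mode gre local 203.0.113.1 remote 198.51.100.2 ttl 255
ip addr add 10.99.0.1/30 dev gre1
ip link set gre1 up

# Route the other facility's internal range over the tunnel.
ip route add 10.20.0.0/16 dev gre1

# GRE is unencrypted -- protect it with IPsec in transport mode between
# 203.0.113.1 and 198.51.100.2 (e.g. via strongSwan).
```

With that in place, every host behind each router can reach every host behind the other, without any per-application TLS configuration.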
This concept would need adjustment for those in the cloud but when you want
to build an encrypted tunnel between a bunch of hosts and a bunch of
Unfortunately, 'stunneling everything' is not really possible. Stunnel acts like
a proxy service, in the sense that the Stunnel client (on your log producer
or log consumer) has to be explicitly configured to connect to an exact
endpoint (i.e., kafka1.mydomain.com:1234) -- or multiple endpoints
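For reference, a client-side stunnel config ends up looking something like this (hostnames and ports are invented for illustration) -- one stanza per broker, which is exactly what breaks down when brokers come and go:

```ini
; client-side stunnel sketch; every broker needs its own static stanza
client = yes

[kafka-broker-1]
accept  = 127.0.0.1:19092
connect = kafka1.mydomain.com:1234

[kafka-broker-2]
accept  = 127.0.0.1:19093
connect = kafka2.mydomain.com:1234
```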
I think you are right: even if you did put an ELB in front of Kafka, it
would only be used for getting the initial broker list, AFAIK. Producers and
consumers need to be able to talk to each broker directly, and consumers
also need to be able to talk to ZooKeeper to store offsets.
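To make that concrete, here is a toy sketch (not real Kafka client code; all the names and addresses are made up) of why a load balancer only helps with the first step: the metadata response hands back specific broker addresses, and those are what clients actually connect to afterwards:

```python
# Toy model of Kafka 0.8 client bootstrapping -- illustrative only.
# An ELB could stand in for BOOTSTRAP, but the addresses in the metadata
# response are what producers/consumers actually dial.

BOOTSTRAP = "elb.mydomain.example:9092"  # hypothetical balanced endpoint

# What a metadata request returns: partition leaders live at specific brokers.
CLUSTER_METADATA = {
    ("logs", 0): "kafka1.internal:9092",
    ("logs", 1): "kafka2.internal:9092",
}

def fetch_metadata(bootstrap_endpoint):
    """Step 1: any reachable endpoint (even a load balancer) can answer."""
    return CLUSTER_METADATA

def connections_needed(metadata):
    """Step 2: the client dials every leader directly, bypassing the ELB."""
    return sorted(set(metadata.values()))

metadata = fetch_metadata(BOOTSTRAP)
print(connections_needed(metadata))
# -> ['kafka1.internal:9092', 'kafka2.internal:9092']
# The ELB address never appears here -- it only helped with step 1.
```

And none of this covers the consumer-to-ZooKeeper traffic for offsets, which the ELB can't front either.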
Probably
Hi there... we're currently looking into using Kafka as a pipeline for passing
around log messages. We like its use of ZooKeeper for coordination (as we
already make heavy use of ZooKeeper at Nextdoor), but I'm running into one big
problem. Everything we do is a) in the cloud, b) secure, and c)