You do not want to expose the Kafka cluster directly to your different
clients; put an API endpoint in between, REST/gRPC or whatever fits.
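The suggestion above (an API endpoint fronting Kafka, with clients sending emails to it) can be sketched roughly like this. The topic name, payload fields, and helper names are invented for the example, not taken from the thread:

```python
import json

# Placeholder topic name -- an assumption, not from the thread.
TOPIC = "emails"

def serialize_email(client_id, recipient, subject, body):
    """Build the (key, value) pair one webapp client's request becomes.
    Keying by client_id keeps each client's emails on one partition,
    so they stay in order relative to each other."""
    payload = {"client": client_id, "to": recipient,
               "subject": subject, "body": body}
    return client_id.encode("utf-8"), json.dumps(payload).encode("utf-8")

def handle_post(producer, client_id, payload):
    """What the REST handler body would do: validate, then forward to Kafka.
    In production `producer` would be a kafka-python KafkaProducer;
    anything with a compatible .send() works for testing."""
    for field in ("to", "subject", "body"):
        if field not in payload:
            raise ValueError("missing field: " + field)
    key, value = serialize_email(client_id, payload["to"],
                                 payload["subject"], payload["body"])
    producer.send(TOPIC, key=key, value=value)
```

With something like this in place, the webapp clients never see the brokers; only the API service holds Kafka connection details, which is the decoupling being suggested.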

2018-03-10 19:01 GMT+01:00 Nick Vasilyev <nick.vasily...@gmail.com>:

> Hard to say without more info, but why not just deploy something like a
> REST API and expose it to your clients? They will send the data to the API,
> and it will in turn feed the Kafka topic.
>
> You will minimize coupling and be able to scale and upgrade more easily.
>
> On Mar 10, 2018 2:47 AM, "adrien ruffie" <adriennolar...@hotmail.fr>
> wrote:
>
> > Hello all,
> >
> >
> > in my company we plan to set up the following architecture for our
> > clients:
> >
> >
> > An internal Kafka cluster in our company, and a webapp (our software
> > solution) deployed on premises for our clients.
> >
> >
> > We are thinking of creating one producer per "webapp" client, in order
> > to push into a global topic (in our Kafka) a message which represents an
> > email.
> >
> >
> > The idea behind this is to offload mass-mailing operations from the
> > client webapp and process them ourselves on dedicated servers in our
> > infrastructure. Each dedicated server will be a consumer of the topic
> > into which the messages (emails) are streamed.
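The dedicated-server side described above could look roughly like the sketch below, assuming kafka-python and one shared consumer group so Kafka divides the topic's partitions across the servers. The topic, broker, and group names are invented for the example:

```python
import json

def decode_record(raw_value):
    """Turn the raw bytes of one Kafka message back into the email dict."""
    return json.loads(raw_value.decode("utf-8"))

def consume_and_send():
    """Loop each dedicated server would run. Requires kafka-python and a
    reachable broker, so the import stays local to this function."""
    from kafka import KafkaConsumer
    consumer = KafkaConsumer(
        "emails",                                 # assumed topic name
        bootstrap_servers="kafka.internal:9092",  # assumed broker address
        group_id="mail-senders",  # same group on every server, so Kafka
                                  # splits partitions among them automatically
        value_deserializer=decode_record,
    )
    for record in consumer:
        email = record.value
        # hand off to the real mail-sending code here, e.g.:
        # smtp_send(email["to"], email["subject"], email["body"])
```

Because all the servers share one group_id, adding another dedicated server just triggers a rebalance and spreads the load further, up to the number of partitions on the topic.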
> >
> >
> > My main question is: do you think that each client can be a producer?
> > (If we have, for example, 200-300 clients?)
> >
> > Second question: should each client be a producer? 😊
> >
> > Do you have another idea on this subject?
> >
> >
> > Thank you & best regards.
> >
> >
> > Adrien
> >
>
