Hi all,
I have a requirement to be able to capture and store events for querying,
and I'm trying to choose the best option for that:
1) Capture the events from a separate topic and store them as state, in
order to convert the stream to a table; that means materializing the
stream.
The option for it,
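Option 1 above boils down to upsert semantics: the latest value per key wins, and a null value acts as a delete (a tombstone), which is what Kafka Streams does when it materializes a KTable. A minimal plain-Java sketch of those semantics, with no Kafka APIs involved (class and key names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of stream-to-table materialization semantics: each keyed event
// upserts into a store; a null value is treated as a delete (tombstone).
// This mirrors what a materialized KTable does, without any Kafka APIs.
public class TableMaterializer {
    private final Map<String, String> store = new HashMap<>();

    // Apply one event from the stream to the table.
    public void apply(String key, String value) {
        if (value == null) {
            store.remove(key);     // tombstone: delete the row
        } else {
            store.put(key, value); // upsert: latest value per key wins
        }
    }

    // Point lookup, as an Interactive Query against the store would do.
    public String get(String key) {
        return store.get(key);
    }
}
```

In real Kafka Streams code this would be `builder.table(...)` with a `Materialized` state store, queried via Interactive Queries; the class above only illustrates the semantics.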
Hi everyone,
I'm trying to send events from the topic
This is the sink json configuration I’ve used:
{
"name":"CPSConnector",
"config":{
"connector.class":"com.google.pubsub.kafka.sink.CloudPubSubSinkConnector",
"tasks.max":"1",
"topics":"STREAM-CUSTOMER-ACCOUNTS",
Hi everyone,
I'm working on the configuration of the topics for the integration between
one API and a data platform system. We have created a topic for each entity
that they would need to integrate into the data warehouse.
My question, and I hope you can help me, is: each entity will have different
Very interesting and useful.
Thanks
On 18 January 2018 at 21:58, Damian Guy <damian@gmail.com> wrote:
> This might be a good read for you:
> https://www.confluent.io/blog/put-several-event-types-kafka-topic/
>
> On Thu, 18 Jan 2018 at 20:57 Maria Pilar <pilife...@gmail
a datastore and search through the records,
>
> Interactive Queries API makes this very nice.
>
> On Thu, Jan 25, 2018 at 8:47 AM, Maria Pilar <pilife...@gmail.com> wrote:
>
> > Hi everyone,
> >
> > I´m trying to understand the best practice to define the part
Hi everyone,
I'm trying to understand the best practice for defining the partition key. I
have defined some topics that are related to entities in the Cassandra
data model; the relationship is one-to-one (one entity, one topic), because
I need to ensure proper ordering of the events. I have
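Kafka only guarantees ordering within a partition, and a keyed message goes to a partition derived from its key, so all events for one entity stay in order as long as they share the same key. A minimal sketch of that key-to-partition mapping (Kafka's DefaultPartitioner actually uses murmur2 on the serialized key bytes; the hash below is only illustrative):

```java
// Illustrative key-to-partition mapping: a deterministic hash of the key
// modulo the partition count. Same key => same partition => per-key order.
// (Kafka's DefaultPartitioner uses murmur2 on the key bytes, not hashCode.)
public class KeyPartitioner {
    public static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the result is non-negative.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }
}
```

Because every event for, say, key `customer-42` lands on the same partition, a consumer sees that entity's events in the order they were produced.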
Hi everyone
I have designed an integration between 2 systems through our Kafka Streams
API, and the requirements are unclear for choosing the proper number of
partitions/topics.
That is the use case:
My producer will send 28 different types of events, so I have decided to
create 28 topics.
The max
Hi everyone,
I have designed a Kafka API solution which defines some topics; these
topics are produced by another API connected to a Cassandra data model.
So I need to fix the logical and sequential order of the events; that
means I need to guarantee that create, update and delete events will
Hi everyone
I'm designing error handling for my Kafka Streams API.
I would like to know of any documentation or best practices that I can read.
Basically I'm creating some error topics for failed messages and retry
topics.
Any suggestions?
Thanks
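One common pattern for the error/retry topics described above: on failure, route the record to a retry topic while it still has attempts left, and to a dead-letter topic once retries are exhausted. A minimal sketch of that routing decision (topic names are hypothetical; in a real pipeline the chosen topic would be used in a KafkaProducer send):

```java
// Sketch of a retry/dead-letter routing decision for failed records.
public class FailureRouter {
    private final String retryTopic;
    private final String deadLetterTopic;
    private final int maxRetries;

    public FailureRouter(String retryTopic, String deadLetterTopic, int maxRetries) {
        this.retryTopic = retryTopic;
        this.deadLetterTopic = deadLetterTopic;
        this.maxRetries = maxRetries;
    }

    // attempt is how many times the record has already failed.
    public String topicFor(int attempt) {
        return attempt < maxRetries ? retryTopic : deadLetterTopic;
    }
}
```

A consumer of the retry topic would re-process with a delay and an incremented attempt count, typically carried in a record header.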
Hi
I'm trying to configure a multinode cluster in Kafka. I have configured
each server.properties according to the new properties for each server.
When I start each server, the ZooKeeper console shows this error:
INFO Got user-level KeeperException when processing
sessionid:0x161a690f731
r.java:445)
at org.apache.zookeeper.ZooKeeper.(ZooKeeper.java:380)
at org.I0Itec.zkclient.ZkConnection.connect(ZkConnection.java:70)
... 7 more
Thanks
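For a multi-node setup, each broker's server.properties must differ in at least broker.id, its listener port (if the brokers share a host), and log.dirs, while zookeeper.connect points at the same ensemble. A hedged example for a second broker on the same machine (ports and paths below are placeholders):

```properties
# server-1.properties (second broker on the same host; values are examples)
broker.id=1
listeners=PLAINTEXT://localhost:9093
log.dirs=/tmp/kafka-logs-1
zookeeper.connect=localhost:2181
```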
On 18 February 2018 at 02:45, Maria Pilar <pilife...@gmail.com> wrote:
> Hi
>
> I´m trying to configure a multinode cluster in kafka.
> On Sat, Feb 17, 2018 at 5:49 PM, Maria Pilar <pilife...@gmail.com> wrote:
>
> > When i try to create a topic in that multicluster,
> >
> > kafka-topics.bat --create --topic my-kafka-topic --zookeeper
> locahost:2181
> > --replication-factor 2
he "l", right?
> This one will usually be resolvable, hence doesn't throw an
> unknownHostException
>
> Regards
>
> Maria Pilar <pilife...@gmail.com> wrote on Sun, 18 Feb 2018, 02:49:
>
> > When i try to create a topic in that multicluster,
>
s in /etc/hosts w.r.t. localhost ?
>
> I wonder if the exception had something to do with ipv6.
>
> On Sat, Feb 17, 2018 at 5:49 PM, Maria Pilar <pilife...@gmail.com> wrote:
>
> > When i try to create a topic in that multicluster,
> >
> > kafka-topics.bat --crea
Hi
I need to create aggregation events and publish them to another topic for a
Kafka Streams API.
I have usually done event aggregation with Apache Spark; however, that
requires including a new business layer into our E2E solution.
I have checked the possibility of using the aggregate method with KTable.
Do
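The Streams DSL's `aggregate` keeps a running per-key result in a KTable: each incoming event is combined with the current value for its key in a state store. A plain-Java sketch of those semantics, summing amounts per account (names are illustrative, no Kafka APIs involved):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of KTable-style aggregation semantics: a per-key running total,
// updated as each event arrives, like the state groupByKey().aggregate(...)
// would keep in its store. This only illustrates the semantics.
public class RunningAggregate {
    private final Map<String, Long> totals = new HashMap<>();

    // Fold one event into the aggregate for its key; return the new total.
    public long add(String key, long amount) {
        long updated = totals.getOrDefault(key, 0L) + amount;
        totals.put(key, updated);
        return updated;
    }
}
```

In Kafka Streams itself this would be `stream.groupByKey().aggregate(...)` materialized to a store, with each updated total emitted to the changelog/output topic, so no separate Spark layer is needed.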
streams/
>
> https://github.com/confluentinc/kafka-streams-examples
>
>
> -Matthias
>
>
> On 2/19/18 1:19 AM, Maria Pilar wrote:
> > Hi
> >
> > I need to create aggregation events and publish them to another topic for a
> > Kafka Streams API.
> >
Hi everyone,
I'm using the MirrorMaker tool and I'm configuring consumer.properties and
producer.properties, but I'm not sure where I need to put these files,
because MirrorMaker is installed on the server with Kafka Manager, and
there are two brokers configured as well
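The properties files don't need to live in a fixed directory: the legacy MirrorMaker tool takes their paths as command-line arguments, so they can sit anywhere readable by the process. Roughly (file paths below are placeholders; the quoted reply shows the same flags):

```
kafka-mirror-maker.sh \
  --consumer.config /path/to/mm_consumer.properties \
  --producer.config /path/to/mm_producer.properties \
  --whitelist="topic1|topic2"
```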
; /mm_producer.properties
> --whitelist=topic1|topic2
>
> Meeiling
>
> On 3/7/18, 5:11 AM, "Maria Pilar" <pilife...@gmail.com> wrote:
>
> hi everyone
>
> I´m using mirror maker tool and i´m configuring the
> consumer.properties and
> p