Cannot get metadata when one of the brokers is down

2018-03-12 Thread 许志峰
Hi all, I have a Kafka cluster with 3 nodes, props.put("bootstrap.servers", "node1:9092, node2:9092, node3:9092"). As I know, when a producer is created by new KafkaProducer(props) and wants to send a message, it will first choose a broker in the list [node1, node2, node3], connect with it and
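
A minimal sketch of the producer setup described above (the topic name is assumed). Since bootstrap.servers is only used for the initial metadata lookup, listing all three hosts lets the client fall back to another broker when the first one it tries is unreachable:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class BootstrapExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            // All three brokers are listed so that metadata discovery can
            // still succeed when one of them is down.
            props.put("bootstrap.servers", "node1:9092,node2:9092,node3:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("test-topic", "key", "value"));
                producer.flush();
            }
        }
    }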

kafka producer send failed

2018-03-12 Thread 杰 杨
Hi: I use a producer to send data to Kafka. Due to network and other problems, some sends fail. I registered a callback on send. I tested with 300W (3 million) records, but only 200W are in Kafka and 100W ended up in local files. I configured retries=10 and acks='all' in the producer. It seems Kafka
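
A hedged sketch of a producer configured as described (broker address and topic name are assumptions), with acks=all, retries, and a send callback; any record whose callback still reports an exception has exhausted its retries and has to be re-sent by the application, for example from the local files mentioned above:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ReliableSend {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "node1:9092");
            props.put("acks", "all");   // wait for all in-sync replicas
            props.put("retries", 10);   // retry transient network errors
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("events", "payload"), (metadata, exception) -> {
                    if (exception != null) {
                        // Retries are exhausted or the error is not retriable:
                        // the application must re-send this record later,
                        // e.g. after appending it to a local file.
                        System.err.println("send failed: " + exception.getMessage());
                    }
                });
                producer.flush(); // block until buffered records are acknowledged or failed
            }
        }
    }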

exactly once and storing offsets in db (transactionally with computation results)

2018-03-12 Thread Marasoiu, Nicu
Hi, We are considering one of 2 or 3 flows to ensure an "exactly once" process from an input Kafka topic to a database storing results (using the Kafka consumer; we also evaluated Kafka Streams, details at the end), and wanted to gather your input on them. (For simplicity, let's assume that any
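
One possible shape for the consumer side of such a flow, sketched under assumptions (topic name, manual partition assignment, and the two DB helpers are hypothetical): auto-commit is disabled, offsets are kept in the database next to the results, and the consumer seeks to the stored offset on startup:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class DbOffsetConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "node1:9092");
            props.put("group.id", "exactly-once-db");
            props.put("enable.auto.commit", "false"); // offsets live in the DB, not in Kafka
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                TopicPartition tp = new TopicPartition("input", 0);
                consumer.assign(Collections.singletonList(tp));
                consumer.seek(tp, loadOffsetFromDb(tp)); // resume from the offset stored with the results

                while (true) {
                    for (ConsumerRecord<String, String> record : consumer.poll(500)) {
                        // Result and offset are written in the same DB transaction,
                        // so a crash can never persist one without the other.
                        writeResultAndOffsetAtomically(record.value(), tp, record.offset() + 1);
                    }
                }
            }
        }

        // Hypothetical helpers standing in for the actual DB access layer.
        static long loadOffsetFromDb(TopicPartition tp) { return 0L; }
        static void writeResultAndOffsetAtomically(String result, TopicPartition tp, long nextOffset) { }
    }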

Re: Using Kafka to build a streaming platform for a research facility?

2018-03-12 Thread Berryman, Eric
The European Spallation Source [1] seems to be using it for this case [2]. I am also using this code [2], but only for visualization in another "data center". [1] https://europeanspallationsource.se/ [2] https://github.com/ess-dmsc/forward-epics-to-kafka Thank you! Eric

Kafka 0.9 MirrorMaker failing with Batch Expired when producing to Kafka 1.0 cluster

2018-03-12 Thread Andrew Otto
Hi all, I’m troubleshooting a MirrorMaker issue and am not quite sure yet why this is happening, so I thought I’d ask here in case anyone else has seen this before. We’ve been running a Kafka 1.0 cluster for a few months now, replicating data from a Kafka 0.9.0.1 cluster using 0.9.0.1

Using Kafka to build a streaming platform for a research facility?

2018-03-12 Thread Hannes Petri
Hi, I work at a research facility where numerous hi-res detectors produce thousands of GB of data every day. We want to build a highly flexible and performant streaming platform for storing, transmitting and routing the data. For example, detector output needs to end up: 1. In permanent

Solution for clients with long-lived sustained SSL connections using JKS

2018-03-12 Thread Alexander Maniates
Our setup: Brokers on 0.10.1, clients on 0.9. -On startup, clients are dynamically issued a signed certificate that is valid for 48 hours. A JKS is created using this certificate. -All brokers have a signed certificate in their JKS that is valid for some years. The issue: Clients only load
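
For context, a sketch of the client-side SSL settings involved (paths and passwords are placeholders); on these client versions the keystore is read when the client is constructed, so replacing the JKS on disk does not by itself refresh a long-running client, which is what makes the 48-hour certificate awkward:

    import java.util.Properties;

    public class SslClientConfig {
        public static Properties sslProps() {
            Properties props = new Properties();
            props.put("security.protocol", "SSL");
            // The JKS below is the one created from the 48-hour certificate.
            // It is loaded once at client construction time; swapping the file
            // on disk does not affect an already-running client.
            props.put("ssl.keystore.location", "/etc/client/client.jks");
            props.put("ssl.keystore.password", "changeit");
            props.put("ssl.key.password", "changeit");
            props.put("ssl.truststore.location", "/etc/client/truststore.jks");
            props.put("ssl.truststore.password", "changeit");
            return props;
        }
    }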

Re: Using Kafka to build a streaming platform for a research facility?

2018-03-12 Thread Afonso Mukai
Hi Hannes, We will use Kafka here at the European Spallation Source (the facility is currently under construction) to stream data from neutron detectors and other experimental station equipment to consumers (the EPICS forwarding software mentioned by Eric covers the sources other than

transactional behavior offsets+effects

2018-03-12 Thread Marasoiu, Nicu
Hi, We are considering one of 2 or 3 flows to ensure an "exactly once" process from an input Kafka topic to a database storing results (using the Kafka consumer; we also evaluated Kafka Streams, details at the end), and wanted to gather your input on them. (For simplicity, let's assume that any

KSQL Question about detecting events in a given timeframe.

2018-03-12 Thread Richard L. Burton III
So what I'm looking to do is detect multiple events that happen within a given window, e.g. a system starts, pauses and then stops within 24 hours. The event is relatively straightforward: Event Properties: System ID (GUID) Signal: (start, pause, stop) Occurred At:
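
Not KSQL itself, but a sketch of the same idea expressed with the Kafka Streams Java API of that era (1.0-style windowing; the topic name is assumed, the signal values come from the description above, and tumbling 24-hour windows approximate the "within 24 hours" requirement): signals are collected per system ID over the window and only windows containing all three signals are emitted:

    import java.util.Properties;
    import java.util.concurrent.TimeUnit;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Materialized;
    import org.apache.kafka.streams.kstream.TimeWindows;

    public class SignalWindowDetector {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "signal-window-detector");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "node1:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Key = system ID (GUID), value = signal ("start", "pause", "stop").
            KStream<String, String> signals = builder.stream("system-signals");

            signals.groupByKey()
                   .windowedBy(TimeWindows.of(TimeUnit.HOURS.toMillis(24)))
                   // Collect the signals seen per system within the 24-hour window.
                   .aggregate(() -> "",
                              (systemId, signal, seen) -> seen + "|" + signal,
                              Materialized.with(Serdes.String(), Serdes.String()))
                   .toStream()
                   // Keep only windows in which the system started, paused and stopped.
                   .filter((windowedId, seen) ->
                           seen.contains("start") && seen.contains("pause") && seen.contains("stop"))
                   .foreach((windowedId, seen) ->
                           System.out.println(windowedId.key() + " saw start/pause/stop within 24h"));

            new KafkaStreams(builder.build(), props).start();
        }
    }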

RE: transactional behavior offsets+effects

2018-03-12 Thread Marasoiu, Nicu
Hi, Indeed, solution 2 seems feasible, using a db transaction (e.g. a Cassandra batch) to include an offset update. A sophisticated implementation is, for instance, under the hood of
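
A sketch of that idea in relational terms, since the exact Cassandra batch code depends on the driver; the connection URL, credentials, table names and schema are all assumptions. The only point is that the result row and the offset row go into one transaction:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class AtomicOffsetWriter {
        // Writes the computed result and the next consumer offset in one DB
        // transaction, so either both are persisted or neither is.
        public static void writeResultAndOffset(String result, String topic,
                                                int partition, long nextOffset) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://db-host/results", "app", "secret")) {
                conn.setAutoCommit(false);
                try (PreparedStatement insertResult = conn.prepareStatement(
                             "INSERT INTO results(payload) VALUES (?)");
                     PreparedStatement updateOffset = conn.prepareStatement(
                             "UPDATE offsets SET next_offset = ? WHERE topic = ? AND kafka_partition = ?")) {
                    insertResult.setString(1, result);
                    insertResult.executeUpdate();
                    updateOffset.setLong(1, nextOffset);
                    updateOffset.setString(2, topic);
                    updateOffset.setInt(3, partition);
                    updateOffset.executeUpdate();
                    conn.commit();
                } catch (Exception e) {
                    conn.rollback();
                    throw e;
                }
            }
        }
    }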

Re: transactional behavior offsets+effects

2018-03-12 Thread Guozhang Wang
Hi Nicu, What you described sounds reasonable to me. In fact, solution 1 would not perfectly work if you have a failure on your db right after step 5 but before step 6. So to make the txn commit in Kafka and the txn commit in your sink DB "an atomic operation" together, you need to either encode

Re: Using Kafka to build a streaming platform for a research facility?

2018-03-12 Thread Guozhang Wang
Nice paper to read and cool usage of Kafka. Thanks for sharing Afonso :) Guozhang On Mon, Mar 12, 2018 at 1:13 PM, Afonso Mukai wrote: > Hi Hannes, > > We will use Kafka here at the European Spallation Source (the facility is > currently under construction) to stream