Hi,
I would appreciate it if someone could point me to a Java example showing
how to implement offset commits using the SimpleConsumer API. I have not
found any!
best,
/Shahab
Shahab,
So, after a few searches, it just makes sense to paste it here. To commit,
do something like this:
OffsetCommitRequest request = new OffsetCommitRequest(String groupId,
Map<TopicAndPartition, OffsetAndMetadata> requestInfo, int correlationId,
String clientId, short versionId);
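For readers without the 0.8.x jars handy, here is a minimal sketch of assembling the per-partition offsets for such a commit. Plain java.util types stand in for kafka.common.TopicAndPartition / OffsetAndMetadata so it compiles without the Kafka jar, and the topic, group, and client names are made up:

```java
import java.util.HashMap;
import java.util.Map;

public class OffsetCommitSketch {
    // Stand-in for the Map<TopicAndPartition, OffsetAndMetadata> requestInfo
    // argument: key is "topic-partition", value is the next offset to consume.
    static Map<String, Long> buildRequestInfo() {
        Map<String, Long> requestInfo = new HashMap<>();
        requestInfo.put("my-topic-0", 4711L);
        requestInfo.put("my-topic-1", 1234L);
        return requestInfo;
    }

    public static void main(String[] args) {
        String groupId = "my-group";   // consumer group the offsets belong to
        int correlationId = 0;         // echoed back in the commit response
        String clientId = "my-client";
        short versionId = 0;           // on 0.8.1, version 0 stores offsets in ZooKeeper

        Map<String, Long> requestInfo = buildRequestInfo();
        // With the real 0.8.x client on the classpath, these values would be
        // passed to new kafka.javaapi.OffsetCommitRequest(...) and sent via
        // SimpleConsumer.commitOffsets(request).
        System.out.println(requestInfo.size() + " partitions to commit for " + groupId);
    }
}
```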
Hi everyone,
we run load tests against our web application (about 50K req/sec), and every
time a Kafka broker dies (even during a controlled shutdown), the producer
tries to connect to the dead broker for about 10-15 minutes. During this time
the application monitoring shows a constant error rate (about of
Hi Alexey,
So, a couple of things: your config seems to have some issues that would
result in long wait times.
You should try this configuration and see if you still have the issue:
acks=1
compression.type=snappy
retries=3 #Retry a few times to make it so they don't get dropped when a
broker
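Wired into code, the suggestion above amounts to producer properties like the following. This is a sketch using plain java.util.Properties so it compiles without the Kafka jar; the retry.backoff.ms value is my assumption, not part of the reply above:

```java
import java.util.Properties;

public class FailoverProducerProps {
    // Build the suggested settings; keys match the new-producer config names.
    static Properties build() {
        Properties props = new Properties();
        props.put("acks", "1");                  // wait for the partition leader only
        props.put("compression.type", "snappy");
        props.put("retries", "3");               // retry so sends survive a broker failover
        props.put("retry.backoff.ms", "100");    // assumed value: pause between retries
        return props;
    }

    public static void main(String[] args) {
        // These properties would be passed to the KafkaProducer constructor.
        System.out.println(build().getProperty("acks"));
    }
}
```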
SSL is supported for the new producer and consumer APIs; the old APIs (the
SimpleConsumer and the high-level consumer) are not supported.
I think Spark uses the SimpleConsumer? If so, it's not supported.
Thanks,
Harsha
On August 28, 2015 at 11:00:30 AM, Cassa L (lcas...@gmail.com) wrote:
Hi,
I was going through
Hi, folks,
I just want to check if anybody knows whether there is any plan for Kafka
MirrorMaker to support running over SSL. I tried to follow
https://cwiki.apache.org/confluence/display/KAFKA/Deploying+SSL+for+Kafka
to configure the consumer/producer properties with SSL, but no luck; it looks
like Kafka
I can't speak for the Spark Community, but checking their code,
DirectKafkaStream and KafkaRDD use the SimpleConsumer API:
https://github.com/apache/spark/blob/master/external/kafka/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala
SSL is also for 0.8.3... if you are on 0.8.2.1, the normal Spark
instructions will work for you.
On Fri, Aug 28, 2015 at 1:32 PM, Cassa L lcas...@gmail.com wrote:
Just to confirm, is this what you are mentioning about? Is there any
example on how to set it? I believe it is for 0.8.3 version?
https://cwiki.apache.org/confluence/display/KAFKA/Multiple+Listeners+for+Kafka+Brokers
On Fri, Aug 28, 2015 at 12:52 PM, Sriharsha Chintalapani ka...@harsha.io
Hi,
I was going through SSL setup of Kafka.
https://cwiki.apache.org/confluence/display/KAFKA/Deploying+SSL+for+Kafka
However, I am also using Spark-Kafka streaming to read data from Kafka. Is
there a way to enable SSL for the Spark streaming API, or is it not possible
at all?
Thanks,
LCassa
You can configure a PLAINTEXT listener as well on the broker and use that
port for Spark.
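Concretely, that might look like the following broker setting (a sketch; the host and port values are assumptions):

```
# server.properties: expose an unencrypted port alongside the SSL one
listeners=PLAINTEXT://broker1:9092,SSL://broker1:9093
```

Spark's SimpleConsumer-based direct stream would then be pointed at the PLAINTEXT port.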
--
Harsha
On August 28, 2015 at 12:24:45 PM, Sourabh Chandak (sourabh3...@gmail.com)
wrote:
Can we use the existing kafka spark streaming jar to connect to a kafka server
running in SSL mode?
We are
Yeah, the direct api uses the simple consumer
On Fri, Aug 28, 2015 at 1:32 PM, Cassa L lcas...@gmail.com wrote:
Hi, I am using the below Spark jars with the Direct Stream API.
spark-streaming-kafka_2.10
When I look at its pom.xml, the Kafka library it is pulling in is
Can we use the existing kafka spark streaming jar to connect to a kafka
server running in SSL mode?
We are fine with a non-SSL consumer as our Kafka cluster and Spark cluster
are in the same network.
Thanks,
Sourabh
On Fri, Aug 28, 2015 at 12:03 PM, Gwen Shapira g...@confluent.io wrote:
I can't
I faced the exact same problem recently. The JIRA is filed here:
https://issues.apache.org/jira/browse/KAFKA-2459
Please set reconnect.backoff.ms greater than retry.backoff.ms (by about 1
sec). I think the metadata expired, and when it is trying to fetch the new
metadata for this producer
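As a producer properties fragment, that advice looks like this (the values are illustrative, not taken from the JIRA):

```
# keep the reconnect backoff roughly 1s above the retry backoff
retry.backoff.ms=100
reconnect.backoff.ms=1100
```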