Chandra,
If you're just serving files from IIS and want to collect logs, you'll
probably want to run a separate application to collect the log files and
report each log entry to Kafka.
If you're running a web application, you can use the producer yourself to
report events to Kafka.
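For the file-serving case, one low-tech sketch of such a collector (on a Unix-like host; the log path, topic name, and broker address are placeholders, and a broker is assumed to be running) is to tail the log files and pipe each entry into the console producer:

```shell
# Placeholder path and topic; assumes Kafka's bin/ scripts are on hand
# and a broker is listening on localhost:9092. On a Windows/IIS host you
# would use an equivalent file tailer.
tail -F /var/log/iis/access.log | \
  ./kafka-console-producer.sh --broker-list localhost:9092 --topic iis-logs
```

A dedicated collector (or a producer client in your language of choice) gives you more control over batching and error handling than this pipeline, but the shape is the same: read entries as they are appended, send each one to a topic.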
-Ewen
On
Thanks for the input Ismael.
I will try and let you know.
Also, I need your valuable inputs on the issue below. :)
I am not able to run kafka-topics.sh (version 0.9.0.0)
[root@localhost bin]# ./kafka-topics.sh --list --zookeeper localhost:2181
[2015-12-28 12:41:32,589] WARN SASL configuration
Hi Oliver,
You need gradle 2.x to build Kafka.
Ismael
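As a sketch of the bootstrap sequence with a suitable Gradle on the PATH (directory name is whatever your checkout is called):

```shell
gradle -v        # confirm this reports Gradle 2.x
cd kafka         # the source checkout
gradle           # one-time bootstrap: generates the wrapper
./gradlew jar    # subsequent builds go through the wrapper
```

Once the wrapper exists, `./gradlew` pins the build to a known-good Gradle version regardless of what is installed system-wide.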
On 27 Dec 2015 18:10, "Oliver Pačut" wrote:
I have Gradle 1.4, Groovy 1.8.6, Ant 1.9.3.
I followed the included README. I cd'd to the kafka directory and just
ran "gradle".
I also tried running
Hi Prabhu,
kafka-console-consumer.sh uses the old consumer by default, but only the
new consumer supports security. Use --new-consumer to change this.
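A sketch of the invocation (the topic name, broker address, and properties
file name are placeholders):

```shell
# --new-consumer switches to the new consumer, which is the one that
# supports the security features; security settings (security.protocol,
# SASL/SSL options, etc.) go in the --consumer.config properties file.
./kafka-console-consumer.sh --new-consumer \
    --bootstrap-server localhost:9092 \
    --topic test \
    --consumer.config client-security.properties
```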
Hope this helps.
Ismael
On 28 Dec 2015 05:48, "prabhu v" wrote:
> Hi Experts,
>
> I am getting the below error when
Hi,
Correct me if I am wrong. I believe Kafka does not re-replicate the data of a
lost replica if the broker hosting that replica goes down.
What should be done to bring the replication factor back to the desired
level if the broker hosting the replica cannot be brought back?
Regards,
Manju
Manju,
Your understanding is correct. I just added the following FAQ entry:
https://cwiki.apache.org/confluence/display/KAFKA/FAQ#FAQ-Howtoreplaceafailedbroker?
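In short, per that FAQ, the usual fix is to start a replacement broker
configured with the failed broker's id; it will rejoin the cluster and
re-replicate the partitions that broker owned. A sketch, with a placeholder
id of 2:

```shell
# On the replacement machine: broker.id must match the failed broker's id.
grep '^broker.id' config/server.properties   # expect: broker.id=2
./kafka-server-start.sh config/server.properties
```

The controller treats the new broker as the old one coming back, so the
existing partition assignments apply and replication catches the replica up
automatically.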
Thanks,
Jun
On Sun, Dec 27, 2015 at 11:30 PM, Manjunath Shivanna
wrote:
> Hi,
>
> Correct me if I am wrong. I
Also note, though, that this is only necessary if you want to exactly
mirror the Java client implementation. The consumer protocol can be
implemented however you like in your library (although the Java
implementation is obviously a good reference). The only reason you'd need to match it
exactly