I can see that in general, but why would that limit be several K?
The whole setup is designed to measure exactly that, among other things:
e.g., how many open files/connections can Kafka handle?
The OS limits are set far above that, and the question is: do you see any limit on Kafka's
side to the number of connections?
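As a quick sanity check while running such a test, here is a small Python sketch for reading the per-process descriptor limit and counting a process's open descriptors (the `/proc` lookup is Linux-specific, and the broker's PID is a placeholder you would substitute yourself):

```python
import os
import resource

# Per-process limits on open file descriptors (soft can be raised up to hard).
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"fd limit: soft={soft} hard={hard}")

def open_fd_count(pid: int) -> int:
    """Count a process's open descriptors via /proc (Linux-specific)."""
    return len(os.listdir(f"/proc/{pid}/fd"))

# Example: this process's own count; substitute the broker's PID to watch it
# climb toward the soft limit as connections accumulate.
print(f"open fds (self): {open_fd_count(os.getpid())}")
```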
If you're creating a new producer for each send, instead of reusing a pool of
producer connections across your threads, that could result in
the brokers surpassing your open file limit.
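The reuse pattern being described can be sketched like this in Python; `StubProducer` is a placeholder standing in for a real client producer object (no actual broker connection is made here):

```python
import queue
import threading

class StubProducer:
    """Placeholder for a real producer; each instance would hold a connection."""
    def send(self, topic: str, message: bytes) -> None:
        pass  # a real producer would write to the broker here

class ProducerPool:
    """Fixed-size pool: N producers total, no matter how many sends happen."""
    def __init__(self, size: int, factory=StubProducer):
        self._producers = queue.Queue()
        for _ in range(size):
            self._producers.put(factory())

    def send(self, topic: str, message: bytes) -> None:
        producer = self._producers.get()   # borrow; blocks if all are in use
        try:
            producer.send(topic, message)
        finally:
            self._producers.put(producer)  # always return it to the pool

# 100 sending threads, but only 4 producers (connections) ever exist.
pool = ProducerPool(size=4)
threads = [threading.Thread(target=pool.send, args=("test-topic", b"msg"))
           for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The anti-pattern is constructing a new producer inside each send call: that opens one connection per message, and the brokers' descriptor counts climb until the open file limit is hit.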
/***
Joe Stein
Founder, Principal Consultant
Big Data Open Source Security LLC
I just VPN'ed into my workstation:
the answer to 5 is [*yes*]
answers to 1, 2:
the errors I see on the Python client are first timeouts and then message
send failures, using sync send.
In the controller log:
controller.log.2014-08-26-13: [2014-08-26 13:40:44,317] ERROR
[Controller-1-to-broker-3-send-t
Hi, and sorry for the late response; I just got into the weekend and it's still
Saturday here...
Well, I'm not at my desk, but will answer what I can:
1. what else is in the logs? [*will VPN and check*]
2. other broker failure reason? [*"*]
3. other broker failure after taking leadership? [*how can I be sure? as
I think it sounds more like another issue than what you're thinking... the broker
should not be failing like that; especially, another broker being affected
doesn't make sense.
What else is in the logs on failure? Is the other broker failing because
of the number of files too? Is it happening after it becomes
I sure did. The reason I am building is to try to patch something, specifically
this: KAFKA-1623.
Actually, if I felt more confident about Scala, I would happily send you a
patch.
If you don't mind screening it, just tell me how to prep it for you and I will.
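For what it's worth, a generic way to turn a local commit into a mailable patch file is `git format-patch`; the sketch below (driven from Python, repo path being a placeholder) shows that general workflow, not necessarily the project's required contribution process:

```python
import subprocess

def format_last_commit(repo_dir: str) -> str:
    """Create a patch file for the most recent commit; returns its filename."""
    result = subprocess.run(
        ["git", "format-patch", "-1", "HEAD"],
        cwd=repo_dir, capture_output=True, text=True, check=True,
    )
    # git prints the generated filename, e.g. '0001-<commit-subject>.patch'
    return result.stdout.strip()
```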
The bigger problem is running into too many open files.
Have you tried using a binary release (http://kafka.apache.org/downloads.html)?
That way you don't have to do a build.
We build using JDK 6; you should be able to run on 8 (I know for sure 6 & 7
work; honestly, I've never tried 8).
I just did a quick test with a broker running on 8 and produced/consumed a f
./gradlew -PscalaVersion=2.9.2 clean jar failed with JDK 8 (error: error
while loading CharSequence, class file
'/usr/java/jdk1.8.0_20/jre/lib/rt.jar(java/lang/CharSequence.class)' is
broken).
I understand there's no escape from installing JDK 7?
10x
Shlomi
On Thu, Sep 4, 2014 at 6:11 PM, Joe Stein wrote:
When building you need to use the ./gradlew script, as Harsha said. Please
take a look at the README for the specific commands and how to run them.
/***
Joe Stein
Founder, Principal Consultant
Big Data Open Source Security LLC
http://www.stealth.ly
Twitter:
It failed with JDK 8, so I hoped a newer Gradle would maybe do the magic, and
stepped into this other problem.
I assume you will say:
install JDK 7 and build with our Gradle 1.6.
Is it so?
Shlomi
On Thu, Sep 4, 2014 at 5:41 PM, Harsha wrote:
> Did you try the "gradlew" script in the kafka source dir?
Did you try the "gradlew" script in the kafka source dir?
-Harsha
On Thu, Sep 4, 2014, at 07:32 AM, Shlomi Hazan wrote:
> what gradle version is used to build kafka_2.9.2-0.8.1.1 ?
> tried with v2 and failed with:
> gradle --stacktrace clean
> FAILURE: Build failed with an exception.
What gradle version is used to build kafka_2.9.2-0.8.1.1?
I tried with v2 and it failed with:
gradle --stacktrace clean
FAILURE: Build failed with an exception.
* Where:
Build file
'/home/shlomi/0dec0xb/project/vpmb/master/3rdparty/kafka/code/kafka-0.8.1.1-src/build.gradle'
line: 34
* What went wrong: