Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/3029
(quickstart worked as well!)
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/3029
I've tested the change on a YARN cluster (I know there are tests for that).
I'm currently trying out the quickstarts as well.
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/3029#discussion_r93206487
--- Diff: pom.xml ---
@@ -91,7 +91,6 @@ under the License.
1C
true
log4j-test.properties
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2767
@StephanEwen I'll start a discussion on the mailing list to decide how we
want to proceed.
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2767#discussion_r93053073
--- Diff: flink-streaming-connectors/flink-connector-elasticsearch5/pom.xml
---
@@ -0,0 +1,93 @@
+
+
+http://maven.apache.org/POM/4.0.0
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2767#discussion_r93052912
--- Diff: flink-streaming-connectors/flink-connector-elasticsearch5/pom.xml
---
@@ -0,0 +1,93 @@
+
+
+http://maven.apache.org/POM/4.0.0
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2861
@fpompermaier I guess logstash is just a client to ES that implements its
own retry logic (similar to Flink).
I'll check out the JIRA.
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r93049169
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r93048695
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2861
@fpompermaier Why is throwing exceptions in close causing document losses?
As far as I can see ES is flushing all outstanding batches on close().
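For context, the close-time flushing under discussion follows a common buffering-sink pattern: close() flushes the last outstanding batch and rethrows any failure recorded by asynchronous bulk callbacks, so an exception in close() signals a problem rather than causing losses. A minimal, self-contained sketch of that pattern (illustrative names only, not the actual ElasticsearchSink code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicReference;

/** Illustrative buffering sink: batches elements and flushes the remainder on close(). */
public class BufferingSink {
    private final List<String> buffer = new ArrayList<>();
    private final List<String> flushed = new ArrayList<>();
    private final int batchSize;
    // Failures reported by (possibly asynchronous) bulk callbacks are recorded here.
    private final AtomicReference<Throwable> failure = new AtomicReference<>();

    public BufferingSink(int batchSize) {
        this.batchSize = batchSize;
    }

    public void invoke(String element) {
        checkFailure();
        buffer.add(element);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    public void close() {
        flush();         // flush the last, possibly partial batch
        checkFailure();  // surface any recorded async failure instead of dropping it
    }

    private void flush() {
        flushed.addAll(buffer);  // stand-in for issuing the bulk request
        buffer.clear();
    }

    private void checkFailure() {
        Throwable t = failure.get();
        if (t != null) {
            throw new RuntimeException("bulk request failed", t);
        }
    }

    public List<String> flushedElements() {
        return flushed;
    }
}
```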
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r93045771
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r93043759
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2861
I checked the elasticsearch documentation and some user forum from ES, and
indeed it seems that they do not include any retry logic into their clients.
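Client-side retry logic of the kind mentioned above usually amounts to wrapping each request in a bounded retry loop with back-off. A minimal generic sketch (assumed shape, not Flink's or Elasticsearch's actual code; names are illustrative):

```java
import java.util.concurrent.Callable;

/** Minimal sketch of bounded client-side retries with linear back-off. */
public class RetryExample {

    static <T> T withRetries(Callable<T> action, int maxAttempts, long backoffMillis)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (Exception e) {
                last = e;
                if (attempt < maxAttempts) {
                    // Linear back-off between attempts; real clients often use exponential.
                    Thread.sleep(backoffMillis * attempt);
                }
            }
        }
        throw last;  // all attempts exhausted: surface the last failure
    }

    public static void main(String[] args) throws Exception {
        final int[] calls = {0};
        // Fails twice, then succeeds -- the wrapper masks the transient failures.
        String result = withRetries(() -> {
            if (++calls[0] < 3) {
                throw new RuntimeException("transient failure");
            }
            return "indexed";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```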
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/3008
+1 to merge.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/3006
+1 to merge
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2867
+1 to merge
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2859
+1 to merge
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2890
Final travis build: https://travis-ci.org/rmetzger/flink/builds/182582571
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2890
+1 to merge.
I'll merge it now
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2953
Rebased to master to include fixed build.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2509
Thanks a lot! I have your Kafka pull requests on my todo list. I hope I get
to them soon. I'm really sorry.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2737
I've created a fix:
https://github.com/rmetzger/flink/commit/28a3aed8633246717c52599c25ced928436a6f97
and pushed it to my travis account (Github has a service outage, that's why
travis builds
GitHub user rmetzger opened a pull request:
https://github.com/apache/flink/pull/2953
[FLINK-5039] Bump Avro version
This is a critical issue for some of our users.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/rmetzger/flink
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2950
+1 I'll merge it
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r90916177
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r90916759
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r90917049
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r90916979
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r90917303
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2861#discussion_r90917039
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2893
+1
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2886
I wonder if there is an easy way to add a test case for this?
Maybe you could add a check to one of the HA tests?
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2899#discussion_r90234726
--- Diff: flink-dist/src/main/flink-bin/conf/log4j.properties ---
@@ -16,14 +16,26 @@
# limitations under the License
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2850
Thank you for the review. I'll merge it once travis is green.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2795
Merging ...
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2795
Rebased and pushed to travis again:
https://travis-ci.org/rmetzger/flink/builds/178664750
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2664
I had the same thought. We could add the maven assembly plugin / shade
plugin to each connector / library to build a fat jar, and then add some logic
to flink-dist to collect these fat jars
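The fat-jar idea sketched above would, under the usual Maven conventions, look roughly like the following in each connector's pom.xml. This is a hedged sketch of the standard shade-plugin setup, not the configuration actually adopted in the pull request:

```xml
<!-- Illustrative: build a self-contained (fat) jar for one connector,
     so flink-dist can later just collect the produced artifacts.
     Compile-scope dependencies, including transitive ones, are bundled. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <!-- Attach the shaded jar next to the regular artifact. -->
            <shadedArtifactAttached>true</shadedArtifactAttached>
            <shadedClassifierName>fat</shadedClassifierName>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```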
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2664
I thought that transitive dependencies are resolved in the scope of
assembly descriptors. But I'm not so sure about that anymore.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2850
Good catch. I removed the reflection magic from the class.
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2790#discussion_r89341178
--- Diff:
flink-streaming-connectors/flink-connector-elasticsearch2/src/main/java/org/apache/flink/streaming/connectors/elasticsearch2/ElasticsearchSink.java
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2790
I agree with Fabian here. The ESHelper does not use any Flink code at all,
so the relation to Flink is not clear. A user of Hadoop would equally benefit
from such a utility. I would expect that ES
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2687
I'll review this PR once https://github.com/apache/flink/pull/2509 has been
merged.
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89275454
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.9/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaTestEnvironmentImpl.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89283859
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.9/src/main/java/org/apache/flink/streaming/connectors/kafka/internal/Kafka09Fetcher.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89283319
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.8/src/main/java/org/apache/flink/streaming/connectors/kafka/internals/SimpleConsumerThread.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89284541
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.9/src/main/java/org/apache/flink/streaming/connectors/kafka/internal/Kafka09Fetcher.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89276062
--- Diff:
flink-streaming-connectors/flink-connector-kafka-base/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaConsumerTestBase.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89275211
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.8/src/main/java/org/apache/flink/streaming/connectors/kafka/internals/SimpleConsumerThread.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89282863
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.8/src/main/java/org/apache/flink/streaming/connectors/kafka/internals/Kafka08Fetcher.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89150327
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.10/src/test/java/org/apache/flink/streaming/connectors/kafka/Kafka010FetcherTest.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89275431
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.9/src/test/java/org/apache/flink/streaming/connectors/kafka/Kafka09ITCase.java
---
@@ -110,6
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89275697
--- Diff:
flink-streaming-connectors/flink-connector-kafka-base/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaConsumerTestBase.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89284524
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.9/src/main/java/org/apache/flink/streaming/connectors/kafka/internal/Kafka09Fetcher.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89274682
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.10/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaTestEnvironmentImpl.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89276937
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.10/src/main/java/org/apache/flink/streaming/connectors/kafka/internal/Kafka010Fetcher.java
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2509#discussion_r89274555
--- Diff:
flink-streaming-connectors/flink-connector-kafka-0.10/src/test/java/org/apache/flink/streaming/connectors/kafka/Kafka010ITCase.java
---
@@ -131,6
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2850
I haven't tried it yet.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2664
Thank you for fixing the issue so quickly.
I'm wondering whether the current approach is a good idea, because it
requires manual checking of all transitive dependencies. We have something
GitHub user rmetzger opened a pull request:
https://github.com/apache/flink/pull/2850
[FLINK-4895] Drop Hadoop1 support
I've removed all the infrastructure in Maven and the `tools/` directory to
get rid of Hadoop1.
I didn't test the update to the release script. So
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2681
Cool, thx.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2664
I tried out the change and I like the idea.
One issue I found is that transitive dependencies are not properly added:
Kafka 0.10 depends on the Kafka 0.9 code, but that one (and its dependencies
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2681
Did you refactor the table tests? If not, we can also merge this PR as is
and file a follow-up JIRA.
One other thing I thought about while looking over the PR: We need the
partition list only
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2787
I agree that we should try to get this into the RC.
+1 to merge
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2724
I'll merge the change ...
GitHub user rmetzger opened a pull request:
https://github.com/apache/flink/pull/2755
[hotfix][docs] Stream joins don't support tuple position keys
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/rmetzger/flink
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2724
Thank you. I'll wait for Greg.
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2724#discussion_r86549923
--- Diff:
flink-runtime-web/web-dashboard/app/partials/jobs/job.plan.node-list.metrics.jade
---
@@ -0,0 +1,47 @@
+//
+ Licensed to the Apache
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2724#discussion_r86548295
--- Diff: flink-runtime-web/web-dashboard/app/partials/jobs/job.jade ---
@@ -50,19 +50,15 @@
nav.navbar.navbar-default.navbar-fixed-top.navbar-main(ng
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2724
Thank you for the review. I'm going to address your inline comments.
However, if you are okay with it, I would like to address your general
comments with the next web interface pull request
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2683#discussion_r86119642
--- Diff:
flink-runtime/src/main/scala/org/apache/flink/runtime/jobmanager/JobManager.scala
---
@@ -1828,6 +1828,33 @@ class JobManager
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2706
Thank you for fixing this.
I'll merge the change.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2724
Thank you for the review @zentol. I addressed your comments and rebased to
master again.
GitHub user rmetzger opened a pull request:
https://github.com/apache/flink/pull/2724
[FLINK-4221] Show metrics in WebFrontend + general improvements
Other included changes:
- Removed Properties tab
- Renamed plan to overview
- Added parallelism to task list
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2616
The `CoordinatorShutdownTest` fixes look reasonable.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2616
I tested the change locally, it works. +1 to merge.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2616
Thank you for rebasing.
This run had the following error:
https://s3.amazonaws.com/archive.travis-ci.org/jobs/170428661/log.txt
```
Failed tests
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2694
+1 to merge.
GitHub user rmetzger opened a pull request:
https://github.com/apache/flink/pull/2705
[FLINK-2597][FLINK-4050] Add wrappers for Kafka serializers, test for
partitioner and documentation
This pull requests addresses the following JIRAs:
- [FLINK-2597
Add a test for Avro
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2644
Thank you!
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2673
Thank you for looking into the details of this. If the user who ran into
the issue is using Java, they can fix the issue themselves by relocating our or
their Calcite version locally
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2616#discussion_r84696513
--- Diff:
flink-runtime-web/src/main/java/org/apache/flink/runtime/webmonitor/handlers/JobDetailsHandler.java
---
@@ -147,11 +143,36 @@ public String
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2616#discussion_r84697528
--- Diff:
flink-runtime-web/src/main/java/org/apache/flink/runtime/webmonitor/handlers/JobVertexDetailsHandler.java
---
@@ -99,11 +83,34 @@ public String
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2626
Looks like the tests are failing.
I quickly scrolled over the changes. +1 to merge.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2654
Nice fix, thank you
+1 to merge.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2680
Thank you for opening a PR for fixing this.
+1 to merge.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2683
Thank you for opening a pull request. I agree that we should expose all
numbers we show in the web interface as a metric as well.
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2683#discussion_r84689406
--- Diff:
flink-runtime/src/main/scala/org/apache/flink/runtime/jobmanager/JobManager.scala
---
@@ -1828,6 +1828,33 @@ class JobManager
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2487
Hi @haoch, thanks a lot for this contribution.
I recently started moving some of the streaming connectors of Flink to
Apache Bahir, a community for extensions to Spark, Flink (and maybe
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2673
I tried reproducing the issue reported by @fhueske. The only issue I got
was `Caused by: org.codehaus.commons.compiler.CompileException: Line 8, Column
13: Class
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2448
Okay, I'll run gulp when merging.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2644
Thank you for opening a pull request for adding a new streaming connector.
I also think that this connector is a good candidate to go into Apache
Bahir. You find the GitHub repository of Bahir
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2676
+1 to merge this change.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2614
I think this is a good refactoring.
+1 to merge.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2615
Storing numbers as strings is not terribly efficient, but since this only
affects the web frontend, it's okay.
+1 to merge.
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2448#discussion_r84459904
--- Diff:
flink-runtime-web/src/main/java/org/apache/flink/runtime/webmonitor/handlers/JsonGenerators.java
---
@@ -0,0 +1,86 @@
+/*
+ * Licensed
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2678
Thank you for opening a pull request.
@fhueske @twalthr @uce I think it's okay to make the class public, in case
users want to use a custom Kafka consumer, right?
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2677
+1 change is good to merge.
The user who reported the issue tested the fix.
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2636
Thank you
The change is good to merge!
Github user rmetzger commented on the issue:
https://github.com/apache/flink/pull/2636
I tested the change this morning, and it's not working:
```
2016-10-20 11:08:28,216 WARN
org.apache.flink.runtime.metrics.MetricRegistry - Could not start
Metr
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/flink/pull/2636#discussion_r84115204
--- Diff:
flink-runtime-web/src/main/java/org/apache/flink/runtime/webmonitor/metrics/MetricFetcher.java
---
@@ -148,7 +148,7 @@ public void onSuccess