Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/335#issuecomment-71492121
Sure, XyzTest classes are unit tests that are executed in Maven's test phase.
These should execute rather fast. Everything that brings up a full Flink system
is an integration
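The Test/ITCase split described above is typically wired up in Maven; an illustrative configuration (a sketch only, not Flink's actual pom.xml — the exact plugin setup and patterns are assumptions):

```xml
<!-- Sketch: run fast *Test classes with surefire in the test phase and
     heavyweight *ITCase classes with failsafe in the integration-test
     phase. Patterns and plugin choice are illustrative assumptions. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <includes>
      <include>**/*Test.java</include>
    </includes>
  </configuration>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <configuration>
    <includes>
      <include>**/*ITCase.java</include>
    </includes>
  </configuration>
</plugin>
```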
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/264#issuecomment-71455623
Updated the PR and will merge once Travis completed the build.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/265#issuecomment-72886847
Any other comments on the API-breaking change?
If not, I'd merge it...
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/202#issuecomment-72640916
Just implemented the basic triangle enumeration job and figured out that
this example is already included in this PR ;-)
However, when trying to run both programs, I
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/202#issuecomment-72641447
btw. implementing the program felt quite good. Very nice API, IMO!
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/363#discussion_r24117856
--- Diff: docs/hadoop_compatibility.md ---
@@ -52,56 +63,70 @@ Add the following dependency to your `pom.xml` to use
the Hadoop Compatibility L
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/363#discussion_r24117603
--- Diff: docs/hadoop_compatibility.md ---
@@ -38,9 +39,19 @@ This document shows how to use existing Hadoop MapReduce
code with Flink. Please
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/363#discussion_r24118164
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/ExecutionEnvironment.java ---
@@ -458,6 +461,67 @@ public CsvReader readCsvFile(String filePath
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/363#issuecomment-72939364
Looks good.
Besides the typos and inline comments, you could also move the
`HadoopInputFormatTest` and the `HadoopIOFormatsITCase` to flink-java and
flink-tests
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/202#issuecomment-72547127
Asking others to implement the standard example programs has worked quite
well to identify issues with new APIs. How about we look for people who try
out the API
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/338#issuecomment-72561341
+1
@rmetzger Can you create a JIRA for this?
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/347#discussion_r23914766
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/operators/DataSink.java ---
@@ -83,6 +93,107 @@ public DataSink(DataSet<T> data, OutputFormat<T>
Github user fhueske closed the pull request at:
https://github.com/apache/flink/pull/316
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/265#issuecomment-72613154
The API was just extended, but the parsing logic for strings changed (see
PR description). So programs that relied on the previous way of parsing will
fail now
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/360
[FLINK1443] Add support for replicating input formats.
InputFormats can be wrapped in a ReplicatingInputFormat, which ensures
that the full input of the wrapped input format is read in each
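The replication idea can be illustrated with a toy source abstraction (a conceptual sketch only — `SimpleSource` and `ReplicatingSource` are made-up names, not Flink's actual interfaces):

```java
import java.util.Arrays;
import java.util.List;

// Toy illustration of replication: a normal source would serve each
// parallel instance one slice of the data, while the replicating wrapper
// ignores the split assignment and serves every instance the full input.
interface SimpleSource<T> {
    List<T> readSplit(int splitId, int numSplits);
}

class ReplicatingSource<T> implements SimpleSource<T> {
    private final List<T> data;

    ReplicatingSource(List<T> data) {
        this.data = data;
    }

    @Override
    public List<T> readSplit(int splitId, int numSplits) {
        return data; // every parallel instance reads the complete input
    }

    public static void main(String[] args) {
        ReplicatingSource<Integer> src =
            new ReplicatingSource<>(Arrays.asList(1, 2, 3));
        // both "parallel instances" see all three elements
        System.out.println(src.readSplit(0, 2).size() + " "
            + src.readSplit(1, 2).size());
    }
}
```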
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/350#discussion_r24167714
--- Diff: docs/programming_guide.md ---
@@ -2398,6 +2399,61 @@ of a function, or use the `withParameters(...)`
method to pass in a configuratio
[Back
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/350#discussion_r24168441
--- Diff: docs/programming_guide.md ---
@@ -2398,6 +2399,61 @@ of a function, or use the `withParameters(...)`
method to pass in a configuratio
[Back
Github user fhueske closed the pull request at:
https://github.com/apache/flink/pull/360
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/360#issuecomment-73025884
Merged as a19b4a02bfa5237e0dcd2b264da36229546f23c0
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/369#issuecomment-73251849
Definitely!
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/369#issuecomment-73251309
Thanks for the fix!
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/372#issuecomment-73387496
I think it would be nice to have some kind of hierarchical structure of the
output such as:
`$sinkName:$taskId $outputValue`
That would give the name of the sink
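The suggested `$sinkName:$taskId $outputValue` scheme boils down to simple string prefixing; a minimal self-contained sketch (the sink name and task id are hypothetical example values, not Flink API):

```java
// Minimal sketch of the suggested print format: each output line is
// prefixed with the sink's name and the id of the subtask that produced
// it. "wordCounts" and the task id are hypothetical example values.
class PrefixedPrint {
    static String format(String sinkName, int taskId, Object value) {
        return sinkName + ":" + taskId + " " + value;
    }

    public static void main(String[] args) {
        // e.g. sink "wordCounts", parallel subtask 2
        System.out.println(format("wordCounts", 2, "(hello,4)"));
    }
}
```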
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/372#issuecomment-73417227
Sounds good to me!
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-73388209
Looks good in general.
You need to make sure, though, that you obey the object-reuse settings.
That basically means you need to pay attention
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/265#issuecomment-72501228
I added documentation. Any objections against merging this?
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/347#issuecomment-72501446
@rmetzger thanks for the feedback. I addressed your comment.
Plan to merge this tomorrow, unless somebody objects.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/355#issuecomment-72503732
+1 for merging.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/287#issuecomment-70112509
LGTM
Will do some tests, clean-up and merge if everything is fine
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/311#issuecomment-70517390
IMO, users should only be allowed to set semantic properties through field
expression strings. There should be no need to implement one's own
SemanticProperty class
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/314#issuecomment-70519951
Good to merge
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/311#discussion_r23070484
--- Diff:
flink-compiler/src/main/java/org/apache/flink/compiler/dag/BinaryUnionNode.java
---
@@ -266,4 +268,44 @@ public void computeOutputEstimates
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/311#discussion_r23073054
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/typeutils/PojoTypeInfo.java
---
@@ -45,6 +46,16 @@
*/
public class PojoTypeInfo<T>
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/311#issuecomment-70236149
Thanks for the review!
Proposed names for constant field semantic properties:
* constant fields (current)
* unmodified fields
* forwarded fields
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/316
[FLINK-1369] [types] Add support for Subclasses, Interfaces, Abstract
Classes
This PR rebases PR #236 onto the current master.
Some tests were failing and I had a closer look. The original PR
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-73914381
Oh, just saw that you updated your PR.
Won't open a PR. You can have a look at my branch here:
https://github.com/fhueske/flink/tree/chained_all_reduce
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/381#issuecomment-74908224
any further comment on this PR?
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/411
[FLINK-1466] Add HCatInputFormats to read from HCatalog tables.
Right now the Flink-tuple mode is restricted to primitive types (no ARRAY,
STRUCT, and MAP types) and the max Java/Scala tuple width
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/411#issuecomment-74894556
@rmetzger Thanks for the feedback.
Added support for complex types to the Flink tuple mode and tested it on a
local cluster setup.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/343#issuecomment-71760881
+1 good to merge
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/347
[FLINK-1105] Add support for locally sorted output
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/fhueske/flink locallySortedOutput
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/337#issuecomment-71378388
LGTM
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/236#issuecomment-71436626
@aljoscha Have a look at #316 where I took this PR, rebased it, and fixed
some problems with Pojo types.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/301#issuecomment-71440329
No worries ;-) I understand that discussing such a trivial feature feels
like a waste of time. Unfortunately these are the features that are easy to
comment
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/301#issuecomment-71435854
Well, you would have saved everybody's time if you had made these
requirements clear from the beginning. Besides, your first two versions didn't
comply with these new
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/301#issuecomment-71433679
Hmmm, using the String pattern seems to be much more comfortable for users,
no?
If a user wants to have the data written out with some kind of filename
pattern
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/264#issuecomment-71285644
That's a good point. I copied the code from the DelimitedInputFormat, which
allows specifying the charset for the record delimiter.
So if we go with
1. we should
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/335#issuecomment-72359922
One more thing ;-)
Did we collect ICLAs from all people contributing significant parts to
Gelly?
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/379
[FLINK-1444][api-extending] Add support for split data properties on data
sources
This pull request adds support for declaring global and local properties
for input splits.
You can merge this pull
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/375#discussion_r24327047
--- Diff:
flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/ExecutionJobVertex.java
---
@@ -260,15 +260,49 @@ public void
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/375#issuecomment-73505358
Only minor remarks.
Looks good otherwise.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/363#issuecomment-73022621
Hmm, yes. That's also a valid point.
But on the other hand, new users might not even be aware of the different
types of InputFormats. It would all look natural
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/503#issuecomment-84382628
Exactly, thanks @hsaputra
It would also be good to add some actual arguments to the discussion that
go beyond I like xxx more than yyy ;-)
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/476#issuecomment-78306554
I updated the PR and made the preference choice a bit more lightweight.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/478#issuecomment-78308404
Thanks for the fix!
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26215940
--- Diff:
flink-java/src/test/java/org/apache/flink/api/java/io/CsvInputFormatTest.java
---
@@ -684,4 +693,178 @@ private void testRemovingTrailingCR(String
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/410#issuecomment-77569539
I think it would definitely be good to have something like a job submission
queue that accepts jobs and executes them as soon as enough
resources become
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/458
[FLINK-1628] Fix partitioning properties for Joins and CoGroups.
Fix partitioning properties for Joins and CoGroups and some smaller bugs on
the way.
You can merge this pull request into a Git
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/466#issuecomment-78084575
Yes, I've got a couple of comments as well.
First of all, as @mxm said, I would propose to call this operator
``combine`` because it is a generalized combiner
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26203612
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -64,26 +70,45 @@
private transient int commentCount
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26205234
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -152,6 +177,38 @@ public void setFields(boolean[] sourceFieldMask
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26206307
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -234,9 +291,29 @@ public OUT readRecord(OUT reuse, byte[] bytes
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26206381
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -234,9 +291,29 @@ public OUT readRecord(OUT reuse, byte[] bytes
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26205969
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -152,6 +177,38 @@ public void setFields(boolean[] sourceFieldMask
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26206133
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -152,6 +177,38 @@ public void setFields(boolean[] sourceFieldMask
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26206329
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -234,9 +291,29 @@ public OUT readRecord(OUT reuse, byte[] bytes
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/482#issuecomment-78868258
It's true, the parameters are not strictly needed, but they don't hurt
either. In fact, I like to explicitly specify parameters. I do not see a
need to break the API
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/476
[FLINK-1683] [jobmanager] Fix scheduling preference choice for non-unary
execution tasks
Fix validated on a cluster setup.
You can merge this pull request into a Git repository by running
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/466#discussion_r26135427
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/operators/GroupReducePartialOperator.java
---
@@ -0,0 +1,229 @@
+/*
+ * Licensed
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r27018880
--- Diff:
flink-scala/src/main/scala/org/apache/flink/api/scala/ExecutionEnvironment.scala
---
@@ -223,8 +224,11 @@ class ExecutionEnvironment(javaEnv
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/525#issuecomment-85499509
Yes, I agree. The fix is more conservative than necessary.
I think we can safely relax it if we make the rule for forwarded fields on
group-wise operators as follows
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r27018277
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -52,103 +50,86 @@
public static final String
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r27018316
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -52,103 +50,86 @@
public static final String
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/426#issuecomment-85455383
Very good! Let me know when you want me to have a look again :-)
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r27018151
--- Diff:
flink-scala/src/main/java/org/apache/flink/api/scala/operators/ScalaCsvInputFormat.java
---
@@ -98,98 +123,66 @@ public void setFields(int
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/426#issuecomment-85447697
I see the issue with the non-deterministic field order and FLINK-1665 as
follows. Both FLINK-1665 and Option 3 solve the problem of non-deterministic
field order
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r27018812
--- Diff:
flink-scala/src/main/java/org/apache/flink/api/scala/operators/ScalaCsvInputFormat.java
---
@@ -19,66 +19,91 @@
package
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r27018473
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -52,103 +50,86 @@
public static final String
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r27018597
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -52,103 +50,86 @@
public static final String
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/532
[FLINK-1776] Add offsets to field indexes of semantic properties for
operators with key selectors
You can merge this pull request into a Git repository by running:
$ git pull https
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/525#issuecomment-85531186
Do you think this rule is easy enough for users?
It would make the handling consistent for all group-wise operators.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/426#issuecomment-85866221
Cool, thanks. Will have a look shortly.
Did you rebase to the latest master? We had a few build issues with the
master branch a few days ago
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/426#issuecomment-86011531
Sure, no problem :-)
Can I check it now or do you need a bit more time?
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/426#issuecomment-86079595
@chiwanpark excellent job, thanks!
Will merge it after a final round of Travis tests passed.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/491#issuecomment-86199581
I had a look at the Travis logs and it seems that the build failures are
not related to your change.
We had a few issues with build stability on the master branch
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/525#issuecomment-86400852
I updated the PR as discussed:
- GlobalProperties are filtered with the user-specified semantic properties.
- LocalProperties are filtered with forwarded field
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/426#issuecomment-84977748
Hi @chiwanpark
Thanks for updating the PR! :-)
I was gone for a few days. Will have a look at your PR shortly.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/453#issuecomment-77254599
Thanks @uce and @hsaputra! I addressed your comments and rebased.
Will merge tomorrow if nobody raises a flag.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/426#issuecomment-77272091
How are fields in the CSV file mapped to POJO fields? I assume it is the
order of fields in the POJO type information, right? Is that order the same as
in the POJO
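One way to make the column-to-field mapping deterministic is to have the user state the field order explicitly; a self-contained sketch of that idea (plain reflection with a hypothetical `Person` POJO, not Flink's actual CsvInputFormat):

```java
import java.lang.reflect.Field;

// Sketch: map CSV columns to POJO fields using an explicit, user-given
// field-name order, so the mapping never depends on reflection order.
class PojoCsvMapper {
    static <T> T map(String line, Class<T> type, String... fieldOrder)
            throws Exception {
        String[] cols = line.split(",");
        T obj = type.getDeclaredConstructor().newInstance();
        for (int i = 0; i < fieldOrder.length; i++) {
            Field f = type.getDeclaredField(fieldOrder[i]);
            f.setAccessible(true);
            if (f.getType() == int.class) {
                f.setInt(obj, Integer.parseInt(cols[i].trim()));
            } else {
                f.set(obj, cols[i].trim());
            }
        }
        return obj;
    }

    static class Person {          // hypothetical example POJO
        public String name;
        public int age;
    }

    public static void main(String[] args) throws Exception {
        Person p = map("Alice, 42", Person.class, "name", "age");
        System.out.println(p.name + " " + p.age);
    }
}
```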
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r25846058
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -64,15 +66,25 @@
private transient int commentCount
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/421#issuecomment-75564012
Thanks for the detailed response.
I am not sure how helpful it is to show three random TMs (incl. a shuffling
button to show other random ones). I think
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/421#issuecomment-75578714
I am not sure about that. Would you like to scroll through, say, 100 detailed
charts where only 5 fit on a screen to check whether there is one or more
misbehaving nodes
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/429#issuecomment-75310637
Btw, you can update a PR by pushing to your remote branch; you don't need
to close it and open a new PR. If you want to change previous commits (including
commit message
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/540#discussion_r27329656
--- Diff:
flink-clients/src/main/java/org/apache/flink/client/program/Client.java ---
@@ -68,7 +68,7 @@
private final Optimizer compiler
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/541
[FLINK-1664] Adds checks if selected sort key is sortable
- Adds checks if a sort key can actually be sorted.
- The POJO type is defined as non-sortable, because an order would depend
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/530#issuecomment-85688151
@smarthi Thanks for the PR!
LGTM, except for a few lines where I spotted space indentations that do not
comply with our code style.
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/350#issuecomment-72098340
Very good addition to the documentation!
I also agree with Alex (see FLINK-1422) that we should describe how functions
can be configured via the constructor, i.e., that function
GitHub user fhueske opened a pull request:
https://github.com/apache/flink/pull/525
[FLINK-1656] Filter ForwardedField properties for group-at-a-time operators
in Optimizer
Restricts forward field information for group-wise operators.
- For `GroupReduce`, `GroupCombine
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26991431
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/io/CsvInputFormat.java ---
@@ -234,9 +301,23 @@ public OUT readRecord(OUT reuse, byte[] bytes
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26992871
--- Diff:
flink-java/src/test/java/org/apache/flink/api/java/io/CsvInputFormatTest.java
---
@@ -684,4 +693,249 @@ private void testRemovingTrailingCR(String
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/426#discussion_r26993897
--- Diff:
flink-scala/src/main/scala/org/apache/flink/api/scala/ExecutionEnvironment.scala
---
@@ -247,16 +252,27 @@ class ExecutionEnvironment(javaEnv