Github user shijinkui closed the pull request at:
https://github.com/apache/flink/pull/2876
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3211
cc @rmetzger
---
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3257
[FLINK-5705] [WebMonitor] webmonitor request/response use UTF-8 expli…
QueryStringDecoder and HttpPostRequestDecoder use UTF-8 defined in flink.
Response set content-encoding header with utf
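The fix described above boils down to never relying on the platform default charset when decoding request data. A minimal JDK-only sketch of the idea, using `URLDecoder` as a stand-in for Netty's `QueryStringDecoder` (which likewise accepts an explicit `Charset`); the class name is illustrative:

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;

public class Utf8Decode {
    public static void main(String[] args) {
        // "中文" percent-encoded as UTF-8 bytes
        String encoded = "%E4%B8%AD%E6%96%87";
        // Pass the charset explicitly; relying on the platform default
        // is exactly the bug class this PR addresses.
        String value = URLDecoder.decode(encoded, StandardCharsets.UTF_8);
        System.out.println(value); // prints 中文
    }
}
```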
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3190
[FLINK-5546][build] When multiple users run test, /tmp/cacheFile conf…
Unit tests create files in java.io.tmpdir, which defaults to `/tmp` and has a
capacity limit.
Create temporary
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
> I would actually prefer to fix the tests, rather than re-assigning the
temp directory. All tests should use a random subdirectory in the temp
directory. It is quite convenient to do via JU
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
@StephanEwen I have used TemporaryFolder to replace creating File manually.
There are some tips:
1. TemporaryFolder should invoke `create()` in setup manually
2. in `shutdown()` should
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
@greghogan That makes sense. Let JUnit create the root dir and delete the
temporary dir recursively.
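The approach discussed here (a uniquely named subdirectory under the temp dir, deleted afterwards) can be sketched with plain `java.nio` in place of JUnit's `TemporaryFolder`; the file names are illustrative:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempDirPerTest {
    public static void main(String[] args) throws IOException {
        // A uniquely named subdirectory under java.io.tmpdir, so concurrent
        // users and test runs never collide on a fixed path like /tmp/cacheFile.
        Path testRoot = Files.createTempDirectory("flink-test-");
        Path cacheFile = testRoot.resolve("cacheFile");
        Files.write(cacheFile, "payload".getBytes("UTF-8"));
        System.out.println(new String(Files.readAllBytes(cacheFile), "UTF-8"));
        // Clean up what we created (TemporaryFolder does this recursively).
        Files.delete(cacheFile);
        Files.delete(testRoot);
    }
}
```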
---
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3211
[FLINK-5640][test] Configure the explicit Unit Test file suffix
There are four types of Unit Test file: *ITCase.java, *Test.java,
*ITSuite.scala, *Suite.scala
File name ending with "IT
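For reference, an explicit include list of this kind would sit in the surefire plugin configuration roughly as sketched below; the patterns come from the four file types listed above, and the exact plugin setup in Flink's poms may differ:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <includes>
      <!-- plain unit tests; the *ITCase/*ITSuite patterns would be
           bound to the integration-test phase instead -->
      <include>**/*Test.java</include>
      <include>**/*Suite.scala</include>
    </includes>
  </configuration>
</plugin>
```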
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
The "/tmp/cacheFile (Permission denied)" unit-test exception can be
reproduced:
1. on a linux env, `sudo - userA`, clone the flink code, start `mvn clean
test verify`
2. on
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
> The `${project.build.directory}` is not automatically cleaned up.
The `/tmp` directory is cleaned up naturally.
hi, Stephan, thanks for your reply.
When restarting the os, tmp
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2460
@wuchong @StephanEwen could you review this pull request?
Generating an independent runnable example jar will increase the size of the
flink distribution.
This table example jar is 13 MB or so
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
Got it. When running a single test with no temp dir created, it should use
the default java.io.tmpdir property.
Let me check that.
The base test class has a double check about the target
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
hi, @StephanEwen I have re-submitted this pull request based on the current
master branch, which has merged FLINK-5817.
---
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
> Can we just use the ${project.build.directory} as java.io.tmpdir ?
@wenlong88 Sorry for the late reply.
It's a good question. If we use `${project.build.directory}` without sub
directory `
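One way to realize the quoted suggestion is to point the forked test JVM's `java.io.tmpdir` into the build directory via surefire. A hedged sketch; the subdirectory name is illustrative, and the directory must exist before the fork starts:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- forked test JVMs write temp files under target/ instead of /tmp -->
    <argLine>-Djava.io.tmpdir=${project.build.directory}/tmp</argLine>
  </configuration>
</plugin>
```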
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
@StephanEwen The FileCacheDeleteValidationTest was fixed in FLINK-5817.
This PR has rolled that change back.
---
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3319
[FLINK-5806] TaskExecutionState toString format have wrong key
The key labeled jobID should be executionId in the string format.
- [X] General
- The pull request references the related
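A simplified reconstruction of the kind of bug being fixed (class and values here are illustrative, not Flink's actual fields): the toString template labeled the executionId value with the key `jobID`.

```java
// Illustrative sketch, not Flink's actual TaskExecutionState.
class TaskState {
    private final String jobID = "job-1";
    private final String executionId = "exec-7";

    @Override
    public String toString() {
        // before (wrong): "jobID=" + executionId
        // after (fixed): each value carries its own key
        return "TaskState jobID=" + jobID + ", executionId=" + executionId;
    }

    public static void main(String[] args) {
        System.out.println(new TaskState());
    }
}
```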
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3290#discussion_r100516387
--- Diff:
flink-runtime/src/main/java/org/apache/flink/runtime/jobmaster/JobManagerServices.java
---
@@ -116,12 +116,17 @@ public static
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2460
> The repackaging is good.
> Unfortunately this breaks the Table API tests, because they use one of
the examples.
>
> The Table API IT cases probably need to use their
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3190
> Here is a related issue: https://issues.apache.org/jira/browse/FLINK-5817
It sounds good. I want to change the default system property
`java.io.tmpdir` to be in the `target` direct
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3211
@StephanEwen Thanks for your quick review.
>`**/*Test.*`
This clearly shows what a unit test's file name looks like, just like
`integration-tests` does. If omitting the `include`, it me
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3211
@StephanEwen have finished that.
---
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3049
OK
---
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3132#discussion_r96585227
--- Diff:
flink-quickstart/flink-quickstart-scala/src/main/resources/archetype-resources/pom.xml
---
@@ -313,7 +313,6 @@ under the License
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3132
> @rmetzger Any reservations that the upgrade to the Apache Parent POM v18
has implications on release scripts, etc?
@StephanEwen There's no effect on the create_release_files.sh in
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3144
[FLINK-5543][Client] customCommandLine tips in CliFrontend
Only some code comments in the CliFrontend.
You can merge this pull request into a Git repository by running:
$ git pull https
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3132
[FLINK-5519] [build] scala-maven-plugin version all change to 3.2.2
1. scala-maven-plugin version changed to 3.2.2 in all modules
2. parent pom version changed from apache-14 to apache-18
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2459
@chiwanpark @StephanEwen thanks for the reply. Should dependency versions
that contain scala 2.10 be replaced with a property?
---
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2459
[FLINK-4561] replace all the scala version as a `scala.binary.version`
property
Replace all hard-coded scala versions (2.10) with a property
`scala.binary.version` defined in the root pom properties. default
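The pattern described (a single property in the root pom, referenced by every module-level artifact id) looks roughly like this; `scala.binary.version` is the property name from the PR title, and the dependency shown is just an example:

```xml
<!-- root pom: define the binary version once -->
<properties>
  <scala.binary.version>2.10</scala.binary.version>
</properties>

<!-- modules then reference the property instead of a hard-coded 2.10 -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
</dependency>
```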
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2458
[FLINK-4560] enforcer java version as 1.7
[FLINK-4560](https://issues.apache.org/jira/browse/FLINK-4560) enforce the
java version as 1.7
1. maven-enforcer-plugin adds a java version enforcement rule
2
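A minimal sketch of the enforcer rule in question, assuming the standard `requireJavaVersion` rule; the exact placement in Flink's root pom may differ:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-jdk-version</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- fail the build early on anything below Java 1.7 -->
          <requireJavaVersion>
            <version>[1.7,)</version>
          </requireJavaVersion>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```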
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2460
[FLINK-4562] table examples make an divided module in flink-examples
only moves the table example code to a separate module in flink-examples.
The Table API module shouldn't contain example code
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2459
@StephanEwen I can accept "don't change it unless it is broken".
Let me explain why I made such a bland PR with so many changes. I have a
habit, which is that the code need
Github user shijinkui closed the pull request at:
https://github.com/apache/flink/pull/2459
---
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2459
@StephanEwen Thanks for your reply; understood.
---
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2458
@StephanEwen OK.
---
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/2460#discussion_r80955533
--- Diff: flink-examples/flink-examples-table/pom.xml ---
@@ -0,0 +1,24 @@
+
+http://maven.apache.org/POM/4.0.0;
+xmlns:xsi="
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2460
@wuchong you are right. Different packages for scala/java can avoid
same-class conflicts
---
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/2541#discussion_r80963928
--- Diff:
flink-streaming-scala/src/main/scala/org/apache/flink/streaming/api/scala/StreamExecutionEnvironment.scala
---
@@ -124,7 +125,7 @@ class
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/2541#discussion_r80966754
--- Diff:
flink-streaming-scala/src/main/scala/org/apache/flink/streaming/api/scala/StreamExecutionEnvironment.scala
---
@@ -124,7 +125,7 @@ class
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/2460#discussion_r80955413
--- Diff:
flink-examples/flink-examples-table/src/main/scala/org/apache/flink/table/examples/StreamSQLExample.scala
---
@@ -15,7 +15,7 @@
* See
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2458
@greghogan Exactly, 3.0.3 will be OK. This limit is required by some module;
I forget where it is, sorry.
---
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2541
[FLINK-4669] scala api createLocalEnvironment() function add default
Configuration parameter
1. add Configuration as createLocalEnvironment's second default parameter
2. fix the mistaken scaladoc
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2458
Enforce java 7 before compiling. There is no conflict between source/target
and the enforced java version.
---
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2459
hi, @StephanEwen I have an idea in mind: if there are too many tricky bugs
and we follow "don't change it unless it is broken", can you imagine what the
code will look like after two years? Full of trick
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2458
hi, @StephanEwen
There are many differences between JDK 6 and JDK 7+. Compatibility with a
lower JDK version is nice, but there is better performance for Scala with
Lambda support, and JDK 7
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2429
[FLINK-4519] scala maxLineLength increased to 120
Because Scala function parameters are longer than in Java code, we can set
the maxLineLength to 120.
The pull request references
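In scalastyle's configuration the limit lives on the file line-length checker; a sketch of the changed setting, using scalastyle's standard checker class name (the surrounding config file is omitted):

```xml
<check level="error" class="org.scalastyle.file.FileLineLengthChecker" enabled="true">
  <parameters>
    <!-- increased to 120 for long Scala parameter lists -->
    <parameter name="maxLineLength"><![CDATA[120]]></parameter>
  </parameters>
</check>
```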
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2428
[FLINK-4517] scala code refactoring
the refactoring types:
1. a case class doesn't need `new`
2. a case block doesn't need `{ ... }`
3. match-case instead of isInstanceOf
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2429
@rehevkor5
https://builds.apache.org/job/flink-github-ci/274/org.apache.flink$flink-connector-kafka-0.8_2.10/testReport/org.apache.flink.streaming.connectors.kafka/Kafka08ITCase
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2428
@StephanEwen Thanks for your reply.
These code changes, such as case class usage, are not mandatory, but they
are scala best practice. We can update every kind of problem at the same
time
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2429
@StephanEwen Thanks for your reply.
That's OK. We can discuss first
---
Github user shijinkui closed the pull request at:
https://github.com/apache/flink/pull/2429
---
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/2541#discussion_r81031381
--- Diff:
flink-streaming-scala/src/main/scala/org/apache/flink/streaming/api/scala/StreamExecutionEnvironment.scala
---
@@ -124,7 +125,7 @@ class
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2460
fixed it.
---
Github user shijinkui closed the pull request at:
https://github.com/apache/flink/pull/2428
---
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2875
[FLINK-5168] Scaladoc annotation link use [[]] instead of {@link}
`{@link StreamExecutionEnvironment#readFile(FileInputFormat, String,
FileProcessingMode, long
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2874
[FLINK-5167] StreamExecutionEnvironment set function return `this` in…
StreamExecutionEnvironment's set functions return `this` instead of void.
For example:
public void
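The change being proposed, sketched on a made-up config class (names are illustrative, not the real StreamExecutionEnvironment API): setters return `this` instead of `void`, which enables call chaining.

```java
// Sketch of the fluent-setter pattern; class and fields are hypothetical.
class EnvConfig {
    private int parallelism;
    private long bufferTimeout;

    EnvConfig setParallelism(int p) {
        this.parallelism = p;
        return this; // was `void`; returning this enables chaining
    }

    EnvConfig setBufferTimeout(long millis) {
        this.bufferTimeout = millis;
        return this;
    }

    @Override
    public String toString() {
        return "EnvConfig(parallelism=" + parallelism
                + ", bufferTimeout=" + bufferTimeout + ")";
    }

    public static void main(String[] args) {
        // Chained configuration, possible only because setters return this.
        EnvConfig cfg = new EnvConfig().setParallelism(4).setBufferTimeout(100L);
        System.out.println(cfg);
    }
}
```

Existing call sites keep compiling since the return value can be ignored, but changing `void` to a class type is still a binary-incompatible signature change for Java callers, which is why the thread tracks it on the API-breaking-changes list.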
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/2876
[FLINK-5166] TextInputFormatTest.testNestedFileRead
- [x] General
- The pull request references the related JIRA issue ("[FLINK-XXX] Jira
title text")
- The pu
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2874
> Can you add them to this list here? That is where we collect all
API-breaking changes that we want to add.
Done.
---
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2876
> The test should be changed to use more specific folder names including a
random component. If the directory we are trying to create already exists we
should not delete everything there (since
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2541
> In general, I think it is good to not have too many method options, but
only those that add significant functionality. For the method that starts the
WebUI, that is the case, for the met
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2541
@StephanEwen review it again please.
---
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2541
@StephanEwen have any improvement needed?
---
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/2875#discussion_r89750780
--- Diff:
flink-libraries/flink-gelly-scala/src/main/scala/org/apache/flink/graph/scala/Graph.scala
---
@@ -294,7 +294,7 @@ object Graph
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2541
@StephanEwen I've added a new function named `createCustomLocalEnv`. Sorry
for the late ack.
---
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3049#discussion_r95276351
--- Diff: tools/create_release_files.sh ---
@@ -66,16 +66,19 @@ fi
GPG_PASSPHRASE=${GPG_PASSPHRASE:-XXX}
GPG_KEY=${GPG_KEY:-XXX}
GIT_AUTHOR
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3049
> Before merging this, we would definitely have to run the script as well.
I did not do this and hence I would wait with merging this.
@uce @rmetzger I used this script to build a local distribut
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/2876#discussion_r90246314
--- Diff:
flink-java/src/test/java/org/apache/flink/api/java/io/TextInputFormatTest.java
---
@@ -90,7 +91,9 @@ public void testSimpleRead
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3051
Flink 5399
Add checkpointId and triggerTime to TriggerSavepointSuccess.
We can record the history of triggered checkpoints outside the Flink system.
- [X] General
- The pull request
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3047
Same as FLINK-4861. Closing it.
---
Github user shijinkui closed the pull request at:
https://github.com/apache/flink/pull/3047
---
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3047
[FLINK-5396] [Build System] flink-dist replace scala version in opt.x…
flink-dist is configured to replace the scala version in bin.xml, but not in
opt.xml
- [X] General
- The pull request
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3049
[FLINK-5395] [Build System] support locally build distribution by script
create_release_files.sh
create_release_files.sh builds flink releases only. It's hard to build a
custom local Flink
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3628
> I would also suggest to run an example at least once to make sure you
don't break anything.
You are right. Only move the example files
---
GitHub user shijinkui opened a pull request:
https://github.com/apache/flink/pull/3628
[FLINK-6201][example] move python example files from resources to the
examples
Python examples in the resources dir are not suitable there. Move them to
the examples/python dir.
- [X] General
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3609#discussion_r108059875
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamJoin.scala
---
@@ -0,0 +1,241
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3609#discussion_r108059835
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamJoin.scala
---
@@ -0,0 +1,241
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2460
> I agree with @wuchong , that we should follow the example of Gelly and
add the flink-table-examples JAR file to the opt folder.
@fhueske Thanks for your review.
IMO, direct
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/2460
ping @wuchong @fhueske @twalthr
---
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3540
> Why exactly is it a problem if the rat-plugin checks the tools directory?
There is no need to check the temporary flink project in tools, because it
will cost extra time when bu
Github user shijinkui commented on the issue:
https://github.com/apache/flink/pull/3540
> I am a bit confused... I think there is no tools/flink* directory that
would need an exclusion...
When executing `tools/create_release_files.sh`, it clones the flink project
from apa
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3386#discussion_r106590692
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/UnboundedEventTimeOverProcessFunction.scala
---
@@ -0,0
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3386#discussion_r106590822
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/UnboundedEventTimeOverProcessFunction.scala
---
@@ -0,0
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3386#discussion_r106590304
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/AggregateUtil.scala
---
@@ -91,6 +91,35 @@ object
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3386#discussion_r106590402
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/AggregateUtil.scala
---
@@ -91,6 +91,35 @@ object
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3386#discussion_r106592513
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/UnboundedEventTimeOverProcessFunction.scala
---
@@ -0,0
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106599719
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamOverAggregate.scala
---
@@ -119,6 +154,64
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106601075
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/AggregateUtil.scala
---
@@ -785,7 +785,7 @@ object
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106411491
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamOverAggregate.scala
---
@@ -136,13 +229,13
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106604502
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/DataStreamProcTimeAggregateWindowFunction.scala
---
@@ -0,0
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106605103
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/DataStreamProcTimeAggregateGlobalWindowFunction.scala
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106602970
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/DataStreamProcTimeAggregateWindowFunction.scala
---
@@ -0,0
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106599975
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamOverAggregate.scala
---
@@ -119,6 +154,64
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106602209
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/DataStreamProcTimeAggregateWindowFunction.scala
---
@@ -0,0
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106605456
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/DataStreamProcTimeAggregateGlobalWindowFunction.scala
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106605509
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/DataStreamProcTimeAggregateGlobalWindowFunction.scala
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106600464
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamOverAggregate.scala
---
@@ -119,6 +154,64
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106600097
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamOverAggregate.scala
---
@@ -119,6 +154,64
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3550#discussion_r106411557
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamOverAggregate.scala
---
@@ -191,3 +287,31
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3386#discussion_r106592195
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/UnboundedEventTimeOverProcessFunction.scala
---
@@ -0,0
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3386#discussion_r106590214
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/nodes/datastream/DataStreamOverAggregate.scala
---
@@ -159,6 +167,46
Github user shijinkui commented on a diff in the pull request:
https://github.com/apache/flink/pull/3386#discussion_r106589672
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/aggregate/UnboundedEventTimeOverProcessFunction.scala
---
@@ -0,0