[GitHub] [flink] flinkbot edited a comment on issue #8706: [FLINK-12814][sql-client] Support a traditional and scrolling view of…

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #8706: [FLINK-12814][sql-client] Support a 
traditional and scrolling view of…
URL: https://github.com/apache/flink/pull/8706#issuecomment-522740816
 
 
   ## CI report:
   
   * f4a31e789e78ba3d5ab18ce50c9e8e697d3141d1 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123787386)
   * 115a3d477aed92a27c71e2832357ce2e47682200 : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123990434)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-13790) Support -e option with a sql script file as input

2019-08-20 Thread Zhenghua Gao (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-13790?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911941#comment-16911941
 ] 

Zhenghua Gao commented on FLINK-13790:
--

[~jark] 

-e means executing SQL from the command line, without entering the interactive 
interface.

sql-client.sh embedded -e "query-string" would execute the SQL query, print the 
results, and exit.

> Support -e option with a sql script file as input
> -
>
> Key: FLINK-13790
> URL: https://issues.apache.org/jira/browse/FLINK-13790
> Project: Flink
>  Issue Type: Sub-task
>  Components: Command Line Client
>Reporter: Bowen Li
>Assignee: Zhenghua Gao
>Priority: Major
> Fix For: 1.10.0
>
>
> We expect users to run SQL directly on the command line. Something like: 
> sql-client embedded -f "query in string", which will execute the given file 
> without entering interactive mode.
> This is related to FLINK-12828.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] flinkbot edited a comment on issue #9450: [FLINK-13711][sql-client] Hive array values not properly displayed in…

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9450: [FLINK-13711][sql-client] Hive array 
values not properly displayed in…
URL: https://github.com/apache/flink/pull/9450#issuecomment-521552936
 
 
   ## CI report:
   
   * c9d99f2866f281298f4217e9ce7543732bece2f8 : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123334919)
   * 671aa2687e3758d16646c6fbf58e4cc486328a38 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123456040)
   * 5c25642609614012a78142672e4e11f0b028e2a8 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123488890)
   * 52289430cf4e1891b285a43d2625e908b3f2cfdf : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123659343)
   * b8ab98a90423306dd0527e36875d4b503eb5b5ec : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123707256)
   * 66d92f599e39ba430324af8b05218ad4a2227016 : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123988757)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] wuchong commented on issue #9362: [FLINK-13354] [docs] Add documentation for how to use blink planner

2019-08-20 Thread GitBox
wuchong commented on issue #9362: [FLINK-13354] [docs] Add documentation for 
how to use blink planner
URL: https://github.com/apache/flink/pull/9362#issuecomment-523288681
 
 
   Hi @twalthr , this pull request looks good from my side. It would be great if 
you could take a look. 
   
   Another thought: the "how to use blink planner" content is mixed into the quite 
large "Concepts & Common API" page. I think we could extract the "Differences 
Between the Two Planners" and "How to specify planner" (i.e. "Create a 
TableEnvironment") sections to the Table API homepage. This would give the blink 
planner more exposure and make it easier to find how to enable it (dependencies and APIs).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] TisonKun commented on a change in pull request #9230: [FLINK-13430][build] Configure sending travis build notifications to bui...@flink.apache.org

2019-08-20 Thread GitBox
TisonKun commented on a change in pull request #9230: [FLINK-13430][build] 
Configure sending travis build notifications to bui...@flink.apache.org
URL: https://github.com/apache/flink/pull/9230#discussion_r315990280
 
 

 ##
 File path: 
flink-core/src/main/java/org/apache/flink/api/common/accumulators/Accumulator.java
 ##
 @@ -1,20 +1,4 @@
-/*
 
 Review comment:
   Aha, I think we will get back to the thread about reducing build time :-)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-13790) Support -e option with a sql script file as input

2019-08-20 Thread Jark Wu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-13790?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911930#comment-16911930
 ] 

Jark Wu commented on FLINK-13790:
-

What does {{-e}} mean? 

> Support -e option with a sql script file as input
> -
>
> Key: FLINK-13790
> URL: https://issues.apache.org/jira/browse/FLINK-13790
> Project: Flink
>  Issue Type: Sub-task
>  Components: Command Line Client
>Reporter: Bowen Li
>Assignee: Zhenghua Gao
>Priority: Major
> Fix For: 1.10.0
>
>
> We expect users to run SQL directly on the command line. Something like: 
> sql-client embedded -f "query in string", which will execute the given file 
> without entering interactive mode.
> This is related to FLINK-12828.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] flinkbot edited a comment on issue #9469: [FLINK-13757][Docs]update logical fucntions for this expression `boolean IS NOT TRUE`

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9469: [FLINK-13757][Docs]update logical 
fucntions for this expression `boolean IS NOT TRUE`
URL: https://github.com/apache/flink/pull/9469#issuecomment-522203394
 
 
   ## CI report:
   
   * 6a793ee313851c7e8afc09f4b77534f00e23017c : CANCELED 
[Build](https://travis-ci.com/flink-ci/flink/builds/123584918)
   * b8ecce2c9e2642c295c4770123443dd796e5d5da : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123585169)
   * af1e3e3f14fc2188ea0d592486fb4c115741f8ff : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123988747)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] wuchong commented on a change in pull request #9230: [FLINK-13430][build] Configure sending travis build notifications to bui...@flink.apache.org

2019-08-20 Thread GitBox
wuchong commented on a change in pull request #9230: [FLINK-13430][build] 
Configure sending travis build notifications to bui...@flink.apache.org
URL: https://github.com/apache/flink/pull/9230#discussion_r315988231
 
 

 ##
 File path: 
flink-core/src/main/java/org/apache/flink/api/common/accumulators/Accumulator.java
 ##
 @@ -1,20 +1,4 @@
-/*
 
 Review comment:
   Thanks. I will revert this before merging. This change is only there to make 
Travis fail fast. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] lirui-apache commented on a change in pull request #9457: [FLINK-13741][table] "SHOW FUNCTIONS" should include Flink built-in functions' names

2019-08-20 Thread GitBox
lirui-apache commented on a change in pull request #9457: [FLINK-13741][table] 
"SHOW FUNCTIONS" should include Flink built-in functions' names
URL: https://github.com/apache/flink/pull/9457#discussion_r315982380
 
 

 ##
 File path: 
flink-table/flink-sql-client/src/test/java/org/apache/flink/table/client/gateway/local/LocalExecutorITCase.java
 ##
 @@ -222,7 +222,7 @@ public void testListUserDefinedFunctions() throws 
Exception {
 
 	final List<String> actualTables = 
executor.listUserDefinedFunctions(session);
 
-	final List<String> expectedTables = 
Arrays.asList("aggregateUDF", "tableUDF", "scalarUDF");
+	final List<String> expectedTables = 
Arrays.asList("aggregateUDF", "scalarUDF", "tableUDF");
 
 Review comment:
   I think we should either mention in the JavaDoc that the returned list is 
sorted, or the test should verify against a set?
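   
   For illustration, a minimal sketch of the set-based variant (a suggestion only, 
assuming the existing `executor` and `session` fixtures of this test):
   
   ```java
   // Assumes java.util.{Arrays, HashSet, List, Set} and org.junit.Assert.assertEquals
   // are imported. Comparing as sets removes the dependency on the ordering
   // returned by listUserDefinedFunctions.
   final List<String> actual = executor.listUserDefinedFunctions(session);
   final Set<String> expected =
           new HashSet<>(Arrays.asList("aggregateUDF", "tableUDF", "scalarUDF"));
   assertEquals(expected, new HashSet<>(actual));
   ```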


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9490: [hotfix][python] add System.exit() at the end of PythonGatewayServer to ensure the JVM will exit if its parent process dies.

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9490: [hotfix][python] add System.exit() at 
the end of PythonGatewayServer to ensure the JVM will exit if its parent 
process dies.
URL: https://github.com/apache/flink/pull/9490#issuecomment-522958012
 
 
   ## CI report:
   
   * 058163e892c68cb753ebe21b4ab458bcd461ccce : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123867469)
   * 6b9bb224263c0ba0d61e12e62d7bb618a0fb8c8a : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123987631)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (FLINK-13807) Flink-avro unit tests fails if the character encoding in the environment is not default to UTF-8

2019-08-20 Thread TisonKun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-13807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

TisonKun updated FLINK-13807:
-
Attachment: patch.diff

> Flink-avro unit tests fails if the character encoding in the environment is 
> not default to UTF-8
> 
>
> Key: FLINK-13807
> URL: https://issues.apache.org/jira/browse/FLINK-13807
> Project: Flink
>  Issue Type: Bug
>Affects Versions: 1.8.0
>Reporter: Ethan Li
>Priority: Minor
> Attachments: patch.diff
>
>
> On Flink release-1.8 branch:
> {code:java}
> [ERROR] Tests run: 12, Failures: 4, Errors: 0, Skipped: 0, Time elapsed: 4.81 
> s <<< FAILURE! - in 
> org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest
> [ERROR] testSimpleAvroRead[Execution mode = 
> CLUSTER](org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest)  
> Time elapsed: 0.438 s  <<< FAILURE!
> java.lang.AssertionError: 
> Different elements in arrays: expected 2 elements and received 2
> files: [/tmp/junit5386344396421857812/junit6023978980792200274.tmp/4, 
> /tmp/junit5386344396421857812/junit6023978980792200274.tmp/2, 
> /tmp/junit5386344396421857812/junit6023978980792200274.tmp/1, 
> /tmp/junit5386344396421857812/junit6023978980792200274.tmp/3]
>  expected: [{"name": "Alyssa", "favorite_number": 256, "favorite_color": 
> null, "type_long_test": null, "type_double_test": 123.45, "type_null_test": 
> null, "type_bool_test": true, "type_array_string": ["ELEMENT 1", "ELEMENT 
> 2"], "type_array_boolean": [true, false], "type_nullable_array": null, 
> "type_enum": "GREEN", "type_map": {"KEY 2": 17554, "KEY 1": 8546456}, 
> "type_fixed": null, "type_union": null, "type_nested": {"num": 239, "street": 
> "Baker Street", "city": "London", "state": "London", "zip": "NW1 6XE"}, 
> "type_bytes": {"bytes": 
> "\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
> 2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
> "type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
> 123456, "type_decimal_bytes": {"bytes": "\u0007?"}, "type_decimal_fixed": [7, 
> -48]}, {"name": "Charlie", "favorite_number": null, "favorite_color": "blue", 
> "type_long_test": 1337, "type_double_test": 1.337, "type_null_test": null, 
> "type_bool_test": false, "type_array_string": [], "type_array_boolean": [], 
> "type_nullable_array": null, "type_enum": "RED", "type_map": {}, 
> "type_fixed": null, "type_union": null, "type_nested": {"num": 239, "street": 
> "Baker Street", "city": "London", "state": "London", "zip": "NW1 6XE"}, 
> "type_bytes": {"bytes": 
> "\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
> 2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
> "type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
> 123456, "type_decimal_bytes": {"bytes": "\u0007?"}, "type_decimal_fixed": [7, 
> -48]}]
>  received: [{"name": "Alyssa", "favorite_number": 256, "favorite_color": 
> null, "type_long_test": null, "type_double_test": 123.45, "type_null_test": 
> null, "type_bool_test": true, "type_array_string": ["ELEMENT 1", "ELEMENT 
> 2"], "type_array_boolean": [true, false], "type_nullable_array": null, 
> "type_enum": "GREEN", "type_map": {"KEY 2": 17554, "KEY 1": 8546456}, 
> "type_fixed": null, "type_union": null, "type_nested": {"num": 239, "street": 
> "Baker Street", "city": "London", "state": "London", "zip": "NW1 6XE"}, 
> "type_bytes": {"bytes": 
> "\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
> 2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
> "type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
> 123456, "type_decimal_bytes": {"bytes": "\u0007??"}, "type_decimal_fixed": 
> [7, -48]}, {"name": "Charlie", "favorite_number": null, "favorite_color": 
> "blue", "type_long_test": 1337, "type_double_test": 1.337, "type_null_test": 
> null, "type_bool_test": false, "type_array_string": [], "type_array_boolean": 
> [], "type_nullable_array": null, "type_enum": "RED", "type_map": {}, 
> "type_fixed": null, "type_union": null, "type_nested": {"num": 239, "street": 
> "Baker Street", "city": "London", "state": "London", "zip": "NW1 6XE"}, 
> "type_bytes": {"bytes": 
> "\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
> 2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
> "type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
> 123456, "type_decimal_bytes": {"bytes": "\u0007??"}, "type_decimal_fixed": 
> [7, -48]}]
>   at 
> org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest.after(AvroTypeExtractionTest.java:76)
> {code}
> Comparing “expected” with 

[jira] [Commented] (FLINK-13807) Flink-avro unit tests fails if the character encoding in the environment is not default to UTF-8

2019-08-20 Thread TisonKun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-13807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911904#comment-16911904
 ] 

TisonKun commented on FLINK-13807:
--

Thanks for reporting this issue [~ethanli]!

[~Zentol] I think we can address this issue by always reading the file with 
the UTF-8 charset.

[~ethanli] could you try out the patch I have just attached? I tested it in my 
local environment and saw no conflicts.

I volunteer to follow up on this issue.
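
The general idea (an illustration only, not the attached patch.diff) is to pass an 
explicit charset instead of relying on the platform default, e.g.:

{code:java}
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class Utf8ReadExample {
    public static void main(String[] args) throws IOException {
        // Read the file as UTF-8 regardless of the JVM's file.encoding setting.
        try (BufferedReader reader =
                Files.newBufferedReader(Paths.get(args[0]), StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
{code}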

> Flink-avro unit tests fails if the character encoding in the environment is 
> not default to UTF-8
> 
>
> Key: FLINK-13807
> URL: https://issues.apache.org/jira/browse/FLINK-13807
> Project: Flink
>  Issue Type: Bug
>Affects Versions: 1.8.0
>Reporter: Ethan Li
>Priority: Minor
> Attachments: patch.diff
>
>
> On Flink release-1.8 branch:
> {code:java}
> [ERROR] Tests run: 12, Failures: 4, Errors: 0, Skipped: 0, Time elapsed: 4.81 
> s <<< FAILURE! - in 
> org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest
> [ERROR] testSimpleAvroRead[Execution mode = 
> CLUSTER](org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest)  
> Time elapsed: 0.438 s  <<< FAILURE!
> java.lang.AssertionError: 
> Different elements in arrays: expected 2 elements and received 2
> files: [/tmp/junit5386344396421857812/junit6023978980792200274.tmp/4, 
> /tmp/junit5386344396421857812/junit6023978980792200274.tmp/2, 
> /tmp/junit5386344396421857812/junit6023978980792200274.tmp/1, 
> /tmp/junit5386344396421857812/junit6023978980792200274.tmp/3]
>  expected: [{"name": "Alyssa", "favorite_number": 256, "favorite_color": 
> null, "type_long_test": null, "type_double_test": 123.45, "type_null_test": 
> null, "type_bool_test": true, "type_array_string": ["ELEMENT 1", "ELEMENT 
> 2"], "type_array_boolean": [true, false], "type_nullable_array": null, 
> "type_enum": "GREEN", "type_map": {"KEY 2": 17554, "KEY 1": 8546456}, 
> "type_fixed": null, "type_union": null, "type_nested": {"num": 239, "street": 
> "Baker Street", "city": "London", "state": "London", "zip": "NW1 6XE"}, 
> "type_bytes": {"bytes": 
> "\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
> 2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
> "type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
> 123456, "type_decimal_bytes": {"bytes": "\u0007?"}, "type_decimal_fixed": [7, 
> -48]}, {"name": "Charlie", "favorite_number": null, "favorite_color": "blue", 
> "type_long_test": 1337, "type_double_test": 1.337, "type_null_test": null, 
> "type_bool_test": false, "type_array_string": [], "type_array_boolean": [], 
> "type_nullable_array": null, "type_enum": "RED", "type_map": {}, 
> "type_fixed": null, "type_union": null, "type_nested": {"num": 239, "street": 
> "Baker Street", "city": "London", "state": "London", "zip": "NW1 6XE"}, 
> "type_bytes": {"bytes": 
> "\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
> 2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
> "type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
> 123456, "type_decimal_bytes": {"bytes": "\u0007?"}, "type_decimal_fixed": [7, 
> -48]}]
>  received: [{"name": "Alyssa", "favorite_number": 256, "favorite_color": 
> null, "type_long_test": null, "type_double_test": 123.45, "type_null_test": 
> null, "type_bool_test": true, "type_array_string": ["ELEMENT 1", "ELEMENT 
> 2"], "type_array_boolean": [true, false], "type_nullable_array": null, 
> "type_enum": "GREEN", "type_map": {"KEY 2": 17554, "KEY 1": 8546456}, 
> "type_fixed": null, "type_union": null, "type_nested": {"num": 239, "street": 
> "Baker Street", "city": "London", "state": "London", "zip": "NW1 6XE"}, 
> "type_bytes": {"bytes": 
> "\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
> 2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
> "type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
> 123456, "type_decimal_bytes": {"bytes": "\u0007??"}, "type_decimal_fixed": 
> [7, -48]}, {"name": "Charlie", "favorite_number": null, "favorite_color": 
> "blue", "type_long_test": 1337, "type_double_test": 1.337, "type_null_test": 
> null, "type_bool_test": false, "type_array_string": [], "type_array_boolean": 
> [], "type_nullable_array": null, "type_enum": "RED", "type_map": {}, 
> "type_fixed": null, "type_union": null, "type_nested": {"num": 239, "street": 
> "Baker Street", "city": "London", "state": "London", "zip": "NW1 6XE"}, 
> "type_bytes": {"bytes": 
> "\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
> 2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
> 

[GitHub] [flink] flinkbot edited a comment on issue #8706: [FLINK-12814][sql-client] Support a traditional and scrolling view of…

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #8706: [FLINK-12814][sql-client] Support a 
traditional and scrolling view of…
URL: https://github.com/apache/flink/pull/8706#issuecomment-522740816
 
 
   ## CI report:
   
   * f4a31e789e78ba3d5ab18ce50c9e8e697d3141d1 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123787386)
   * 115a3d477aed92a27c71e2832357ce2e47682200 : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123990434)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] TisonKun commented on a change in pull request #9230: [FLINK-13430][build] Configure sending travis build notifications to bui...@flink.apache.org

2019-08-20 Thread GitBox
TisonKun commented on a change in pull request #9230: [FLINK-13430][build] 
Configure sending travis build notifications to bui...@flink.apache.org
URL: https://github.com/apache/flink/pull/9230#discussion_r315977752
 
 

 ##
 File path: 
flink-core/src/main/java/org/apache/flink/api/common/accumulators/Accumulator.java
 ##
 @@ -1,20 +1,4 @@
-/*
 
 Review comment:
   Accidental removal, I think. Revert to pass the checker :-)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] TisonKun commented on a change in pull request #9230: [FLINK-13430][build] Configure sending travis build notifications to bui...@flink.apache.org

2019-08-20 Thread GitBox
TisonKun commented on a change in pull request #9230: [FLINK-13430][build] 
Configure sending travis build notifications to bui...@flink.apache.org
URL: https://github.com/apache/flink/pull/9230#discussion_r315977752
 
 

 ##
 File path: 
flink-core/src/main/java/org/apache/flink/api/common/accumulators/Accumulator.java
 ##
 @@ -1,20 +1,4 @@
-/*
 
 Review comment:
   Accidental removal, I think. Revert to pass the checker :-)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9469: [FLINK-13757][Docs]update logical fucntions for this expression `boolean IS NOT TRUE`

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9469: [FLINK-13757][Docs]update logical 
fucntions for this expression `boolean IS NOT TRUE`
URL: https://github.com/apache/flink/pull/9469#issuecomment-522203394
 
 
   ## CI report:
   
   * 6a793ee313851c7e8afc09f4b77534f00e23017c : CANCELED 
[Build](https://travis-ci.com/flink-ci/flink/builds/123584918)
   * b8ecce2c9e2642c295c4770123443dd796e5d5da : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123585169)
   * af1e3e3f14fc2188ea0d592486fb4c115741f8ff : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123988747)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9450: [FLINK-13711][sql-client] Hive array values not properly displayed in…

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9450: [FLINK-13711][sql-client] Hive array 
values not properly displayed in…
URL: https://github.com/apache/flink/pull/9450#issuecomment-521552936
 
 
   ## CI report:
   
   * c9d99f2866f281298f4217e9ce7543732bece2f8 : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123334919)
   * 671aa2687e3758d16646c6fbf58e4cc486328a38 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123456040)
   * 5c25642609614012a78142672e4e11f0b028e2a8 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123488890)
   * 52289430cf4e1891b285a43d2625e908b3f2cfdf : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123659343)
   * b8ab98a90423306dd0527e36875d4b503eb5b5ec : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123707256)
   * 66d92f599e39ba430324af8b05218ad4a2227016 : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123988757)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] hehuiyuan commented on issue #9469: [FLINK-13757][Docs]update logical fucntions for this expression `boolean IS NOT TRUE`

2019-08-20 Thread GitBox
hehuiyuan commented on issue #9469: [FLINK-13757][Docs]update logical fucntions 
for this expression `boolean IS NOT TRUE`
URL: https://github.com/apache/flink/pull/9469#issuecomment-523268556
 
 
   > @hehuiyuan @TisonKun Thanks a lot for the fix and review.
   > 
   > Besides, we should also fix the doc for java/python and scala.
   > 
   > Best, Hequn
   
   Hi, I have updated the docs for Java and Scala.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] hequn8128 commented on issue #9490: [hotfix][python] add System.exit() at the end of PythonGatewayServer to ensure the JVM will exit if its parent process dies.

2019-08-20 Thread GitBox
hequn8128 commented on issue #9490: [hotfix][python] add System.exit() at the 
end of PythonGatewayServer to ensure the JVM will exit if its parent process 
dies.
URL: https://github.com/apache/flink/pull/9490#issuecomment-523268450
 
 
   @WeiZhong94 Thanks a lot for the explanation. LGTM, merging...


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-11457) PrometheusPushGatewayReporter does not cleanup its metrics

2019-08-20 Thread lamber-ken (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-11457?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911889#comment-16911889
 ] 

lamber-ken commented on FLINK-11457:


hi, [~opwvhk]

In our production environment, we also met the same problem. If pushgateway could 
implement `TTL for pushed metrics` [1], it would be very useful. For now, we use 
an external scheduling system to check whether the flink job is alive or not, and 
then delete its metrics via pushgateway's REST API [2].

[1][https://github.com/prometheus/pushgateway/issues/19]

[2][https://github.com/prometheus/pushgateway#delete-method]
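
For reference, a rough sketch of such an external cleanup call (the host, port and 
job name below are placeholders; see the Pushgateway DELETE documentation linked 
in [2] for the exact endpoint and grouping labels):

{code:java}
import java.net.HttpURLConnection;
import java.net.URL;

public class PushgatewayCleanup {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: deletes all metrics pushed under the job name "myFlinkJob".
        URL url = new URL("http://localhost:9091/metrics/job/myFlinkJob");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("DELETE");
        int code = conn.getResponseCode(); // a 2xx response means the group was removed
        System.out.println("Pushgateway delete returned HTTP " + code);
        conn.disconnect();
    }
}
{code}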

> PrometheusPushGatewayReporter does not cleanup its metrics
> --
>
> Key: FLINK-11457
> URL: https://issues.apache.org/jira/browse/FLINK-11457
> Project: Flink
>  Issue Type: Bug
>  Components: Runtime / Metrics
>Reporter: Oscar Westra van Holthe - Kind
>Priority: Major
>
> When cancelling a job running on a yarn based cluster and then shutting down 
> the cluster, metrics on the push gateway are not deleted.
> My yarn-conf.yaml settings:
> {code:yaml}
> metrics.reporters: promgateway
> metrics.reporter.promgateway.class: 
> org.apache.flink.metrics.prometheus.PrometheusPushGatewayReporter
> metrics.reporter.promgateway.host: pushgateway.gcpstg.bolcom.net
> metrics.reporter.promgateway.port: 9091
> metrics.reporter.promgateway.jobName: PSMF
> metrics.reporter.promgateway.randomJobNameSuffix: true
> metrics.reporter.promgateway.deleteOnShutdown: true
> metrics.reporter.promgateway.interval: 30 SECONDS
> {code}
> What I expect to happen:
> * when running, the metrics are pushed to the push gateway to a separate 
> label per node (jobmanager/taskmanager)
> * when shutting down, the metrics are deleted from the push gateway
> This last bit does not happen.
> How the job is run:
> {code}flink run -m yarn-cluster -yn 5 -ys 2 -yst 
> "$INSTALL_DIRECTORY/app/psmf.jar"{code} 
> How the job is stopped:
> {code}
> YARN_APP_ID=$(yarn application -list | grep "PSMF" | awk '{print $1}')
> FLINK_JOB_ID=$(flink list -r -yid ${YARN_APP_ID} | grep "PSMF" | awk '{print 
> $4}')
> flink cancel -s "${SAVEPOINT_DIR%/}/" -yid "${YARN_APP_ID}" "${FLINK_JOB_ID}"
> echo "stop" | yarn-session.sh -id ${YARN_APP_ID}
> {code} 
> Is there anything I'm doing wrong? Anything I can do to help fix it?



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (FLINK-13787) PrometheusPushGatewayReporter does not cleanup TM metrics when run on kubernetes

2019-08-20 Thread lamber-ken (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-13787?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911888#comment-16911888
 ] 

lamber-ken commented on FLINK-13787:


hi, [~kaibo.zhou].

In our production environment, we also met the same problem. If pushgateway could 
implement `TTL for pushed metrics` [1], it would be very useful. For now, we use 
an external scheduling system to check whether the flink job is alive or not, and 
then delete its metrics via pushgateway's REST API [2].

[1][https://github.com/prometheus/pushgateway/issues/19]

[2][https://github.com/prometheus/pushgateway#delete-method]

 

> PrometheusPushGatewayReporter does not cleanup TM metrics when run on 
> kubernetes
> 
>
> Key: FLINK-13787
> URL: https://issues.apache.org/jira/browse/FLINK-13787
> Project: Flink
>  Issue Type: Bug
>  Components: Runtime / Metrics
>Affects Versions: 1.7.2, 1.8.1, 1.9.0
>Reporter: Kaibo Zhou
>Priority: Major
>
> I have run a flink job on kubernetes and use PrometheusPushGatewayReporter, I 
> can see the metrics from the flink jobmanager and taskmanager from the push 
> gateway's UI.
> When I cancel the job, I found the jobmanager's metrics disappear, but the 
> taskmanager's metrics still exist, even though I have set the 
> _deleteOnShutdown_ option to true.
> The configuration is:
> {code:java}
> metrics.reporters: "prom"
> metrics.reporter.prom.class: 
> "org.apache.flink.metrics.prometheus.PrometheusPushGatewayReporter"
> metrics.reporter.prom.jobName: "WordCount"
> metrics.reporter.prom.host: "localhost"
> metrics.reporter.prom.port: "9091"
> metrics.reporter.prom.randomJobNameSuffix: "true"
> metrics.reporter.prom.filterLabelValueCharacters: "true"
> metrics.reporter.prom.deleteOnShutdown: "true"
> {code}
>  
> Other people have also encountered this problem: 
> [https://stackoverflow.com/questions/54420498/flink-prometheus-push-gateway-reporter-delete-metrics-on-job-shutdown].
>   And another similar issue: FLINK-11457.
>  
> As prometheus is a very important metrics system on kubernetes, if we can solve 
> this problem, it will be beneficial for users monitoring their flink jobs.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] flinkbot edited a comment on issue #9490: [hotfix][python] add System.exit() at the end of PythonGatewayServer to ensure the JVM will exit if its parent process dies.

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9490: [hotfix][python] add System.exit() at 
the end of PythonGatewayServer to ensure the JVM will exit if its parent 
process dies.
URL: https://github.com/apache/flink/pull/9490#issuecomment-522958012
 
 
   ## CI report:
   
   * 058163e892c68cb753ebe21b4ab458bcd461ccce : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123867469)
   * 6b9bb224263c0ba0d61e12e62d7bb618a0fb8c8a : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123987631)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Assigned] (FLINK-13757) Document error for `logical functions`

2019-08-20 Thread Hequn Cheng (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-13757?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hequn Cheng reassigned FLINK-13757:
---

Assignee: hehuiyuan

> Document error for  `logical functions`
> ---
>
> Key: FLINK-13757
> URL: https://issues.apache.org/jira/browse/FLINK-13757
> Project: Flink
>  Issue Type: Wish
>  Components: Documentation
>Reporter: hehuiyuan
>Assignee: hehuiyuan
>Priority: Major
>  Labels: pull-request-available
> Attachments: image-2019-08-17-11-58-53-247.png
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> [https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/table/functions.html#logical-functions]
> False:
> |{{boolean IS NOT TRUE}}|Returns TRUE if _boolean_ is FALSE or UNKNOWN; 
> returns FALSE if _boolean_ is *FALSE*.|
> True:
> |{{boolean IS NOT TRUE}}|Returns TRUE if _boolean_ is FALSE or UNKNOWN; 
> returns FALSE if _boolean_ is *TRUE*.|
> [!image-2019-08-17-11-58-53-247.png!|https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/table/functions.html#logical-functions]



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (FLINK-13792) source and sink support manual rate limit

2019-08-20 Thread boshu Zheng (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-13792?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911879#comment-16911879
 ] 

boshu Zheng commented on FLINK-13792:
-

+1 for this feature :)

> source and sink support manual rate limit
> -
>
> Key: FLINK-13792
> URL: https://issues.apache.org/jira/browse/FLINK-13792
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / Common
>Affects Versions: 1.8.1
>Reporter: zzsmdfj
>Priority: Major
>
> Flink currently implements automatic flow control via back pressure, which is 
> efficient in most scenarios. But in some special scenarios we need 
> fine-grained flow control to avoid impacting other systems. For example: if I 
> have a window spanning days (a lot of data), calling the ProcessWindowFunction 
> when it triggers will produce a lot of data for the sink; if the sink writes to 
> a message queue, this can have a huge impact on the queue. So a sink rate 
> limiter would be friendly to external systems. A source rate limiter is 
> appropriate when a window operator accumulates a large amount of 
> historical data.
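
For illustration, a user-side sketch of what such a sink rate limiter could look 
like (this is not an existing Flink API; it assumes Guava's RateLimiter is on the 
classpath and that the actual write logic goes inside invoke):

{code:java}
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

import com.google.common.util.concurrent.RateLimiter;

/**
 * Sketch of a sink that throttles itself to a fixed number of records per second
 * before writing to an external system such as a message queue.
 */
public class RateLimitedSink<T> extends RichSinkFunction<T> {

    private final double recordsPerSecond;
    private transient RateLimiter rateLimiter;

    public RateLimitedSink(double recordsPerSecond) {
        this.recordsPerSecond = recordsPerSecond;
    }

    @Override
    public void open(Configuration parameters) {
        rateLimiter = RateLimiter.create(recordsPerSecond);
    }

    @Override
    public void invoke(T value, Context context) throws Exception {
        rateLimiter.acquire(); // blocks until a permit is available
        // ... write `value` to the external system (e.g. produce to the message queue) here
    }
}
{code}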



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] WeiZhong94 commented on issue #9490: [hotfix][python] add System.exit() at the end of PythonGatewayServer to ensure the JVM will exit if its parent process dies.

2019-08-20 Thread GitBox
WeiZhong94 commented on issue #9490: [hotfix][python] add System.exit() at the 
end of PythonGatewayServer to ensure the JVM will exit if its parent process 
dies.
URL: https://github.com/apache/flink/pull/9490#issuecomment-523264436
 
 
   @hequn8128  Thanks a lot for your review. This change is to make sure the 
JVM can exit after the Python process exits, even if there are still some threads 
running in the JVM. System.exit is the method provided to terminate the 
currently running JVM, and a test does not seem necessary to me, since it would 
essentially be testing the behavior of the JVM itself. Does that make sense to you?
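   
   A toy illustration of the point (this is not the PythonGatewayServer code): 
without `System.exit()`, any lingering non-daemon thread keeps the JVM alive 
after `main()` returns.
   
   ```java
   public class ExitExample {
       public static void main(String[] args) throws InterruptedException {
           // A non-daemon thread that never finishes on its own.
           Thread worker = new Thread(() -> {
               while (true) {
                   try {
                       Thread.sleep(1_000);
                   } catch (InterruptedException e) {
                       Thread.currentThread().interrupt();
                       return;
                   }
               }
           });
           worker.start();
   
           Thread.sleep(100); // pretend the parent (Python) process has gone away
           System.exit(0);    // without this call the JVM would keep running
       }
   }
   ```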


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9496: [FLINK-13011][build] Add the Build logic for Python API release package

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9496: [FLINK-13011][build] Add the Build 
logic for Python API release package
URL: https://github.com/apache/flink/pull/9496#issuecomment-523246052
 
 
   ## CI report:
   
   * 9bf7d60a41a45eab26d587e937cb8fd094983404 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123981982)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] TisonKun commented on a change in pull request #9493: FLINK-13797: Add missing format argument

2019-08-20 Thread GitBox
TisonKun commented on a change in pull request #9493: FLINK-13797: Add missing 
format argument
URL: https://github.com/apache/flink/pull/9493#discussion_r315964585
 
 

 ##
 File path: 
flink-filesystems/flink-hadoop-fs/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopFsFactory.java
 ##
 @@ -118,7 +118,12 @@ else if (flinkConfig != null) {
initUri = fsUri;
}
else {
-   LOG.debug("URI {} does not specify file system 
authority, trying to load default authority (fs.defaultFS)");
+   if (LOG.isDebugEnabled()) {
+   LOG.debug(
+   "URI {} does not specify file 
system authority, trying to load default authority (fs.defaultFS)",
+   fsUri.toString()
 
 Review comment:
   `.toString()` is not needed here.
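   
   In other words, a sketch of the suggested form (SLF4J's `{}` placeholder 
already calls `toString()` on the argument):
   
   ```java
   LOG.debug(
       "URI {} does not specify file system authority, trying to load default authority (fs.defaultFS)",
       fsUri);
   ```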


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9210: [FLINK-12746][docs] Getting Started - 
DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#issuecomment-514437706
 
 
   ## CI report:
   
   * 5eb979da047c442c0205464c92b5bd9ee3a740dc : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120299964)
   * d7bf53a30514664925357bd5817305a02553d0a3 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120506936)
   * 02cca7fb6283b84a20ee019159ccb023ccffbd82 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120769129)
   * 5009b10d38eef92f25bfe4ff4608f2dd121ea9c6 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120915709)
   * e3b272586d8f41d3800e86134730c4dc427952a6 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120916220)
   * f1aee543a7aef88e3cf052f4d686ab0a8e5938e5 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120996260)
   * c66060dba290844085f90f554d447c6d7033779d : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/121131224)
   * 700e5c19a3d49197ef2b18a646f0b6e1bf783ba8 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/121174288)
   * 6f3fccea82189ef95d46f12212f6f7386fc11668 : CANCELED 
[Build](https://travis-ci.com/flink-ci/flink/builds/123540519)
   * 829c9c0505b6f08bb68e20a34e0613d83ae21758 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123545553)
   * 6f4f9ad2b9840347bda3474fe18f4b6b0b870c01 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123789816)
   * 8df2872cf6f575acbacbc8aff510c67dccfa2931 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123979061)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] TisonKun commented on issue #9492: FLINK-13796: Remove unused variable

2019-08-20 Thread GitBox
TisonKun commented on issue #9492: FLINK-13796: Remove unused variable
URL: https://github.com/apache/flink/pull/9492#issuecomment-523253881
 
 
   Thanks for your contribution @Fokko !
   
   Please update the description of the pull request as the template guides.
   
   @xintongsong Could you take a look at whether this is a safe removal?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-13757) Document error for `logical functions`

2019-08-20 Thread TisonKun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-13757?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911838#comment-16911838
 ] 

TisonKun commented on FLINK-13757:
--

Thanks for your report and the fix [~hehuiyuan]!

[~rmetzger] [~fhueske] could you take a look at this issue?

> Document error for  `logical functions`
> ---
>
> Key: FLINK-13757
> URL: https://issues.apache.org/jira/browse/FLINK-13757
> Project: Flink
>  Issue Type: Wish
>  Components: Documentation
>Reporter: hehuiyuan
>Priority: Major
>  Labels: pull-request-available
> Attachments: image-2019-08-17-11-58-53-247.png
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> [https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/table/functions.html#logical-functions]
> False:
> |{{boolean IS NOT TRUE}}|Returns TRUE if _boolean_ is FALSE or UNKNOWN; 
> returns FALSE if _boolean_ is *FALSE*.|
> True:
> |{{boolean IS NOT TRUE}}|Returns TRUE if _boolean_ is FALSE or UNKNOWN; 
> returns FALSE if _boolean_ is *TRUE*.|
> [!image-2019-08-17-11-58-53-247.png!|https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/table/functions.html#logical-functions]



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] flinkbot commented on issue #9496: [FLINK-13011][build] Add the Build logic for Python API release package

2019-08-20 Thread GitBox
flinkbot commented on issue #9496: [FLINK-13011][build] Add the Build logic for 
Python API release package
URL: https://github.com/apache/flink/pull/9496#issuecomment-523246052
 
 
   ## CI report:
   
   * 9bf7d60a41a45eab26d587e937cb8fd094983404 : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123981982)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot commented on issue #9496: [FLINK-13011][build] Add the Build logic for Python API release package

2019-08-20 Thread GitBox
flinkbot commented on issue #9496: [FLINK-13011][build] Add the Build logic for 
Python API release package
URL: https://github.com/apache/flink/pull/9496#issuecomment-52325
 
 
   Thanks a lot for your contribution to the Apache Flink project. I'm the 
@flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress 
of the review.
   
   
   ## Automated Checks
   Last check on commit 9bf7d60a41a45eab26d587e937cb8fd094983404 (Wed Aug 21 
00:18:44 UTC 2019)
   
   **Warnings:**
* **1 pom.xml files were touched**: Check for build and licensing issues.
* No documentation files were touched! Remember to keep the Flink docs up 
to date!
   
   
   Mention the bot in a comment to re-run the automated checks.
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into to Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review 
Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full 
explanation of the review process.
The Bot is tracking the review progress through labels. Labels are applied 
according to the order of the review items. For consensus, approval by a Flink 
committer or PMC member is required.
   ## Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot approve description` to approve one or more aspects (aspects: 
`description`, `consensus`, `architecture` and `quality`)
- `@flinkbot approve all` to approve all aspects
- `@flinkbot approve-until architecture` to approve everything until 
`architecture`
- `@flinkbot attention @username1 [@username2 ..]` to require somebody's 
attention
- `@flinkbot disapprove architecture` to remove an approval you gave earlier
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (FLINK-13011) Release the PyFlink into PyPI

2019-08-20 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-13011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated FLINK-13011:
---
Labels: pull-request-available  (was: )

> Release the PyFlink into PyPI
> -
>
> Key: FLINK-13011
> URL: https://issues.apache.org/jira/browse/FLINK-13011
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Python, Build System
>Affects Versions: 1.9.0
>Reporter: sunjincheng
>Assignee: sunjincheng
>Priority: Major
>  Labels: pull-request-available
>
> FLINK-12962 adds the ability to build a PyFlink distribution package, but we 
> have not yet released PyFlink to PyPI. The goal of this JIRA is to publish the 
> PyFlink distribution package built by FLINK-12962 to PyPI.
> [https://pypi.org/]
>  [https://packaging.python.org/tutorials/packaging-projects/]
> The changes list:
>  
> 1. Create PyPI Project for Apache Flink Python API, named: "apache-flink"
> 2. Release one binary with the default Scala version, the same as Flink's 
> default config.
> 3. Create an account named "pyflink" as the owner (only the PMC can manage it). 
> The PMC can add an account for the Release Manager, but the Release Manager 
> cannot delete the release.
>  
> The discussion mail thread: 
> [http://apache-flink-mailing-list-archive.1008284.n3.nabble.com/DISCUSS-Publish-the-PyFlink-into-PyPI-td30095.html]
> The VOTE mail thread: 
> [http://apache-flink-mailing-list-archive.1008284.n3.nabble.com/VOTE-Publish-the-PyFlink-into-PyPI-td31201.html]
>  



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] sunjincheng121 opened a new pull request #9496: [FLINK-13011][build] Add the Build logic for Python API release package

2019-08-20 Thread GitBox
sunjincheng121 opened a new pull request #9496: [FLINK-13011][build] Add the 
Build logic for Python API release package
URL: https://github.com/apache/flink/pull/9496
 
 
   ## What is the purpose of the change
   In this JIRA we add the build logic for the Python API release package.
   
   ## Brief change log
   
  -  clean up the build dir for flink-python.
  -  add the Python API release build logic in create_binary_release.sh.
  -  add an exclude config for flink-python in create_source_release.sh.
   
   ## Verifying this change
   This change only modifies release scripts and has no test coverage.
   
   ## Does this pull request potentially affect one of the following parts:
   
 - Dependencies (does it add or upgrade a dependency): (no)
 - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: (no)
 - The serializers: (no)
 - The runtime per-record code paths (performance sensitive): (no)
 - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
 - The S3 file system connector: (no)
   
   ## Documentation
   
 - Does this pull request introduce a new feature? (no)
 - If yes, how is the feature documented? (not documented)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9210: [FLINK-12746][docs] Getting Started - 
DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#issuecomment-514437706
 
 
   ## CI report:
   
   * 5eb979da047c442c0205464c92b5bd9ee3a740dc : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120299964)
   * d7bf53a30514664925357bd5817305a02553d0a3 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120506936)
   * 02cca7fb6283b84a20ee019159ccb023ccffbd82 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120769129)
   * 5009b10d38eef92f25bfe4ff4608f2dd121ea9c6 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120915709)
   * e3b272586d8f41d3800e86134730c4dc427952a6 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120916220)
   * f1aee543a7aef88e3cf052f4d686ab0a8e5938e5 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120996260)
   * c66060dba290844085f90f554d447c6d7033779d : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/121131224)
   * 700e5c19a3d49197ef2b18a646f0b6e1bf783ba8 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/121174288)
   * 6f3fccea82189ef95d46f12212f6f7386fc11668 : CANCELED 
[Build](https://travis-ci.com/flink-ci/flink/builds/123540519)
   * 829c9c0505b6f08bb68e20a34e0613d83ae21758 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123545553)
   * 6f4f9ad2b9840347bda3474fe18f4b6b0b870c01 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123789816)
   * 8df2872cf6f575acbacbc8aff510c67dccfa2931 : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123979061)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] sjwiesman commented on issue #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
sjwiesman commented on issue #9210: [FLINK-12746][docs] Getting Started - 
DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#issuecomment-523236371
 
 
   @NicoK I consolidated on spaces and addressed the comments you had on the 
wording / missing information. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] sjwiesman commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
sjwiesman commented on a change in pull request #9210: [FLINK-12746][docs] 
Getting Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315947510
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,897 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: 
http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 
StreamExecutionEnvironment.getExecutionEnvironment();
+
+DataStream<Transaction> transactions = env
+.addSource(new TransactionSource())
+.name("transactions");
+
+

[GitHub] [flink] bowenli86 commented on issue #9457: [FLINK-13741][table] "SHOW FUNCTIONS" should include Flink built-in functions' names

2019-08-20 Thread GitBox
bowenli86 commented on issue #9457: [FLINK-13741][table] "SHOW FUNCTIONS" 
should include Flink built-in functions' names
URL: https://github.com/apache/flink/pull/9457#issuecomment-523235668
 
 
   cc @xuefuz @lirui-apache @zjuwangg @twalthr @sunjincheng121 
   
   The UT failure of BatchFineGrainedRecoveryITCase.testProgram is unrelated to this change.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] sjwiesman commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
sjwiesman commented on a change in pull request #9210: [FLINK-12746][docs] 
Getting Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315946232
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will quickly create a skeleton project with 
all the necessary dependencies:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the command line. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official documentation: 
http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] flinkbot edited a comment on issue #9471: [FLINK-13754][task] Decouple OperatorChain with StreamStatusMaintainer

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9471: [FLINK-13754][task] Decouple 
OperatorChain with StreamStatusMaintainer
URL: https://github.com/apache/flink/pull/9471#issuecomment-522269622
 
 
   ## CI report:
   
   * 46356e9f2ac97632021b3450f2585ea8b6120175 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123609454)
   * 330c8be5df79465a8804b7059c104984c6ac43ad : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123644155)
   * df762cf9c977de44fda9ccd5f805e7784c7dff6f : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123807523)
   * d162b3fa56334555fa2dc8c5489d5656a070f0a4 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123842993)
   * 0760d8a8f27710ff4d7319269709cfb23e8f0317 : CANCELED 
[Build](https://travis-ci.com/flink-ci/flink/builds/123963758)
   * 87bec42731cfbc1cd399ca649fe233f5a8823de3 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123969495)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9475: [FLINK-13762][task] Implement a unified ForwardingValveOutputHandler for StreamOne/TwoInputProcessor

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9475: [FLINK-13762][task] Implement a 
unified ForwardingValveOutputHandler for StreamOne/TwoInputProcessor
URL: https://github.com/apache/flink/pull/9475#issuecomment-522352954
 
 
   ## CI report:
   
   * 7048a7504f200a099b5e4e42844888f51ba604aa : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123645048)
   * ec6450b37a2986aad216f00be0dbdee639ec2ffb : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123681958)
   * bda36681e3b81deb0b2e48ebc72b230adfb6830b : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123849658)
   * b8ba07280cbb82ae556c8968ce9e2e1f1783d2ad : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123971033)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9475: [FLINK-13762][task] Implement a unified ForwardingValveOutputHandler for StreamOne/TwoInputProcessor

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9475: [FLINK-13762][task] Implement a 
unified ForwardingValveOutputHandler for StreamOne/TwoInputProcessor
URL: https://github.com/apache/flink/pull/9475#issuecomment-522352954
 
 
   ## CI report:
   
   * 7048a7504f200a099b5e4e42844888f51ba604aa : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123645048)
   * ec6450b37a2986aad216f00be0dbdee639ec2ffb : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123681958)
   * bda36681e3b81deb0b2e48ebc72b230adfb6830b : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123849658)
   * b8ba07280cbb82ae556c8968ce9e2e1f1783d2ad : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123971033)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9471: [FLINK-13754][task] Decouple OperatorChain with StreamStatusMaintainer

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9471: [FLINK-13754][task] Decouple 
OperatorChain with StreamStatusMaintainer
URL: https://github.com/apache/flink/pull/9471#issuecomment-522269622
 
 
   ## CI report:
   
   * 46356e9f2ac97632021b3450f2585ea8b6120175 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123609454)
   * 330c8be5df79465a8804b7059c104984c6ac43ad : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123644155)
   * df762cf9c977de44fda9ccd5f805e7784c7dff6f : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123807523)
   * d162b3fa56334555fa2dc8c5489d5656a070f0a4 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123842993)
   * 0760d8a8f27710ff4d7319269709cfb23e8f0317 : CANCELED 
[Build](https://travis-ci.com/flink-ci/flink/builds/123963758)
   * 87bec42731cfbc1cd399ca649fe233f5a8823de3 : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123969495)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (FLINK-13754) Decouple OperatorChain with StreamStatusMaintainer

2019-08-20 Thread zhijiang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-13754?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

zhijiang updated FLINK-13754:
-
Description: 
The current OperatorChain is heavyweight because it takes on unrelated roles such as 
_StreamStatusMaintainer_. If other components rely on the 
_StreamStatusMaintainer_, we have to pass them the whole OperatorChain. From a 
single-responsibility design perspective, we need to decouple _StreamStatusMaintainer_ 
from _OperatorChain_.

Building on FLINK-13798, which breaks the cyclic dependency between _RecordWriterOutput_ 
and _StreamStatusMaintainer_, it is reasonable to create and maintain a separate 
instance of _StreamStatusMaintainer_ on the StreamTask side.

One reason is that it removes some duplicated logic by creating the 
_RecordWriter_ and _RecordWriterOutput_ together.

Another reason is that the _StreamStatusMaintainer_ can then easily be referenced via 
_StreamTask_ in the follow-up refactoring work.

  was:
The current OperatorChain is heavy-weight to take some unrelated roles like 
StreamStatusMaintainer. If other components only rely on the 
StreamStatusMaintainer, we have to pass the whole OperatorChain. From the 
design aspect of single function, we need to decouple StreamStatusMaintainer 
from OperatorChain.

The solution is to refactor the creation of StreamStatusMaintainer and 
RecordWriterOutput in StreamTask level, and then break the implementation cycle 
dependency between them. The array of RecordWriters which has close 
relationship with RecordWriterOutput is created in StreamTask, so it is 
reasonable to create them together. The created StreamStatusMaintainer in 
StreamTask can be directly referenced by subclasses like 
OneInputStreamTask/TwoInputStreamTask.


> Decouple OperatorChain with StreamStatusMaintainer
> --
>
> Key: FLINK-13754
> URL: https://issues.apache.org/jira/browse/FLINK-13754
> Project: Flink
>  Issue Type: Sub-task
>  Components: Runtime / Task
>Reporter: zhijiang
>Assignee: zhijiang
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The current OperatorChain is heavy-weight to take some unrelated roles like 
> _StreamStatusMaintainer_. If other components rely on the 
> _StreamStatusMaintainer_, we have to pass the whole OperatorChain. From the 
> design aspect of single function, we need to decouple 
> _StreamStatusMaintainer_ from _OperatorChain_.
> Based on FLINK-13798 to break the cycle dependency between 
> _RecordWriterOutput_ and _StreamStatusMaintainer,_ it is reasonable to 
> create/maintain a separate instance of _StreamStatusMaintainer_ on StreamTask 
> side.
> One reason is that it gets benefits of removing some duplicate logics while 
> creating _RecordWriter_ and _RecordWriterOutput_ together.
> Another reason is that it is easy to reference the _StreamStatusMaintainer_ 
> via _StreamTask_ in the following refactoring work.
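
A minimal sketch of the ownership change described above, using hypothetical, 
simplified types rather than the actual Flink classes (the real interfaces and 
signatures differ):

{code:java}
// Hedged sketch: hypothetical, simplified types -- not the actual Flink classes.
interface StreamStatusMaintainer {
    boolean isActive();
    void toggleStreamStatus(boolean active);
}

class SimpleStatusMaintainer implements StreamStatusMaintainer {
    private volatile boolean active = true;

    @Override
    public boolean isActive() {
        return active;
    }

    @Override
    public void toggleStreamStatus(boolean active) {
        this.active = active;
    }
}

class RecordWriterOutputSketch {
    private final StreamStatusMaintainer statusMaintainer;

    RecordWriterOutputSketch(StreamStatusMaintainer statusMaintainer) {
        // The output only needs the maintainer, not the whole operator chain.
        this.statusMaintainer = statusMaintainer;
    }
}

abstract class StreamTaskSketch {
    // Created and owned by the task, next to its record writers.
    protected final StreamStatusMaintainer statusMaintainer = new SimpleStatusMaintainer();

    protected RecordWriterOutputSketch createOutput() {
        return new RecordWriterOutputSketch(statusMaintainer);
    }
}
{code}

The point is only that the task constructs the maintainer next to its record 
writers and injects it where needed, instead of exposing the whole OperatorChain.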



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-13754) Decouple OperatorChain with StreamStatusMaintainer

2019-08-20 Thread zhijiang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-13754?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

zhijiang updated FLINK-13754:
-
Description: 
The current OperatorChain is heavy-weight to take some unrelated roles like 
StreamStatusMaintainer. If other components only rely on the 
StreamStatusMaintainer, we have to pass the whole OperatorChain. From the 
design aspect of single function, we need to decouple StreamStatusMaintainer 
from OperatorChain.

The solution is to refactor the creation of StreamStatusMaintainer and 
RecordWriterOutput in StreamTask level, and then break the implementation cycle 
dependency between them. The array of RecordWriters which has close 
relationship with RecordWriterOutput is created in StreamTask, so it is 
reasonable to create them together. The created StreamStatusMaintainer in 
StreamTask can be directly referenced by subclasses like 
OneInputStreamTask/TwoInputStreamTask.

  was:
The current OperatorChain is heavy-weight to take some unrelated roles like 
StreamStatusMaintainer. If other components only rely on the 
StreamStatusMaintainer, we have to pass the whole OperatorChain. From the 
design aspect of single function, we need to decouple 

The solution is to refactor the creation of StreamStatusMaintainer and 
RecordWriterOutput in StreamTask level, and then break the implementation cycle 
dependency between them. The array of RecordWriters which has close 
relationship with RecordWriterOutput is created in StreamTask, so it is 
reasonable to create them together. The created StreamStatusMaintainer in 
StreamTask can be directly referenced by subclasses like 
OneInputStreamTask/TwoInputStreamTask.


> Decouple OperatorChain with StreamStatusMaintainer
> --
>
> Key: FLINK-13754
> URL: https://issues.apache.org/jira/browse/FLINK-13754
> Project: Flink
>  Issue Type: Sub-task
>  Components: Runtime / Task
>Reporter: zhijiang
>Assignee: zhijiang
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The current OperatorChain is heavy-weight to take some unrelated roles like 
> StreamStatusMaintainer. If other components only rely on the 
> StreamStatusMaintainer, we have to pass the whole OperatorChain. From the 
> design aspect of single function, we need to decouple StreamStatusMaintainer 
> from OperatorChain.
> The solution is to refactor the creation of StreamStatusMaintainer and 
> RecordWriterOutput in StreamTask level, and then break the implementation 
> cycle dependency between them. The array of RecordWriters which has close 
> relationship with RecordWriterOutput is created in StreamTask, so it is 
> reasonable to create them together. The created StreamStatusMaintainer in 
> StreamTask can be directly referenced by subclasses like 
> OneInputStreamTask/TwoInputStreamTask.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-13754) Decouple OperatorChain with StreamStatusMaintainer

2019-08-20 Thread zhijiang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-13754?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

zhijiang updated FLINK-13754:
-
Summary: Decouple OperatorChain with StreamStatusMaintainer  (was: Decouple 
OperatorChain from StreamStatusMaintainer)

> Decouple OperatorChain with StreamStatusMaintainer
> --
>
> Key: FLINK-13754
> URL: https://issues.apache.org/jira/browse/FLINK-13754
> Project: Flink
>  Issue Type: Sub-task
>  Components: Runtime / Task
>Reporter: zhijiang
>Assignee: zhijiang
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The current OperatorChain is heavy-weight to take some unrelated roles like 
> StreamStatusMaintainer. If other components only rely on the 
> StreamStatusMaintainer, we have to pass the whole OperatorChain. From the 
> design aspect of single function, we need to decouple 
> The solution is to refactor the creation of StreamStatusMaintainer and 
> RecordWriterOutput in StreamTask level, and then break the implementation 
> cycle dependency between them. The array of RecordWriters which has close 
> relationship with RecordWriterOutput is created in StreamTask, so it is 
> reasonable to create them together. The created StreamStatusMaintainer in 
> StreamTask can be directly referenced by subclasses like 
> OneInputStreamTask/TwoInputStreamTask.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-13754) Decouple OperatorChain from StreamStatusMaintainer

2019-08-20 Thread zhijiang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-13754?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

zhijiang updated FLINK-13754:
-
Description: 
The current OperatorChain is heavy-weight to take some unrelated roles like 
StreamStatusMaintainer. If other components only rely on the 
StreamStatusMaintainer, we have to pass the whole OperatorChain. From the 
design aspect of single function, we need to decouple 

The solution is to refactor the creation of StreamStatusMaintainer and 
RecordWriterOutput in StreamTask level, and then break the implementation cycle 
dependency between them. The array of RecordWriters which has close 
relationship with RecordWriterOutput is created in StreamTask, so it is 
reasonable to create them together. The created StreamStatusMaintainer in 
StreamTask can be directly referenced by subclasses like 
OneInputStreamTask/TwoInputStreamTask.

  was:
There are two motivations for this refactoring:
 * It is the precondition for the following work of decoupling the dependency 
between two inputs status in ForwardingValveOutputHandler.
 * From the aspect of design rule, the current OperatorChain takes many 
unrelated roles like StreamStatusMaintainer to make it unmaintainable. The root 
reason for this case is from the cycle dependency between RecordWriterOutput 
(created by OperatorChain) and  StreamStatusMaintainer.

The solution is to refactor the creation of StreamStatusMaintainer and 
RecordWriterOutput in StreamTask level, and then break the implementation cycle 
dependency between them. The array of RecordWriters which has close 
relationship with RecordWriterOutput is created in StreamTask, so it is 
reasonable to create them together. The created StreamStatusMaintainer in 
StreamTask can be directly referenced by subclasses like 
OneInputStreamTask/TwoInputStreamTask.


> Decouple OperatorChain from StreamStatusMaintainer
> --
>
> Key: FLINK-13754
> URL: https://issues.apache.org/jira/browse/FLINK-13754
> Project: Flink
>  Issue Type: Sub-task
>  Components: Runtime / Task
>Reporter: zhijiang
>Assignee: zhijiang
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The current OperatorChain is heavy-weight to take some unrelated roles like 
> StreamStatusMaintainer. If other components only rely on the 
> StreamStatusMaintainer, we have to pass the whole OperatorChain. From the 
> design aspect of single function, we need to decouple 
> The solution is to refactor the creation of StreamStatusMaintainer and 
> RecordWriterOutput in StreamTask level, and then break the implementation 
> cycle dependency between them. The array of RecordWriters which has close 
> relationship with RecordWriterOutput is created in StreamTask, so it is 
> reasonable to create them together. The created StreamStatusMaintainer in 
> StreamTask can be directly referenced by subclasses like 
> OneInputStreamTask/TwoInputStreamTask.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-13807) Flink-avro unit tests fails if the character encoding in the environment is not default to UTF-8

2019-08-20 Thread Ethan Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-13807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ethan Li updated FLINK-13807:
-
Description: 
On Flink release-1.8 branch:
{code:java}
[ERROR] Tests run: 12, Failures: 4, Errors: 0, Skipped: 0, Time elapsed: 4.81 s 
<<< FAILURE! - in org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest
[ERROR] testSimpleAvroRead[Execution mode = 
CLUSTER](org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest)  Time 
elapsed: 0.438 s  <<< FAILURE!
java.lang.AssertionError: 
Different elements in arrays: expected 2 elements and received 2
files: [/tmp/junit5386344396421857812/junit6023978980792200274.tmp/4, 
/tmp/junit5386344396421857812/junit6023978980792200274.tmp/2, 
/tmp/junit5386344396421857812/junit6023978980792200274.tmp/1, 
/tmp/junit5386344396421857812/junit6023978980792200274.tmp/3]
 expected: [{"name": "Alyssa", "favorite_number": 256, "favorite_color": null, 
"type_long_test": null, "type_double_test": 123.45, "type_null_test": null, 
"type_bool_test": true, "type_array_string": ["ELEMENT 1", "ELEMENT 2"], 
"type_array_boolean": [true, false], "type_nullable_array": null, "type_enum": 
"GREEN", "type_map": {"KEY 2": 17554, "KEY 1": 8546456}, "type_fixed": null, 
"type_union": null, "type_nested": {"num": 239, "street": "Baker Street", 
"city": "London", "state": "London", "zip": "NW1 6XE"}, "type_bytes": {"bytes": 
"\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
"type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
123456, "type_decimal_bytes": {"bytes": "\u0007?"}, "type_decimal_fixed": [7, 
-48]}, {"name": "Charlie", "favorite_number": null, "favorite_color": "blue", 
"type_long_test": 1337, "type_double_test": 1.337, "type_null_test": null, 
"type_bool_test": false, "type_array_string": [], "type_array_boolean": [], 
"type_nullable_array": null, "type_enum": "RED", "type_map": {}, "type_fixed": 
null, "type_union": null, "type_nested": {"num": 239, "street": "Baker Street", 
"city": "London", "state": "London", "zip": "NW1 6XE"}, "type_bytes": {"bytes": 
"\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
"type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
123456, "type_decimal_bytes": {"bytes": "\u0007?"}, "type_decimal_fixed": [7, 
-48]}]
 received: [{"name": "Alyssa", "favorite_number": 256, "favorite_color": null, 
"type_long_test": null, "type_double_test": 123.45, "type_null_test": null, 
"type_bool_test": true, "type_array_string": ["ELEMENT 1", "ELEMENT 2"], 
"type_array_boolean": [true, false], "type_nullable_array": null, "type_enum": 
"GREEN", "type_map": {"KEY 2": 17554, "KEY 1": 8546456}, "type_fixed": null, 
"type_union": null, "type_nested": {"num": 239, "street": "Baker Street", 
"city": "London", "state": "London", "zip": "NW1 6XE"}, "type_bytes": {"bytes": 
"\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
"type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
123456, "type_decimal_bytes": {"bytes": "\u0007??"}, "type_decimal_fixed": [7, 
-48]}, {"name": "Charlie", "favorite_number": null, "favorite_color": "blue", 
"type_long_test": 1337, "type_double_test": 1.337, "type_null_test": null, 
"type_bool_test": false, "type_array_string": [], "type_array_boolean": [], 
"type_nullable_array": null, "type_enum": "RED", "type_map": {}, "type_fixed": 
null, "type_union": null, "type_nested": {"num": 239, "street": "Baker Street", 
"city": "London", "state": "London", "zip": "NW1 6XE"}, "type_bytes": {"bytes": 
"\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
"type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
123456, "type_decimal_bytes": {"bytes": "\u0007??"}, "type_decimal_fixed": [7, 
-48]}]
at 
org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest.after(AvroTypeExtractionTest.java:76)
{code}

Comparing “expected” with “received”, the difference comes down to question-mark 
characters.

For example, in “expected”, it’s
{code:java}
"type_decimal_bytes": {"bytes": "\u0007?”}
{code}

While in “received”, it’s 
{code:java}
"type_decimal_bytes": {"bytes": "\u0007??"}
{code}

The environment I ran the unit tests on uses ANSI_X3.4-1968 as its default character encoding.

I changed it to "en_US.UTF-8" and the unit tests passed.

  was:

{code:java}
[ERROR] Tests run: 12, Failures: 4, Errors: 0, Skipped: 0, Time elapsed: 4.81 s 
<<< FAILURE! - in org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest
[ERROR] testSimpleAvroRead[Execution mode = 

[GitHub] [flink] flinkbot edited a comment on issue #9471: [FLINK-13754][task] Decouple OperatorChain from StreamStatusMaintainer

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9471: [FLINK-13754][task] Decouple 
OperatorChain from StreamStatusMaintainer
URL: https://github.com/apache/flink/pull/9471#issuecomment-522269622
 
 
   ## CI report:
   
   * 46356e9f2ac97632021b3450f2585ea8b6120175 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123609454)
   * 330c8be5df79465a8804b7059c104984c6ac43ad : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123644155)
   * df762cf9c977de44fda9ccd5f805e7784c7dff6f : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123807523)
   * d162b3fa56334555fa2dc8c5489d5656a070f0a4 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123842993)
   * 0760d8a8f27710ff4d7319269709cfb23e8f0317 : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123963758)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (FLINK-13807) Flink-avro unit tests fails if the character encoding in the environment is not default to UTF-8

2019-08-20 Thread Ethan Li (Jira)
Ethan Li created FLINK-13807:


 Summary: Flink-avro unit tests fails if the character encoding in 
the environment is not default to UTF-8
 Key: FLINK-13807
 URL: https://issues.apache.org/jira/browse/FLINK-13807
 Project: Flink
  Issue Type: Bug
Affects Versions: 1.8.0
Reporter: Ethan Li



{code:java}
[ERROR] Tests run: 12, Failures: 4, Errors: 0, Skipped: 0, Time elapsed: 4.81 s 
<<< FAILURE! - in org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest
[ERROR] testSimpleAvroRead[Execution mode = 
CLUSTER](org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest)  Time 
elapsed: 0.438 s  <<< FAILURE!
java.lang.AssertionError: 
Different elements in arrays: expected 2 elements and received 2
files: [/tmp/junit5386344396421857812/junit6023978980792200274.tmp/4, 
/tmp/junit5386344396421857812/junit6023978980792200274.tmp/2, 
/tmp/junit5386344396421857812/junit6023978980792200274.tmp/1, 
/tmp/junit5386344396421857812/junit6023978980792200274.tmp/3]
 expected: [{"name": "Alyssa", "favorite_number": 256, "favorite_color": null, 
"type_long_test": null, "type_double_test": 123.45, "type_null_test": null, 
"type_bool_test": true, "type_array_string": ["ELEMENT 1", "ELEMENT 2"], 
"type_array_boolean": [true, false], "type_nullable_array": null, "type_enum": 
"GREEN", "type_map": {"KEY 2": 17554, "KEY 1": 8546456}, "type_fixed": null, 
"type_union": null, "type_nested": {"num": 239, "street": "Baker Street", 
"city": "London", "state": "London", "zip": "NW1 6XE"}, "type_bytes": {"bytes": 
"\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
"type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
123456, "type_decimal_bytes": {"bytes": "\u0007?"}, "type_decimal_fixed": [7, 
-48]}, {"name": "Charlie", "favorite_number": null, "favorite_color": "blue", 
"type_long_test": 1337, "type_double_test": 1.337, "type_null_test": null, 
"type_bool_test": false, "type_array_string": [], "type_array_boolean": [], 
"type_nullable_array": null, "type_enum": "RED", "type_map": {}, "type_fixed": 
null, "type_union": null, "type_nested": {"num": 239, "street": "Baker Street", 
"city": "London", "state": "London", "zip": "NW1 6XE"}, "type_bytes": {"bytes": 
"\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
"type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
123456, "type_decimal_bytes": {"bytes": "\u0007?"}, "type_decimal_fixed": [7, 
-48]}]
 received: [{"name": "Alyssa", "favorite_number": 256, "favorite_color": null, 
"type_long_test": null, "type_double_test": 123.45, "type_null_test": null, 
"type_bool_test": true, "type_array_string": ["ELEMENT 1", "ELEMENT 2"], 
"type_array_boolean": [true, false], "type_nullable_array": null, "type_enum": 
"GREEN", "type_map": {"KEY 2": 17554, "KEY 1": 8546456}, "type_fixed": null, 
"type_union": null, "type_nested": {"num": 239, "street": "Baker Street", 
"city": "London", "state": "London", "zip": "NW1 6XE"}, "type_bytes": {"bytes": 
"\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
"type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
123456, "type_decimal_bytes": {"bytes": "\u0007??"}, "type_decimal_fixed": [7, 
-48]}, {"name": "Charlie", "favorite_number": null, "favorite_color": "blue", 
"type_long_test": 1337, "type_double_test": 1.337, "type_null_test": null, 
"type_bool_test": false, "type_array_string": [], "type_array_boolean": [], 
"type_nullable_array": null, "type_enum": "RED", "type_map": {}, "type_fixed": 
null, "type_union": null, "type_nested": {"num": 239, "street": "Baker Street", 
"city": "London", "state": "London", "zip": "NW1 6XE"}, "type_bytes": {"bytes": 
"\u\u\u\u\u\u\u\u\u\u"}, "type_date": 
2014-03-01, "type_time_millis": 12:12:12.000, "type_time_micros": 123456, 
"type_timestamp_millis": 2014-03-01T12:12:12.321Z, "type_timestamp_micros": 
123456, "type_decimal_bytes": {"bytes": "\u0007??"}, "type_decimal_fixed": [7, 
-48]}]
at 
org.apache.flink.formats.avro.typeutils.AvroTypeExtractionTest.after(AvroTypeExtractionTest.java:76)
{code}

Comparing “expected” with “received”, the difference comes down to question-mark 
characters.

For example, in “expected”, it’s
{code:java}
"type_decimal_bytes": {"bytes": "\u0007?”}
{code}

While in “received”, it’s 
{code:java}
"type_decimal_bytes": {"bytes": "\u0007??"}
{code}

The environment I ran the unit tests on uses ANSI_X3.4-1968 as its default character encoding.

I changed it to "en_US.UTF-8" and the unit tests passed.
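
For readers following along, a small self-contained illustration of the mismatch 
(the byte value below is an example added here, not taken from this report):

{code:java}
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class DefaultCharsetDemo {
    public static void main(String[] args) {
        // A non-ASCII value encoded as UTF-8 (made-up example bytes).
        byte[] utf8 = "\u0007\u00d0".getBytes(StandardCharsets.UTF_8);

        // Decoding with an explicit charset is stable across environments.
        System.out.println(new String(utf8, StandardCharsets.UTF_8));

        // Decoding with the platform default (ANSI_X3.4-1968 / US-ASCII here) replaces
        // the non-ASCII bytes, which is where the extra '?' characters come from.
        System.out.println(new String(utf8, Charset.defaultCharset()));
    }
}
{code}

One common workaround (an assumption on my side, not verified in this report) is to 
pin the JVM encoding for the build, e.g. MAVEN_OPTS="-Dfile.encoding=UTF-8" mvn test, 
instead of relying on the locale.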



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Created] (FLINK-13806) Metric Fetcher floods the JM log with errors when TM is lost

2019-08-20 Thread Stephan Ewen (Jira)
Stephan Ewen created FLINK-13806:


 Summary: Metric Fetcher floods the JM log with errors when TM is 
lost
 Key: FLINK-13806
 URL: https://issues.apache.org/jira/browse/FLINK-13806
 Project: Flink
  Issue Type: Bug
  Components: Runtime / Metrics
Affects Versions: 1.9.0
Reporter: Stephan Ewen
 Fix For: 1.10.0, 1.9.1


When a TaskManager is lost, the log contains a series of exceptions from the 
metrics fetcher, making it hard to identify the exceptions from the actual job 
failure.

The exception below appears multiple times (in my example eight times) in a 
simple 4-TM setup after a single TM failure.

I would suggest suppressing "failed asks" (timeouts) in the metrics fetcher 
service, because the fetcher does not have enough information to distinguish between 
root-cause exceptions and follow-up exceptions. In most cases, these exceptions 
are follow-ups to a failure that is already handled in the 
scheduler/ExecutionGraph, and the additional exception logging only adds 
noise to the log.

{code}
2019-08-20 22:00:09,865 WARN  
org.apache.flink.runtime.rest.handler.legacy.metrics.MetricFetcherImpl  - 
Requesting TaskManager's path for query services failed.
java.util.concurrent.CompletionException: akka.pattern.AskTimeoutException: Ask 
timed out on [Actor[akka://flink/user/dispatcher#-1834666306]] after [1 
ms]. Message of type 
[org.apache.flink.runtime.rpc.messages.LocalFencedMessage]. A typical reason 
for `AskTimeoutException` is that the recipient actor didn't send a reply.
at 
java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:292)
at 
java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:308)
at 
java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:593)
at 
java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:577)
at 
java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at 
java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
at 
org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:871)
at akka.dispatch.OnComplete.internal(Future.scala:263)
at akka.dispatch.OnComplete.internal(Future.scala:261)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:191)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:188)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
at 
org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:74)
at 
scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
at 
scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
at 
akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:644)
at akka.actor.Scheduler$$anon$4.run(Scheduler.scala:205)
at 
scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
at 
scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:109)
at 
scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
at 
akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:328)
at 
akka.actor.LightArrayRevolverScheduler$$anon$4.executeBucket$1(LightArrayRevolverScheduler.scala:279)
at 
akka.actor.LightArrayRevolverScheduler$$anon$4.nextTick(LightArrayRevolverScheduler.scala:283)
at 
akka.actor.LightArrayRevolverScheduler$$anon$4.run(LightArrayRevolverScheduler.scala:235)
at java.lang.Thread.run(Thread.java:748)
Caused by: akka.pattern.AskTimeoutException: Ask timed out on 
[Actor[akka://flink/user/dispatcher#-1834666306]] after [1 ms]. Message of 
type [org.apache.flink.runtime.rpc.messages.LocalFencedMessage]. A typical 
reason for `AskTimeoutException` is that the recipient actor didn't send a 
reply.
at akka.pattern.PromiseActorRef$$anonfun$2.apply(AskSupport.scala:635)
at akka.pattern.PromiseActorRef$$anonfun$2.apply(AskSupport.scala:635)
at 
akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:648)
... 9 more
{code}
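
A rough sketch of the kind of suppression suggested above (the class and method 
names here are placeholders; only ExceptionUtils.findThrowable from flink-core and 
Akka's AskTimeoutException are real):

{code:java}
import java.util.concurrent.CompletableFuture;

import org.apache.flink.util.ExceptionUtils;

import akka.pattern.AskTimeoutException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class QuietTimeoutLogging {
    private static final Logger LOG = LoggerFactory.getLogger(QuietTimeoutLogging.class);

    // Placeholder helper: log ask timeouts at DEBUG, everything else at WARN.
    static <T> void attachLogging(CompletableFuture<T> fetchFuture) {
        fetchFuture.whenComplete((result, failure) -> {
            if (failure == null) {
                return;
            }
            if (ExceptionUtils.findThrowable(failure, AskTimeoutException.class).isPresent()) {
                // Likely a follow-up of a TM loss already handled by the scheduler.
                LOG.debug("Requesting TaskManager's path for query services timed out.", failure);
            } else {
                LOG.warn("Requesting TaskManager's path for query services failed.", failure);
            }
        });
    }
}
{code}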




--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] flinkbot edited a comment on issue #9457: [FLINK-13741][table] "SHOW FUNCTIONS" should include Flink built-in functions' names

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9457: [FLINK-13741][table] "SHOW FUNCTIONS" 
should include Flink built-in functions' names
URL: https://github.com/apache/flink/pull/9457#issuecomment-521829752
 
 
   ## CI report:
   
   * 55c0e5843e029f022ff59fe14a9e6c1d2c5ac69e : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123443311)
   * 006236fff94d0204223a2c3b89f621da3248f6a4 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123444248)
   * 726259a0a1bfb2061f77a82c586d9b3a4c70abb6 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123572731)
   * 34cf90b067d6de0c0ffd72ac380fe66d07725a88 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123766959)
   * 2829fa50fcfd02378f736b9f7ef9b52af6090cde : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123942470)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9454: [FLINK-13644][docs-zh] Translate "State Backends" page into Chinese

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9454: [FLINK-13644][docs-zh] Translate 
"State Backends" page into Chinese
URL: https://github.com/apache/flink/pull/9454#issuecomment-521630829
 
 
   ## CI report:
   
   * a68ae4c218cbb02c7e2c3e1422ac89909d7e05bb : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123366700)
   * c0a6452df2199975203a9dd1fe1f645af1562c06 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123937629)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot edited a comment on issue #9457: [FLINK-13741][table] "SHOW FUNCTIONS" should include Flink built-in functions' names

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9457: [FLINK-13741][table] "SHOW FUNCTIONS" 
should include Flink built-in functions' names
URL: https://github.com/apache/flink/pull/9457#issuecomment-521829752
 
 
   ## CI report:
   
   * 55c0e5843e029f022ff59fe14a9e6c1d2c5ac69e : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123443311)
   * 006236fff94d0204223a2c3b89f621da3248f6a4 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123444248)
   * 726259a0a1bfb2061f77a82c586d9b3a4c70abb6 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123572731)
   * 34cf90b067d6de0c0ffd72ac380fe66d07725a88 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123766959)
   * 2829fa50fcfd02378f736b9f7ef9b52af6090cde : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123942470)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-12905) Convert CatalogView to org.apache.calcite.schema.Table so that planner can use unified catalog APIs

2019-08-20 Thread Bowen Li (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-12905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911623#comment-16911623
 ] 

Bowen Li commented on FLINK-12905:
--

Hi @dawidwys , can we move this effort forward? We need an end-to-end solution 
for views in 1.10, especially for blink planner. Thanks!

cc [~twalthr] [~xuefuz]

> Convert CatalogView to org.apache.calcite.schema.Table so that planner can 
> use unified catalog APIs
> ---
>
> Key: FLINK-12905
> URL: https://issues.apache.org/jira/browse/FLINK-12905
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / Legacy Planner, Table SQL / Planner
>Reporter: Dawid Wysakowicz
>Assignee: Dawid Wysakowicz
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.10.0
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> Similar to [FLINK-12257] we should convert Flink's views to Calcite's views.
> The tricky part is that we have to pass around the SqlParser somehow.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (FLINK-10231) Add a view SQL DDL

2019-08-20 Thread Bowen Li (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-10231?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911615#comment-16911615
 ] 

Bowen Li commented on FLINK-10231:
--

Hi [~danny0405], can you lead this effort if wenhui doesn't have time for it? 
Thanks!

cc [~ykt836] [~xuefuz]

> Add a view SQL DDL
> --
>
> Key: FLINK-10231
> URL: https://issues.apache.org/jira/browse/FLINK-10231
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: Timo Walther
>Assignee: TANG Wen-hui
>Priority: Critical
> Fix For: 1.10.0
>
>
> FLINK-10163 added initial view support for the SQL Client. However, for 
> supporting the [full definition of 
> views|https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Create/Drop/AlterView]
>  (with schema, comments, etc.) we need to support native support for views in 
> the Table API.
> {code}
> CREATE VIEW [IF NOT EXISTS] [catalog_name.db_name.]view_name [COMMENT 
> comment] AS SELECT ;
> DROP VIEW[IF EXISTS] [catalog_name.db_name.]view_name;
> ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name AS SELECT ;
> ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name RENAME TO ;
> ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name SET COMMENT =  ;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-10231) Add a view SQL DDL

2019-08-20 Thread Bowen Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-10231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bowen Li updated FLINK-10231:
-
Description: 
FLINK-10163 added initial view support for the SQL Client. However, to support 
the [full definition of 
views|https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Create/Drop/AlterView]
 (with schema, comments, etc.), we need native support for views in 
the Table API.


{code}
CREATE VIEW [IF NOT EXISTS] [catalog_name.db_name.]view_name [COMMENT comment] 
AS SELECT ;
DROP VIEW[IF EXISTS] [catalog_name.db_name.]view_name;
ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name AS SELECT ;
ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name RENAME TO ;
ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name SET COMMENT =  ;
{code}

  was:FLINK-10163 added initial view support for the SQL Client. However, for 
supporting the [full definition of 
views|https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Create/Drop/AlterView]
 (with schema, comments, etc.) we need to support native support for views in 
the Table API.


> Add a view SQL DDL
> --
>
> Key: FLINK-10231
> URL: https://issues.apache.org/jira/browse/FLINK-10231
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: Timo Walther
>Assignee: TANG Wen-hui
>Priority: Critical
> Fix For: 1.10.0
>
>
> FLINK-10163 added initial view support for the SQL Client. However, for 
> supporting the [full definition of 
> views|https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Create/Drop/AlterView]
>  (with schema, comments, etc.) we need to support native support for views in 
> the Table API.
> {code}
> CREATE VIEW [IF NOT EXISTS] [catalog_name.db_name.]view_name [COMMENT 
> comment] AS SELECT ;
> DROP VIEW[IF EXISTS] [catalog_name.db_name.]view_name;
> ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name AS SELECT ;
> ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name RENAME TO ;
> ALTER VIEW [IF EXISTS] [catalog_name.db_name.]view_name SET COMMENT =  ;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-7151) Add a function SQL DDL

2019-08-20 Thread Bowen Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-7151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bowen Li updated FLINK-7151:

Description: 
Based on create temporary function and table, we can register a UDF, UDAF, or UDTF 
using SQL:

{code}
CREATE FUNCTION [IF NOT EXISTS] [catalog_name.db_name.]function_name AS 
class_name;
DROP FUNCTION [IF EXISTS] [catalog_name.db_name.]function_name;
ALTER FUNCTION [IF EXISTS] [catalog_name.db_name.]function_name RENAME TO 
new_name;
{code}


{code}
CREATE function 'TOPK' AS 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
BY id;
{code}

  was:
Based on create temporary function and table.we can register a udf,udaf,udtf 
use sql:

{code}
CREATE FUNCTION [IF NOT EXISTS] [catalog_name.db_name.]function_name AS 
class_name;
DROP FUNCTION [IF EXISTS] function_name;
ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
{code}


{code}
CREATE function 'TOPK' AS 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
BY id;
{code}


> Add a function SQL DDL
> --
>
> Key: FLINK-7151
> URL: https://issues.apache.org/jira/browse/FLINK-7151
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: yuemeng
>Assignee: Shuyi Chen
>Priority: Critical
> Fix For: 1.10.0
>
>
> Based on create temporary function and table.we can register a udf,udaf,udtf 
> use sql:
> {code}
> CREATE FUNCTION [IF NOT EXISTS] [catalog_name.db_name.]function_name AS 
> class_name;
> DROP FUNCTION [IF EXISTS] [catalog_name.db_name.]function_name;
> ALTER FUNCTION [IF EXISTS] [catalog_name.db_name.]function_name RENAME TO 
> new_name;
> {code}
> {code}
> CREATE function 'TOPK' AS 
> 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
> INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
> BY id;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-7151) Add a function SQL DDL

2019-08-20 Thread Bowen Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-7151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bowen Li updated FLINK-7151:

Description: 
Based on create temporary function and table.we can register a udf,udaf,udtf 
use sql:

{code}
CREATE FUNCTION [IF NOT EXISTS] [catalog_name.db_name.]function_name AS 
class_name;
DROP FUNCTION [IF EXISTS] function_name;
ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
{code}


{code}
CREATE function 'TOPK' AS 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
BY id;
{code}

  was:
Based on create temporary function and table.we can register a udf,udaf,udtf 
use sql:

{code}
CREATE FUNCTION [catalog_name.db_name.]function_name AS class_name;
DROP FUNCTION [IF EXISTS] function_name;
ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
{code}


{code}
CREATE function 'TOPK' AS 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
BY id;
{code}


> Add a function SQL DDL
> --
>
> Key: FLINK-7151
> URL: https://issues.apache.org/jira/browse/FLINK-7151
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: yuemeng
>Assignee: Shuyi Chen
>Priority: Critical
> Fix For: 1.10.0
>
>
> Based on create temporary function and table.we can register a udf,udaf,udtf 
> use sql:
> {code}
> CREATE FUNCTION [IF NOT EXISTS] [catalog_name.db_name.]function_name AS 
> class_name;
> DROP FUNCTION [IF EXISTS] function_name;
> ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
> {code}
> {code}
> CREATE function 'TOPK' AS 
> 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
> INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
> BY id;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-7151) Add a function SQL DDL

2019-08-20 Thread Bowen Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-7151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bowen Li updated FLINK-7151:

Description: 
Based on create temporary function and table.we can register a udf,udaf,udtf 
use sql:

{code}
CREATE FUNCTION [catalog_name.db_name.]function_name AS class_name;
DROP FUNCTION [IF EXISTS] function_name;
ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
{code}


{code}
CREATE function 'TOPK' AS 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
BY id;
{code}

  was:
Based on create temporary function and table.we can register a udf,udaf,udtf 
use sql:

{code}
CREATE function [catalog_name.db_name.]function_name AS class_name;
DROP FUNCTION [IF EXISTS] function_name;
ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
{code}


{code}
CREATE function 'TOPK' AS 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
BY id;
{code}


> Add a function SQL DDL
> --
>
> Key: FLINK-7151
> URL: https://issues.apache.org/jira/browse/FLINK-7151
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: yuemeng
>Assignee: Shuyi Chen
>Priority: Critical
> Fix For: 1.10.0
>
>
> Based on create temporary function and table.we can register a udf,udaf,udtf 
> use sql:
> {code}
> CREATE FUNCTION [catalog_name.db_name.]function_name AS class_name;
> DROP FUNCTION [IF EXISTS] function_name;
> ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
> {code}
> {code}
> CREATE function 'TOPK' AS 
> 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
> INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
> BY id;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-10231) Add a view SQL DDL

2019-08-20 Thread Bowen Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-10231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bowen Li updated FLINK-10231:
-
Priority: Critical  (was: Major)

> Add a view SQL DDL
> --
>
> Key: FLINK-10231
> URL: https://issues.apache.org/jira/browse/FLINK-10231
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: Timo Walther
>Assignee: TANG Wen-hui
>Priority: Critical
> Fix For: 1.10.0
>
>
> FLINK-10163 added initial view support for the SQL Client. However, for 
> supporting the [full definition of 
> views|https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Create/Drop/AlterView]
>  (with schema, comments, etc.) we need to support native support for views in 
> the Table API.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-7151) Add a function SQL DDL

2019-08-20 Thread Bowen Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-7151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bowen Li updated FLINK-7151:

Priority: Critical  (was: Major)

> Add a function SQL DDL
> --
>
> Key: FLINK-7151
> URL: https://issues.apache.org/jira/browse/FLINK-7151
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: yuemeng
>Assignee: Shuyi Chen
>Priority: Critical
> Fix For: 1.10.0
>
>
> Based on create temporary function and table.we can register a udf,udaf,udtf 
> use sql:
> {code}
> CREATE function [catalog_name.db_name.]function_name AS class_name;
> DROP FUNCTION [IF EXISTS] function_name;
> ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
> {code}
> {code}
> CREATE function 'TOPK' AS 
> 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
> INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
> BY id;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (FLINK-7151) Add a function SQL DDL

2019-08-20 Thread Bowen Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-7151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bowen Li updated FLINK-7151:

Fix Version/s: 1.10.0

> Add a function SQL DDL
> --
>
> Key: FLINK-7151
> URL: https://issues.apache.org/jira/browse/FLINK-7151
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: yuemeng
>Assignee: Shuyi Chen
>Priority: Major
> Fix For: 1.10.0
>
>
> Based on create temporary function and table.we can register a udf,udaf,udtf 
> use sql:
> {code}
> CREATE function [catalog_name.db_name.]function_name AS class_name;
> DROP FUNCTION [IF EXISTS] function_name;
> ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
> {code}
> {code}
> CREATE function 'TOPK' AS 
> 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
> INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
> BY id;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (FLINK-7151) Add a function SQL DDL

2019-08-20 Thread Bowen Li (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-7151?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16911607#comment-16911607
 ] 

Bowen Li commented on FLINK-7151:
-

Hi [~danny0405], can you lead this effort if Shuyi doesn't have time for it? 
Thanks!

cc [~ykt836] [~xuefuz]

> Add a function SQL DDL
> --
>
> Key: FLINK-7151
> URL: https://issues.apache.org/jira/browse/FLINK-7151
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: yuemeng
>Assignee: Shuyi Chen
>Priority: Critical
> Fix For: 1.10.0
>
>
> Based on create temporary function and table.we can register a udf,udaf,udtf 
> use sql:
> {code}
> CREATE function [catalog_name.db_name.]function_name AS class_name;
> DROP FUNCTION [IF EXISTS] function_name;
> ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
> {code}
> {code}
> CREATE function 'TOPK' AS 
> 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
> INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
> BY id;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] flinkbot edited a comment on issue #9454: [FLINK-13644][docs-zh] Translate "State Backends" page into Chinese

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9454: [FLINK-13644][docs-zh] Translate 
"State Backends" page into Chinese
URL: https://github.com/apache/flink/pull/9454#issuecomment-521630829
 
 
   ## CI report:
   
   * a68ae4c218cbb02c7e2c3e1422ac89909d7e05bb : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/123366700)
   * c0a6452df2199975203a9dd1fe1f645af1562c06 : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123937629)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (FLINK-7151) Add a function SQL DDL

2019-08-20 Thread Bowen Li (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-7151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bowen Li updated FLINK-7151:

Description: 
Building on CREATE TEMPORARY FUNCTION and CREATE TABLE, we can register a 
UDF, UDAF, or UDTF using SQL:

{code}
CREATE function [catalog_name.db_name.]function_name AS class_name;
DROP FUNCTION [IF EXISTS] function_name;
ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
{code}


{code}
CREATE function 'TOPK' AS 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
BY id;
{code}

  was:
Building on CREATE TEMPORARY FUNCTION and CREATE TABLE, we can register a 
UDF, UDAF, or UDTF using SQL:
{code}
CREATE TEMPORARY function 'TOPK' AS 
'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
BY id;
{code}


> Add a function SQL DDL
> --
>
> Key: FLINK-7151
> URL: https://issues.apache.org/jira/browse/FLINK-7151
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / API
>Reporter: yuemeng
>Assignee: Shuyi Chen
>Priority: Major
>
> Building on CREATE TEMPORARY FUNCTION and CREATE TABLE, we can register a 
> UDF, UDAF, or UDTF using SQL:
> {code}
> CREATE function [catalog_name.db_name.]function_name AS class_name;
> DROP FUNCTION [IF EXISTS] function_name;
> ALTER FUNCTION [IF EXISTS] function_name RENAME TO new_name;
> {code}
> {code}
> CREATE function 'TOPK' AS 
> 'com..aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
> INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP 
> BY id;
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [flink] flinkbot edited a comment on issue #9495: [FLINK-13798][task] Refactor the process of checking stream status while emitting watermark in source

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9495: [FLINK-13798][task] Refactor the 
process of checking stream status while emitting watermark in source
URL: https://github.com/apache/flink/pull/9495#issuecomment-523098969
 
 
   ## CI report:
   
   * 62cf805e72729b77857531f2053f58c4d9c10666 : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123927636)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (FLINK-13805) Bad Error Message when TaskManager is lost

2019-08-20 Thread Stephan Ewen (Jira)
Stephan Ewen created FLINK-13805:


 Summary: Bad Error Message when TaskManager is lost
 Key: FLINK-13805
 URL: https://issues.apache.org/jira/browse/FLINK-13805
 Project: Flink
  Issue Type: Bug
  Components: Runtime / Coordination
Affects Versions: 1.9.0
Reporter: Stephan Ewen
 Fix For: 1.10.0, 1.9.1


When a TaskManager is lost, the job reports as the failure cause
{code}
org.apache.flink.util.FlinkException: The assigned slot 
6d0e469d55a2630871f43ad0f89c786c_0 was removed.
{code}

That is a pretty bad error message: as a user, I don't know what it means. It 
sounds like it could simply refer to internal bookkeeping, maybe some 
rebalancing.
You need to know a lot about Flink to understand that it actually means "the 
TaskManager failed".




--
This message was sent by Atlassian Jira
(v8.3.2#803003)
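
[Editorial note] As an illustration of friendlier wording, a sketch only: it reuses the existing 
FlinkException type, the TaskManager identifier is invented, and it says nothing about where in 
the codebase such a message would actually be produced.

{code}
import org.apache.flink.util.FlinkException;

public class LostTaskManagerMessageSketch {

    // Hypothetical wording: name the lost TaskManager and point at its logs,
    // instead of reporting only the removed slot ID.
    public static FlinkException lostTaskManagerFailure(String taskManagerId, String slotId) {
        return new FlinkException(
            "TaskManager '" + taskManagerId + "' is no longer reachable; "
                + "its slot " + slotId + " was removed and the affected tasks will be restarted. "
                + "Check the logs of TaskManager '" + taskManagerId + "' for the root cause.");
    }
}
{code}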


[GitHub] [flink] wuchong commented on issue #9230: [FLINK-13430][build] Configure sending travis build notifications to bui...@flink.apache.org

2019-08-20 Thread GitBox
wuchong commented on issue #9230: [FLINK-13430][build] Configure sending travis 
build notifications to bui...@flink.apache.org
URL: https://github.com/apache/flink/pull/9230#issuecomment-523099538
 
 
   Hi @zentol, I have set up a webhook that receives Travis' notifications, 
generates an email, and sends it to the mailing list. 
   
   Everything looks good so far (tested against my own email address); I will 
test against the mailing list next. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
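
[Editorial note] For readers wondering what such a bridge involves, below is a minimal sketch of 
a webhook receiver built only on the JDK's built-in HTTP server. It simply prints the notification 
body where a real bridge would format an email and hand it to a mail relay; it is not the actual 
service described in the comment above, whose details are not part of this thread.

{code}
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

/**
 * Hypothetical Travis webhook receiver. Travis posts form-encoded notifications
 * with a "payload" JSON field; a real bridge would parse that JSON, render a
 * summary email, and submit it to an SMTP relay addressed to the mailing list.
 */
public class TravisWebhookSketch {

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/travis", TravisWebhookSketch::handle);
        server.start();
        System.out.println("Listening for Travis notifications on :8080/travis");
    }

    private static void handle(HttpExchange exchange) throws IOException {
        byte[] body;
        try (InputStream in = exchange.getRequestBody()) {
            body = in.readAllBytes();
        }
        // Placeholder for "generate an email and send it to the mailing list".
        System.out.println("Travis notification received (" + body.length + " bytes):");
        System.out.println(new String(body, StandardCharsets.UTF_8));

        byte[] ok = "ok".getBytes(StandardCharsets.UTF_8);
        exchange.sendResponseHeaders(200, ok.length);
        try (OutputStream out = exchange.getResponseBody()) {
            out.write(ok);
        }
    }
}
{code}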


[GitHub] [flink] flinkbot edited a comment on issue #9230: [FLINK-13430][build] Configure sending travis build notifications to bui...@flink.apache.org

2019-08-20 Thread GitBox
flinkbot edited a comment on issue #9230: [FLINK-13430][build] Configure 
sending travis build notifications to bui...@flink.apache.org
URL: https://github.com/apache/flink/pull/9230#issuecomment-515092299
 
 
   ## CI report:
   
   * 465011b7a8f0ae4774f8c9c47ec03e8f58edaa2b : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120729445)
   * 117a85b4b34bc8e8032833cb33a8c29b025847bb : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/120847198)
   * 038bf5d0846020182ad14a01ae2491998235ac2d : SUCCESS 
[Build](https://travis-ci.com/flink-ci/flink/builds/122549760)
   * 5a58c51af7afb542887b2900fc35c8e82492a41c : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123316437)
   * 4c78488f133081281a4c5df362b358cdb80002fa : FAILURE 
[Build](https://travis-ci.com/flink-ci/flink/builds/123927663)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] flinkbot commented on issue #9495: [FLINK-13798][task] Refactor the process of checking stream status while emitting watermark in source

2019-08-20 Thread GitBox
flinkbot commented on issue #9495: [FLINK-13798][task] Refactor the process of 
checking stream status while emitting watermark in source
URL: https://github.com/apache/flink/pull/9495#issuecomment-523098969
 
 
   ## CI report:
   
   * 62cf805e72729b77857531f2053f58c4d9c10666 : PENDING 
[Build](https://travis-ci.com/flink-ci/flink/builds/123927636)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779216
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 
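// --- Editorial note: the quoted snippet is cut off here by the mail archive. ---
// For orientation, the walkthrough's main method continues roughly along these
// lines (a sketch based on the imports above, not necessarily the exact file
// contents under review):
//
//     StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
//
//     DataStream<Transaction> transactions = env
//         .addSource(new TransactionSource())
//         .name("transactions");
//
//     DataStream<Alert> alerts = transactions
//         .keyBy(Transaction::getAccountId)
//         .process(new FraudDetector())   // the KeyedProcessFunction developed in the walkthrough
//         .name("fraud-detector");
//
//     alerts.addSink(new PrintSinkFunction<>()).name("print-alerts");
//
//     env.execute("Fraud Detection");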

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315736409
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315731643
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779688
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779550
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779896
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315734913
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315734830
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779928
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779622
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify 
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the 
snapshot repository, you need to add a repository entry to your settings.xml. 
For details about this change, please refer to the Maven official document: http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315733101
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,897 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 
StreamExecutionEnvironment.getExecutionEnvironment();
+
+DataStream<Transaction> transactions = env
+.addSource(new TransactionSource())
+.name("transactions");
+
+

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315734988
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315739004
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,897 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 
StreamExecutionEnvironment.getExecutionEnvironment();
+
+DataStream<Transaction> transactions = env
+.addSource(new TransactionSource())
+.name("transactions");
+
+

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779309
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315774824
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315726277
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779147
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779090
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315733418
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315734307
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779717
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless 
site.is_stable %}
+
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify
the repository (-DarchetypeCatalog) via the commandline. If you wish to use the
snapshot repository, you need to add a repository entry to your settings.xml.
For details about this change, please refer to
<a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779039
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via the commandline. If you wish to use the snapshot repository, you need to add a repository entry to your settings.xml. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>.
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 
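
The ValueState, ValueStateDescriptor and Types imports quoted above point to a keyed, stateful detector. A hedged sketch of such a KeyedProcessFunction is shown below; the class name, thresholds and small-then-large rule are illustrative assumptions, not the exact code under review in this pull request.

{% highlight java %}
package frauddetection;

import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;
import org.apache.flink.walkthrough.common.entity.Alert;
import org.apache.flink.walkthrough.common.entity.Transaction;

public class FraudDetector extends KeyedProcessFunction<Long, Transaction, Alert> {

    private static final long serialVersionUID = 1L;

    // Illustrative thresholds: flag a small purchase that is immediately followed by a large one.
    private static final double SMALL_AMOUNT = 1.00;
    private static final double LARGE_AMOUNT = 500.00;

    // Keyed state: one boolean flag per account id, managed by Flink.
    private transient ValueState<Boolean> flagState;

    @Override
    public void open(Configuration parameters) {
        ValueStateDescriptor<Boolean> flagDescriptor =
                new ValueStateDescriptor<>("flag", Types.BOOLEAN);
        flagState = getRuntimeContext().getState(flagDescriptor);
    }

    @Override
    public void processElement(Transaction transaction, Context context, Collector<Alert> collector)
            throws Exception {
        // A non-null flag means the previous transaction for this account was small.
        Boolean lastTransactionWasSmall = flagState.value();

        if (lastTransactionWasSmall != null && transaction.getAmount() > LARGE_AMOUNT) {
            Alert alert = new Alert();
            alert.setId(transaction.getAccountId());
            collector.collect(alert);
        }

        // Reset the flag, then remember whether the current transaction was small.
        flagState.clear();
        if (transaction.getAmount() < SMALL_AMOUNT) {
            flagState.update(true);
        }
    }
}
{% endhighlight %}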

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315776102
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via the commandline. If you wish to use the snapshot repository, you need to add a repository entry to your settings.xml. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>.
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779418
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via the commandline. If you wish to use the snapshot repository, you need to add a repository entry to your settings.xml. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>.
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315737043
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via the commandline. If you wish to use the snapshot repository, you need to add a repository entry to your settings.xml. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>.
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315729123
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via the commandline. If you wish to use the snapshot repository, you need to add a repository entry to your settings.xml. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>.
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779589
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via the commandline. If you wish to use the snapshot repository, you need to add a repository entry to your settings.xml. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>.
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315779264
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via the commandline. If you wish to use the snapshot repository, you need to add a repository entry to your settings.xml. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>.
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 

[GitHub] [flink] NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting Started - DataStream Example Walkthrough

2019-08-20 Thread GitBox
NicoK commented on a change in pull request #9210: [FLINK-12746][docs] Getting 
Started - DataStream Example Walkthrough
URL: https://github.com/apache/flink/pull/9210#discussion_r315778858
 
 

 ##
 File path: docs/getting-started/walkthroughs/datastream_api.md
 ##
 @@ -0,0 +1,923 @@
+---
+title: "DataStream API"
+nav-id: datastreamwalkthrough
+nav-title: 'DataStream API'
+nav-parent_id: walkthroughs
+nav-pos: 2
+---
+
+
+Apache Flink offers a DataStream API for building robust, stateful streaming 
applications.
+It provides fine-grained control over state and time, which allows for the 
implementation of complex event-driven systems.
+
+* This will be replaced by the TOC
+{:toc}
+
+## What Are You Building? 
+
+Credit card fraud is a growing concern in the digital age.
+Criminals steal credit card numbers by running scams or hacking into insecure 
systems.
+Stolen numbers are tested by making one or more small purchases, often for a 
dollar or less.
+If that works, they then make more significant purchases to get items they can 
sell or keep for themselves.
+
+In this tutorial, you will build a fraud detection system for alerting on 
suspicious credit card transactions.
+Using a simple set of rules, you will see how Flink allows us to implement 
advanced business logic and act in real-time.
+
+## Prerequisites
+
+This walkthrough assumes that you have some familiarity with Java or Scala, 
but you should be able to follow along even if you are coming from a different 
programming language.
+
+## Help, I’m Stuck! 
+
+If you get stuck, check out the [community support 
resources](https://flink.apache.org/community.html).
+In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/community.html#mailing-lists) is consistently 
ranked as one of the most active of any Apache project and a great way to get 
help quickly.
+
+## How To Follow Along
+
+If you want to follow along, you will require a computer with:
+
+* Java 8 
+* Maven 
+
+A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
+
+{% panel **Note:** Each code block within this walkthrough may not contain the 
full surrounding class for brevity. The full code is available [at the bottom 
of the page](#final-application). %}
+
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-java \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+{% highlight bash %}
+$ mvn archetype:generate \
+-DarchetypeGroupId=org.apache.flink \
+-DarchetypeArtifactId=flink-walkthrough-datastream-scala \{% unless site.is_stable %}
+-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/ \{% endunless %}
+-DarchetypeVersion={{ site.version }} \
+-DgroupId=frauddetection \
+-DartifactId=frauddetection \
+-Dversion=0.1 \
+-Dpackage=spendreport \
+-DinteractiveMode=false
+{% endhighlight %}
+
+
+
+{% unless site.is_stable %}
+
+Note: For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via the commandline. If you wish to use the snapshot repository, you need to add a repository entry to your settings.xml. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">Maven official document</a>.
+
+{% endunless %}
+
+You can edit the `groupId`, `artifactId` and `package` if you like. With the 
above parameters,
+Maven will create a project with all the dependencies to complete this 
tutorial.
+After importing the project into your editor, you will see a file with the 
following code which you can run directly inside your IDE.
+
+
+
+ FraudDetectionJob.java
+
+{% highlight java %}
+package frauddetection;
+
+import org.apache.flink.api.common.state.ValueState;
+import org.apache.flink.api.common.state.ValueStateDescriptor;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
+import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;
+import org.apache.flink.util.Collector;
+import org.apache.flink.walkthrough.common.entity.Alert;
+import org.apache.flink.walkthrough.common.entity.Transaction;
+import org.apache.flink.walkthrough.common.source.TransactionSource;
+
+public class FraudDetectionJob {
+
+public static void main(String[] args) throws Exception {
+StreamExecutionEnvironment env = 
