[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16561008#comment-16561008
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205960985
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   Cool, it works now. 
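For readers following the thread: the diff hunk above is truncated before the body of `chr`. A minimal sketch, assuming only the semantics described in the Scaladoc (the object name is hypothetical, the null-input check is an added assumption, and this is not necessarily the code that was merged):

```scala
object ChrSketch {
  /** One-character string for an ASCII code in [0, 255]; null for null input or out-of-range codes. */
  def chr(ascii: Integer): String = {
    if (ascii == null || ascii < 0 || ascii > 255) {
      null
    } else {
      ascii.intValue.toChar.toString
    }
  }
}
```

Under these semantics, `chr(70)` would return "F" and `chr(300)` would return null.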


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, refer to:
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> For the CHR function, which converts an ASCII code to a character, refer to:
> [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Since "CHAR" is a keyword in many databases, we use the "CHR" keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9943) Support TaskManagerMetricQueryServicePaths msg in JobManager Actor

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9943?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560997#comment-16560997
 ] 

ASF GitHub Bot commented on FLINK-9943:
---

chuanlei commented on issue #6429: [FLINK-9943] Support 
TaskManagerMetricQueryServicePaths msg in JobManager Actor
URL: https://github.com/apache/flink/pull/6429#issuecomment-408650270
 
 
   @twalthr could you please have a look at this PR?
   
   We think Flink should expose its metrics directly via the JobManager so that 
we can develop third-party UIs.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Support TaskManagerMetricQueryServicePaths msg in JobManager Actor
> --
>
> Key: FLINK-9943
> URL: https://issues.apache.org/jira/browse/FLINK-9943
> Project: Flink
>  Issue Type: New Feature
>  Components: Core
>Affects Versions: 1.5.0, 1.5.1
>Reporter: Chuanlei Ni
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.6.0
>
>
> The reasons are as follows:
>  # AkkaJobManagerGateway wraps the JobManager actor ref to support this functionality by 
> first requesting RegisteredTaskManagers and then asking each TaskManager actor for its 
> metric query service path one by one. This procedure wastes resources; it would be more 
> efficient to support the functionality in the JobManager actor directly.
>  # We can expose the Flink metric system directly to external systems (such as the 
> Flink client and the like) to support more features in the future. For now, the metric 
> system is only partially exposed because Instance cannot (and should not) be transferred 
> remotely. This feature will make metrics exposure consistent.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9988) job manager does not respect property jobmanager.web.address

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9988?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560996#comment-16560996
 ] 

ASF GitHub Bot commented on FLINK-9988:
---

yanghua commented on issue #6441: [FLINK-9988][rest] Add deprecated keys for 
server bind address
URL: https://github.com/apache/flink/pull/6441#issuecomment-408650246
 
 
   +1


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


>   job manager does not respect property jobmanager.web.address
> --
>
> Key: FLINK-9988
> URL: https://issues.apache.org/jira/browse/FLINK-9988
> Project: Flink
>  Issue Type: Bug
>Affects Versions: 1.5.0, 1.6.0
>Reporter: Pavlo Petrychenko
>Assignee: Chesnay Schepler
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.5.3, 1.6.0
>
>
> As Flink does not have any built-in authentication mechanism, we used to 
> set up nginx in front of it and start the JobManager on 127.0.0.1.
> But starting from version 1.5.0 this no longer works:
> disregarding jobmanager.web.address, it always binds to 0.0.0.0.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9993) Wordcount end-to-end test in docker failed

2018-07-28 Thread vinoyang (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560995#comment-16560995
 ] 

vinoyang commented on FLINK-9993:
-

cc [~till.rohrmann]

> Wordcount end-to-end test in docker failed
> --
>
> Key: FLINK-9993
> URL: https://issues.apache.org/jira/browse/FLINK-9993
> Project: Flink
>  Issue Type: Bug
>Reporter: vinoyang
>Priority: Critical
>
> Log : [https://api.travis-ci.org/v3/job/409208942/log.txt]
> {code:java}
> cd134db5e982: Pull complete 
> Digest: 
> sha256:6a8cbe4335d1a5711a52912b684e30d6dbfab681a6733440ff7241b05a5deefd
> Status: Downloaded newer image for java:8-jre-alpine
>  ---> fdc893b19a14
> Step 2/16 : RUN apk add --no-cache bash snappy
>  ---> [Warning] IPv4 forwarding is disabled. Networking will not work.
>  ---> Running in d7767c00401c
> fetch http://dl-cdn.alpinelinux.org/alpine/v3.4/main/x86_64/APKINDEX.tar.gz
> WARNING: Ignoring 
> http://dl-cdn.alpinelinux.org/alpine/v3.4/main/x86_64/APKINDEX.tar.gz: 
> temporary error (try again later)
> fetch 
> http://dl-cdn.alpinelinux.org/alpine/v3.4/community/x86_64/APKINDEX.tar.gz
> WARNING: Ignoring 
> http://dl-cdn.alpinelinux.org/alpine/v3.4/community/x86_64/APKINDEX.tar.gz: 
> temporary error (try again later)
> ERROR: unsatisfiable constraints:
>   bash (missing):
> required by: world[bash]
>   snappy (missing):
> required by: world[snappy]
> The command '/bin/sh -c apk add --no-cache bash snappy' returned a non-zero 
> code: 2
> sort: cannot read: 
> /home/travis/build/apache/flink/flink-end-to-end-tests/test-scripts/temp-test-directory-48913516789/out/docker_wc_out*:
>  No such file or directory
> FAIL WordCount: Output hash mismatch.  Got d41d8cd98f00b204e9800998ecf8427e, 
> expected 72a690412be8928ba239c2da967328a5.
> head hexdump of actual:
> head: cannot open 
> ‘/home/travis/build/apache/flink/flink-end-to-end-tests/test-scripts/temp-test-directory-48913516789/out/docker_wc_out*’
>  for reading: No such file or directory
> grep: 
> /home/travis/build/apache/flink/flink-dist/target/flink-1.7-SNAPSHOT-bin/flink-1.7-SNAPSHOT/log/*.out:
>  No such file or directory
> [FAIL] 'Wordcount end-to-end test in docker env' failed after 0 minutes and 
> 35 seconds! Test exited with exit code 1
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-9993) Wordcount end-to-end test in docker failed

2018-07-28 Thread vinoyang (JIRA)
vinoyang created FLINK-9993:
---

 Summary: Wordcount end-to-end test in docker failed
 Key: FLINK-9993
 URL: https://issues.apache.org/jira/browse/FLINK-9993
 Project: Flink
  Issue Type: Bug
Reporter: vinoyang


Log : [https://api.travis-ci.org/v3/job/409208942/log.txt]
{code:java}
cd134db5e982: Pull complete 
Digest: 
sha256:6a8cbe4335d1a5711a52912b684e30d6dbfab681a6733440ff7241b05a5deefd
Status: Downloaded newer image for java:8-jre-alpine
 ---> fdc893b19a14
Step 2/16 : RUN apk add --no-cache bash snappy
 ---> [Warning] IPv4 forwarding is disabled. Networking will not work.
 ---> Running in d7767c00401c
fetch http://dl-cdn.alpinelinux.org/alpine/v3.4/main/x86_64/APKINDEX.tar.gz
WARNING: Ignoring 
http://dl-cdn.alpinelinux.org/alpine/v3.4/main/x86_64/APKINDEX.tar.gz: 
temporary error (try again later)
fetch http://dl-cdn.alpinelinux.org/alpine/v3.4/community/x86_64/APKINDEX.tar.gz
WARNING: Ignoring 
http://dl-cdn.alpinelinux.org/alpine/v3.4/community/x86_64/APKINDEX.tar.gz: 
temporary error (try again later)
ERROR: unsatisfiable constraints:
  bash (missing):
required by: world[bash]
  snappy (missing):
required by: world[snappy]
The command '/bin/sh -c apk add --no-cache bash snappy' returned a non-zero 
code: 2
sort: cannot read: 
/home/travis/build/apache/flink/flink-end-to-end-tests/test-scripts/temp-test-directory-48913516789/out/docker_wc_out*:
 No such file or directory
FAIL WordCount: Output hash mismatch.  Got d41d8cd98f00b204e9800998ecf8427e, 
expected 72a690412be8928ba239c2da967328a5.
head hexdump of actual:
head: cannot open 
‘/home/travis/build/apache/flink/flink-end-to-end-tests/test-scripts/temp-test-directory-48913516789/out/docker_wc_out*’
 for reading: No such file or directory
grep: 
/home/travis/build/apache/flink/flink-dist/target/flink-1.7-SNAPSHOT-bin/flink-1.7-SNAPSHOT/log/*.out:
 No such file or directory

[FAIL] 'Wordcount end-to-end test in docker env' failed after 0 minutes and 35 
seconds! Test exited with exit code 1

{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (FLINK-9984) Add a byte array table format factory

2018-07-28 Thread Ruidong Li (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-9984?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ruidong Li reassigned FLINK-9984:
-

Assignee: Ruidong Li

> Add a byte array table format factory
> -
>
> Key: FLINK-9984
> URL: https://issues.apache.org/jira/browse/FLINK-9984
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: Timo Walther
>Assignee: Ruidong Li
>Priority: Major
>
> Sometimes it might be useful to just read or write a plain byte array into 
> Kafka or other connectors. We should add a simple byte array 
> SerializationSchemaFactory and DeserializationSchemaFactory.
> See {{org.apache.flink.formats.json.JsonRowFormatFactory}} for a reference.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9963) Add a string table format factory

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9963?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560994#comment-16560994
 ] 

ASF GitHub Bot commented on FLINK-9963:
---

Xpray commented on issue #6447: [FLINK-9963][TableAPI & SQL] Add a string table 
format factory
URL: https://github.com/apache/flink/pull/6447#issuecomment-408649986
 
 
   @twalthr, it seems that this PR could be made more generic as a 
`SingleColumnRowFormat`, so FLINK-9984 could then be a concrete implementation of it.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add a string table format factory
> -
>
> Key: FLINK-9963
> URL: https://issues.apache.org/jira/browse/FLINK-9963
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: Timo Walther
>Assignee: Ruidong Li
>Priority: Major
>  Labels: pull-request-available
>
> Sometimes it might be useful to just read or write a string into Kafka or 
> other connectors. We should add a simple string 
> {{SerializationSchemaFactory}} and {{DeserializationSchemaFactory}}. How we 
> want to represent all data types and nested types is still up for discussion. 
> We could also support just a single string field?
> Schema derivation should be supported by the factories.
> See {{org.apache.flink.formats.json.JsonRowFormatFactory}} for a reference.
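To make the idea concrete, here is a minimal sketch of the kind of schema such a factory could produce, assuming the {{org.apache.flink.api.common.serialization}} interfaces (the class name is illustrative; Flink already ships a similar {{SimpleStringSchema}}, and the PR's actual classes may differ):

{code:scala}
import java.nio.charset.StandardCharsets

import org.apache.flink.api.common.serialization.{DeserializationSchema, SerializationSchema}
import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}

// Single-field UTF-8 string (de)serialization schema, sketched for illustration only.
class StringFieldSchema extends SerializationSchema[String] with DeserializationSchema[String] {
  override def serialize(element: String): Array[Byte] = element.getBytes(StandardCharsets.UTF_8)
  override def deserialize(message: Array[Byte]): String = new String(message, StandardCharsets.UTF_8)
  override def isEndOfStream(nextElement: String): Boolean = false
  override def getProducedType: TypeInformation[String] = BasicTypeInfo.STRING_TYPE_INFO
}
{code}

The format factory would then be responsible for creating such a schema from the table connector properties.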



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9958) Fix potential NPE for delta iteration of DataSet

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560993#comment-16560993
 ] 

ASF GitHub Bot commented on FLINK-9958:
---

Xpray commented on issue #6426: [FLINK-9958][DataSet] Fix potential NPE for 
delta iteration of DataSet
URL: https://github.com/apache/flink/pull/6426#issuecomment-408649629
 
 
   Hi @yanghua, thanks for your advice. I've updated the PR.
   Best


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Fix potential NPE for delta iteration of DataSet
> 
>
> Key: FLINK-9958
> URL: https://issues.apache.org/jira/browse/FLINK-9958
> Project: Flink
>  Issue Type: Bug
>  Components: DataSet API
>Reporter: Ruidong Li
>Assignee: Ruidong Li
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560992#comment-16560992
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

yanghua commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR 
function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205959875
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   That's true. The code before the latest commit also fails for `TINYINT` and 
`SMALLINT`. The latest commit passes these test cases.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, refer to:
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> For the CHR function, which converts an ASCII code to a character, refer to:
> [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Since "CHAR" is a keyword in many databases, we use the "CHR" keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560991#comment-16560991
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205959747
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   Never mind. You can update the PR with the code you think is right, and I 
will take another look.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, refer to:
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> For the CHR function, which converts an ASCII code to a character, refer to:
> [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Since "CHAR" is a keyword in many databases, we use the "CHR" keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560990#comment-16560990
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205959687
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   I tested with `TINYINT`, `SMALLINT`, and `BIGINT`, but they all failed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, refer to:
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> For the CHR function, which converts an ASCII code to a character, refer to:
> [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Since "CHAR" is a keyword in many databases, we use the "CHR" keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560989#comment-16560989
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

yanghua commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR 
function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205959654
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   Yes, as I said, I changed `Integer` to `Long` in my local env and the tests 
pass. I have not updated this PR yet; I just wanted to discuss with you whether 
that solution is OK. Did the code you tested fail because of `SMALLINT` or 
`TINYINT`?
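A sketch of the `Long`-typed variant mentioned above (an illustration of the idea only, not the PR's actual change; the object name is hypothetical):

```scala
object ChrLongSketch {
  // Accepting Long lets the narrower SQL integer types (TINYINT/SMALLINT/INT/BIGINT) widen to it.
  def chr(ascii: Long): String =
    if (ascii < 0 || ascii > 255) null else ascii.toChar.toString
}
```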


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, refer to:
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> For the CHR function, which converts an ASCII code to a character, refer to:
> [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Since "CHAR" is a keyword in many databases, we use the "CHR" keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560988#comment-16560988
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205959590
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   It's weird. I checked out your latest branch just now and found that the 
exception is still thrown. Could you add the tests to the PR? Thanks a lot.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, refer to:
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> For the CHR function, which converts an ASCII code to a character, refer to:
> [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Since "CHAR" is a keyword in many databases, we use the "CHR" keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560987#comment-16560987
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205959590
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   It's weird. I checked out your latest branch just now and found that the 
exception is still thrown.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, refer to:
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> For the CHR function, which converts an ASCII code to a character, refer to:
> [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Since "CHAR" is a keyword in many databases, we use the "CHR" keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560986#comment-16560986
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205959590
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   It's weird. I checked out your latest branch and found that the exception is 
still thrown.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, refer to:
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> For the CHR function, which converts an ASCII code to a character, refer to:
> [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Since "CHAR" is a keyword in many databases, we use the "CHR" keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (FLINK-9982) NPE in EnumValueSerializer#copy

2018-07-28 Thread zhangminglei (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-9982?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

zhangminglei reassigned FLINK-9982:
---

Assignee: zhangminglei  (was: dalongliu)

> NPE in EnumValueSerializer#copy
> ---
>
> Key: FLINK-9982
> URL: https://issues.apache.org/jira/browse/FLINK-9982
> Project: Flink
>  Issue Type: Bug
>Affects Versions: 1.3.3, 1.4.1, 1.4.2, 1.5.1
>Reporter: zhangminglei
>Assignee: zhangminglei
>Priority: Major
>
> When executing the Flink job with Flink version 1.3.2, we met the error below.
> {code:java}
> java.lang.NullPointerException
>   at 
> org.apache.flink.api.scala.typeutils.EnumValueSerializer.copy(EnumValueSerializer.scala:50)
>   at 
> org.apache.flink.api.scala.typeutils.EnumValueSerializer.copy(EnumValueSerializer.scala:36)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:101)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:32)
>   at 
> org.apache.flink.api.common.typeutils.base.GenericArraySerializer.copy(GenericArraySerializer.java:92)
>   at 
> org.apache.flink.api.common.typeutils.base.GenericArraySerializer.copy(GenericArraySerializer.java:42)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:101)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:32)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:101)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:32)
>   at 
> org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:526)
>   at 
> org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:503)
>   at 
> org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:483)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:891)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:869)
>   at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:304)
>   at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:393)
>   at 
> org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecord(AbstractFetcher.java:233)
>   at 
> org.apache.flink.streaming.connectors.kafka.internals.SimpleConsumerThread.run(SimpleConsumerThread.java:385)
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (FLINK-9963) Add a string table format factory

2018-07-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-9963?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated FLINK-9963:
--
Labels: pull-request-available  (was: )

> Add a string table format factory
> -
>
> Key: FLINK-9963
> URL: https://issues.apache.org/jira/browse/FLINK-9963
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: Timo Walther
>Assignee: Ruidong Li
>Priority: Major
>  Labels: pull-request-available
>
> Sometimes it might be useful to just read or write a string into Kafka or 
> other connectors. We should add a simple string 
> {{SerializationSchemaFactory}} and {{DeserializationSchemaFactory}}. How we 
> want to represent all data types and nested types is still up for discussion. 
> We could also support just a single string field?
> Schema derivation should be supported by the factories.
> See {{org.apache.flink.formats.json.JsonRowFormatFactory}} for a reference.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9963) Add a string table format factory

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9963?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560982#comment-16560982
 ] 

ASF GitHub Bot commented on FLINK-9963:
---

Xpray opened a new pull request #6447: [FLINK-9963][TableAPI & SQL] Add a 
string table format factory
URL: https://github.com/apache/flink/pull/6447
 
 
   
   ## What is the purpose of the change
   
   Add a string table format factory
   
   
   ## Brief change log
 - *add StringRowFormatFactory*
   
   
   ## Verifying this change
   
   
   This change added tests and can be verified as follows:
   test cases in flink-formats/flink-string
   
   
   ## Does this pull request potentially affect one of the following parts:
   
 - Dependencies (does it add or upgrade a dependency):  no
 - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`:  no
 - The serializers:  no
 - The runtime per-record code paths (performance sensitive):  no
 - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: no
 - The S3 file system connector:  no
   ## Documentation
   
 - Does this pull request introduce a new feature?  no
 - If yes, how is the feature documented?  no
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add a string table format factory
> -
>
> Key: FLINK-9963
> URL: https://issues.apache.org/jira/browse/FLINK-9963
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: Timo Walther
>Assignee: Ruidong Li
>Priority: Major
>  Labels: pull-request-available
>
> Sometimes it might be useful to just read or write a string into Kafka or 
> other connectors. We should add a simple string 
> {{SerializationSchemaFactory}} and {{DeserializationSchemaFactory}}. How we 
> want to represent all data types and nested types is still up for discussion. 
> We could also support just a single string field?
> Schema derivation should be supported by the factories.
> See {{org.apache.flink.formats.json.JsonRowFormatFactory}} for a reference.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (FLINK-9985) Incorrect parameter order in document

2018-07-28 Thread zhangminglei (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-9985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

zhangminglei reassigned FLINK-9985:
---

Assignee: zhangminglei  (was: dalongliu)

> Incorrect parameter order in document
> -
>
> Key: FLINK-9985
> URL: https://issues.apache.org/jira/browse/FLINK-9985
> Project: Flink
>  Issue Type: Bug
>  Components: Documentation
>Affects Versions: 1.5.1
>Reporter: zhangminglei
>Assignee: zhangminglei
>Priority: Major
>
> https://ci.apache.org/projects/flink/flink-docs-release-1.5/dev/stream/operators/windows.html#incremental-window-aggregation-with-foldfunction
> {code:java}
> public Tuple3<String, Long, Integer> fold(Tuple3<String, Long, Integer> acc, 
> SensorReading s) {
>   Integer cur = acc.getField(2);
>   acc.setField(2, cur + 1); // incorrect parameter order , it should be 
> acc.setField(cur + 1, 2)
>   return acc;
>   }
> {code}
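A hedged sketch of the corrected call, assuming the field types of the docs example (Flink's {{Tuple.setField}} takes the value first and the zero-based position second; the object and method names below are hypothetical):

{code:scala}
import org.apache.flink.api.java.tuple.Tuple3

object FoldSketch {
  // Illustration only: increment the counter stored in field 2 of the accumulator tuple.
  def incrementCount(acc: Tuple3[String, java.lang.Long, Integer]): Tuple3[String, java.lang.Long, Integer] = {
    val cur: Integer = acc.getField(2)
    acc.setField(cur + 1, 2) // value first, then field position
    acc
  }
}
{code}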



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-9992) FsStorageLocationReferenceTest#testEncodeAndDecode failed in Travis CI

2018-07-28 Thread vinoyang (JIRA)
vinoyang created FLINK-9992:
---

 Summary: FsStorageLocationReferenceTest#testEncodeAndDecode failed 
in Travis CI
 Key: FLINK-9992
 URL: https://issues.apache.org/jira/browse/FLINK-9992
 Project: Flink
  Issue Type: Bug
Reporter: vinoyang


{code:java}
testEncodeAndDecode(org.apache.flink.runtime.state.filesystem.FsStorageLocationReferenceTest)
  Time elapsed: 0.027 sec  <<< ERROR!
java.lang.IllegalArgumentException: java.net.URISyntaxException: Illegal 
character in hostname at index 5: 
gl://碪⯶㪴]ឪ嵿⎐䪀筪ᆶ歑ᆂ玚୓䇷ノⳡ೯43575/䡷ᦼ☶⨩䚩筶ࢊණ⣁᳊尯/彡䫼畒伈森削㔞/缳漸⩧勎㓘癐⍖ᾐ䘽㼺䨶/粉掩㤡⪌⎏㆐罠Ꮨㆆ䤱ൎ堉儾
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.parseHostname(URI.java:3387)
at java.net.URI$Parser.parseServer(URI.java:3236)
at java.net.URI$Parser.parseAuthority(URI.java:3155)
at java.net.URI$Parser.parseHierarchical(URI.java:3097)
at java.net.URI$Parser.parse(URI.java:3053)
at java.net.URI.(URI.java:746)
at org.apache.flink.core.fs.Path.initialize(Path.java:247)
at org.apache.flink.core.fs.Path.(Path.java:217)
at 
org.apache.flink.runtime.state.filesystem.FsStorageLocationReferenceTest.randomPath(FsStorageLocationReferenceTest.java:88)
at 
org.apache.flink.runtime.state.filesystem.FsStorageLocationReferenceTest.testEncodeAndDecode(FsStorageLocationReferenceTest.java:41)
{code}
The log is here: https://travis-ci.org/apache/flink/jobs/409430886
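For context, the failure is not specific to Flink's Path: java.net.URI rejects any hostname that contains an illegal character, which the random paths built by randomPath() can easily produce. A minimal standalone sketch of the symptom (the ']' stands in for the random unicode characters):
{code:java}
import java.net.URI;
import java.net.URISyntaxException;

public class IllegalHostnameRepro {
  public static void main(String[] args) {
    try {
      new URI("gl://bad]host:43575/some/path"); // ']' is not a legal hostname character
    } catch (URISyntaxException e) {
      System.out.println(e.getMessage()); // "Illegal character in hostname at index ..."
    }
  }
}
{code}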

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-9991) Add regex_replace supported in TableAPI and SQL

2018-07-28 Thread vinoyang (JIRA)
vinoyang created FLINK-9991:
---

 Summary: Add regex_replace supported in TableAPI and SQL
 Key: FLINK-9991
 URL: https://issues.apache.org/jira/browse/FLINK-9991
 Project: Flink
  Issue Type: Sub-task
  Components: Table API  SQL
Reporter: vinoyang
Assignee: vinoyang


regexp_replace is a very useful function for processing strings.
For example:
{code:java}
regexp_replace("foobar", "oo|ar", "") // returns 'fb'
{code}
It is supported as a UDF in Hive; for more details, please see [1].

[1]: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF
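A plain-Java sketch of the expected semantics (this is just java.util.regex via String#replaceAll, not the proposed Flink implementation):
{code:java}
public class RegexpReplaceSketch {
  public static void main(String[] args) {
    // every match of the pattern is replaced by the replacement string
    System.out.println("foobar".replaceAll("oo|ar", "")); // prints "fb"
  }
}
{code}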

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (FLINK-9990) Add regex_extract supported in TableAPI and SQL

2018-07-28 Thread vinoyang (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-9990?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

vinoyang updated FLINK-9990:

Description: 
regex_extract is a very useful function: it returns a string based on a regex pattern and a group index.

For example:
{code:java}
regexp_extract('foothebar', 'foo(.*?)(bar)', 2) // returns 'bar'
{code}

It is provided as a UDF in Hive; for more details, please see [1].

[1]: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF

  was:
regex_extract is a very useful function, it returns a string based on a regex 
pattern and a index.

For example : 
{code:java}
regexp_extract('foothebar', 'foo(.*?)(bar)', 2) // returns 'bar.'

{code}


> Add regex_extract supported in TableAPI and SQL
> ---
>
> Key: FLINK-9990
> URL: https://issues.apache.org/jira/browse/FLINK-9990
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API  SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>
> regex_extract is a very useful function: it returns a string based on a regex pattern and a group index.
> For example:
> {code:java}
> regexp_extract('foothebar', 'foo(.*?)(bar)', 2) // returns 'bar'
> {code}
> It is provided as a UDF in Hive; for more details, please see [1].
> [1]: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-9990) Add regex_extract supported in TableAPI and SQL

2018-07-28 Thread vinoyang (JIRA)
vinoyang created FLINK-9990:
---

 Summary: Add regex_extract supported in TableAPI and SQL
 Key: FLINK-9990
 URL: https://issues.apache.org/jira/browse/FLINK-9990
 Project: Flink
  Issue Type: Sub-task
  Components: Table API  SQL
Reporter: vinoyang
Assignee: vinoyang


regex_extract is a very useful function: it returns a string based on a regex pattern and a group index.

For example:
{code:java}
regexp_extract('foothebar', 'foo(.*?)(bar)', 2) // returns 'bar'
{code}
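A plain-Java sketch of the expected semantics (java.util.regex only, not the proposed Flink implementation; the third argument selects the capture group):
{code:java}
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexpExtractSketch {
  public static void main(String[] args) {
    Matcher m = Pattern.compile("foo(.*?)(bar)").matcher("foothebar");
    if (m.find()) {
      System.out.println(m.group(2)); // prints "bar"
    }
  }
}
{code}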



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-8302) Support shift_left and shift_right in TableAPI

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-8302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560975#comment-16560975
 ] 

ASF GitHub Bot commented on FLINK-8302:
---

yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & 
SQL] Add SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445#discussion_r205958595
 
 

 ##
 File path: docs/dev/table/tableApi.md
 ##
 @@ -3598,6 +3598,28 @@ numeric1 % numeric2
   
 
 
+
+  
+{% highlight scala %}
+shiftLeft(numeric1, numeric2)
 
 Review comment:
   add Java doc for shiftLeft


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Support shift_left and shift_right in TableAPI
> --
>
> Key: FLINK-8302
> URL: https://issues.apache.org/jira/browse/FLINK-8302
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API  SQL
>Reporter: DuBin
>Priority: Major
>  Labels: features, pull-request-available
> Fix For: 1.6.0
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> Add shift_left and shift_right support in TableAPI, shift_left(input, n) act 
> as input << n, shift_right(input, n) act as input >> n.
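> A minimal Java sketch of the intended semantics (plain bit shifts; the values are illustrative):
> {code:java}
> public class ShiftSemantics {
>   public static void main(String[] args) {
>     int input = 5;
>     System.out.println(input << 1); // shift_left(5, 1)  -> 10
>     System.out.println(input >> 1); // shift_right(5, 1) -> 2
>   }
> }
> {code}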



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-8302) Support shift_left and shift_right in TableAPI

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-8302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560977#comment-16560977
 ] 

ASF GitHub Bot commented on FLINK-8302:
---

yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & 
SQL] Add SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445#discussion_r205958632
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/api/scala/expressionDsl.scala
 ##
 @@ -831,6 +831,16 @@ trait ImplicitExpressionOperations {
 */
   def notBetween(lowerBound: Expression, upperBound: Expression) =
 NotBetween(expr, lowerBound, upperBound)
+
+  /*
+   * Left shift
 
 Review comment:
   Providing more detail and ending with "." would look better to me.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Support shift_left and shift_right in TableAPI
> --
>
> Key: FLINK-8302
> URL: https://issues.apache.org/jira/browse/FLINK-8302
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API  SQL
>Reporter: DuBin
>Priority: Major
>  Labels: features, pull-request-available
> Fix For: 1.6.0
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> Add shift_left and shift_right support in TableAPI, shift_left(input, n) act 
> as input << n, shift_right(input, n) act as input >> n.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-8302) Support shift_left and shift_right in TableAPI

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-8302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560976#comment-16560976
 ] 

ASF GitHub Bot commented on FLINK-8302:
---

yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & 
SQL] Add SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445#discussion_r205958599
 
 

 ##
 File path: docs/dev/table/tableApi.md
 ##
 @@ -3598,6 +3598,28 @@ numeric1 % numeric2
   
 
 
+
+  
+{% highlight scala %}
+shiftLeft(numeric1, numeric2)
+{% endhighlight %}
+  
+  
+Returns numeric1 shifted left of numeric2. The result 
is numeric1 << numeric2 
+  
+
+
+
+  
+{% highlight scala %}
+shiftRight(numeric1, numeric2)
 
 Review comment:
   add Java doc for shiftRight


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Support shift_left and shift_right in TableAPI
> --
>
> Key: FLINK-8302
> URL: https://issues.apache.org/jira/browse/FLINK-8302
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API  SQL
>Reporter: DuBin
>Priority: Major
>  Labels: features, pull-request-available
> Fix For: 1.6.0
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> Add shift_left and shift_right support in TableAPI, shift_left(input, n) act 
> as input << n, shift_right(input, n) act as input >> n.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-8302) Support shift_left and shift_right in TableAPI

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-8302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560978#comment-16560978
 ] 

ASF GitHub Bot commented on FLINK-8302:
---

yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & 
SQL] Add SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445#discussion_r205958635
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/api/scala/expressionDsl.scala
 ##
 @@ -831,6 +831,16 @@ trait ImplicitExpressionOperations {
 */
   def notBetween(lowerBound: Expression, upperBound: Expression) =
 NotBetween(expr, lowerBound, upperBound)
+
+  /*
+   * Left shift
+   */
+  def shiftLeft(right: Expression) = ShiftLeft(expr, right)
+
+  /*
+   * Right shift
 
 Review comment:
   Providing more detail and ending with "." would look better to me.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Support shift_left and shift_right in TableAPI
> --
>
> Key: FLINK-8302
> URL: https://issues.apache.org/jira/browse/FLINK-8302
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API  SQL
>Reporter: DuBin
>Priority: Major
>  Labels: features, pull-request-available
> Fix For: 1.6.0
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> Add shift_left and shift_right support in TableAPI, shift_left(input, n) act 
> as input << n, shift_right(input, n) act as input >> n.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & SQL] Add SHIFT_LEFT and SHIFT_RIGHT

2018-07-28 Thread GitBox
yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & 
SQL] Add SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445#discussion_r205958595
 
 

 ##
 File path: docs/dev/table/tableApi.md
 ##
 @@ -3598,6 +3598,28 @@ numeric1 % numeric2
   
 
 
+
+  
+{% highlight scala %}
+shiftLeft(numeric1, numeric2)
 
 Review comment:
   add Java doc for shiftLeft


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & SQL] Add SHIFT_LEFT and SHIFT_RIGHT

2018-07-28 Thread GitBox
yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & 
SQL] Add SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445#discussion_r205958632
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/api/scala/expressionDsl.scala
 ##
 @@ -831,6 +831,16 @@ trait ImplicitExpressionOperations {
 */
   def notBetween(lowerBound: Expression, upperBound: Expression) =
 NotBetween(expr, lowerBound, upperBound)
+
+  /*
+   * Left shift
 
 Review comment:
   Providing more detail and ending with "." would look better to me.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & SQL] Add SHIFT_LEFT and SHIFT_RIGHT

2018-07-28 Thread GitBox
yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & 
SQL] Add SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445#discussion_r205958635
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/api/scala/expressionDsl.scala
 ##
 @@ -831,6 +831,16 @@ trait ImplicitExpressionOperations {
 */
   def notBetween(lowerBound: Expression, upperBound: Expression) =
 NotBetween(expr, lowerBound, upperBound)
+
+  /*
+   * Left shift
+   */
+  def shiftLeft(right: Expression) = ShiftLeft(expr, right)
+
+  /*
+   * Right shift
 
 Review comment:
   Providing more detail and ending with "." would look better to me.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & SQL] Add SHIFT_LEFT and SHIFT_RIGHT

2018-07-28 Thread GitBox
yanghua commented on a change in pull request #6445: [FLINK-8302][Table API & 
SQL] Add SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445#discussion_r205958599
 
 

 ##
 File path: docs/dev/table/tableApi.md
 ##
 @@ -3598,6 +3598,28 @@ numeric1 % numeric2
   
 
 
+
+  
+{% highlight scala %}
+shiftLeft(numeric1, numeric2)
+{% endhighlight %}
+  
+  
+Returns numeric1 shifted left of numeric2. The result 
is numeric1 << numeric2 
+  
+
+
+
+  
+{% highlight scala %}
+shiftRight(numeric1, numeric2)
 
 Review comment:
   add Java doc for shiftRight


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Comment Edited] (FLINK-7588) Document RocksDB tuning for spinning disks

2018-07-28 Thread Ted Yu (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-7588?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16258309#comment-16258309
 ] 

Ted Yu edited comment on FLINK-7588 at 7/29/18 1:33 AM:


bq. Be careful about whether you have enough memory to keep all bloom filters

Other than the above being tricky, the other guidelines are actionable.


was (Author: yuzhih...@gmail.com):
bq. Be careful about whether you have enough memory to keep all bloom filters

Other than the above being tricky, the other guidelines are actionable .

> Document RocksDB tuning for spinning disks
> --
>
> Key: FLINK-7588
> URL: https://issues.apache.org/jira/browse/FLINK-7588
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Reporter: Ted Yu
>Priority: Major
>  Labels: performance
>
> In docs/ops/state/large_state_tuning.md , it was mentioned that:
> bq. the default configuration is tailored towards SSDs and performs 
> suboptimal on spinning disks
> We should add recommendation targeting spinning disks:
> https://github.com/facebook/rocksdb/wiki/RocksDB-Tuning-Guide#difference-of-spinning-disk



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9877) Add separate docs page for different join types in DataStream API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560935#comment-16560935
 ] 

ASF GitHub Bot commented on FLINK-9877:
---

eliaslevy commented on issue #6407: [FLINK-9877][docs] Add documentation page 
for different datastream joins
URL: https://github.com/apache/flink/pull/6407#issuecomment-408643197
 
 
   The documentation needs to discuss how the watermarks are delayed, if they are.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add separate docs page for different join types in DataStream API
> -
>
> Key: FLINK-9877
> URL: https://issues.apache.org/jira/browse/FLINK-9877
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Florian Schmidt
>Assignee: Florian Schmidt
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.6.0
>
>
> https://github.com/apache/flink/pull/6407



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] eliaslevy commented on issue #6407: [FLINK-9877][docs] Add documentation page for different datastream joins

2018-07-28 Thread GitBox
eliaslevy commented on issue #6407: [FLINK-9877][docs] Add documentation page 
for different datastream joins
URL: https://github.com/apache/flink/pull/6407#issuecomment-408643197
 
 
   The documentation needs to discuss how the watermarks are delayed, if they are.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid deprecated Akka methods

2018-07-28 Thread GitBox
TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205956054
 
 

 ##
 File path: 
flink-mesos/src/main/java/org/apache/flink/mesos/runtime/clusterframework/MesosApplicationMasterRunner.java
 ##
 @@ -374,11 +379,14 @@ protected int runPrivileged(Configuration config, 
Configuration dynamicPropertie
}
 
if (actorSystem != null) {
-   try {
-   actorSystem.shutdown();
-   } catch (Throwable tt) {
-   LOG.error("Error shutting down actor 
system", tt);
-   }
+   actorSystem.terminate().onComplete(
+   new OnComplete() {
+   public void 
onComplete(Throwable failure, Terminated result) {
+   if (failure != null) {
+   
LOG.error("Error shutting down actor system", failure);
+   }
+   }
+   }, 
org.apache.flink.runtime.concurrent.Executors.directExecutionContext());
 
 Review comment:
   also here


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-9240) Avoid deprecated akka methods

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560923#comment-16560923
 ] 

ASF GitHub Bot commented on FLINK-9240:
---

TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205956045
 
 

 ##
 File path: 
flink-runtime/src/main/java/org/apache/flink/runtime/akka/DefaultQuarantineHandler.java
 ##
 @@ -65,11 +66,13 @@ public void hasQuarantined(String remoteSystem, 
ActorSystem actorSystem) {
 
private void shutdownActorSystem(ActorSystem actorSystem) {
// shut the actor system down
-   actorSystem.shutdown();
+   actorSystem.terminate();
 
try {
// give it some time to complete the shutdown
-   actorSystem.awaitTermination(timeout);
+   Await.ready(actorSystem.whenTerminated(), timeout);
+   } catch (InterruptedException | 
java.util.concurrent.TimeoutException e) {
 
 Review comment:
   `java.util.concurrent.TimeoutException` could be simplified.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Avoid deprecated akka methods
> -
>
> Key: FLINK-9240
> URL: https://issues.apache.org/jira/browse/FLINK-9240
> Project: Flink
>  Issue Type: Improvement
>  Components: Client, Local Runtime, Mesos, Tests, Web Client, YARN
>Affects Versions: 1.6.0
>Reporter: Arnout Engelen
>Priority: Minor
>  Labels: pull-request-available
>
> Several Akka functions that are widely-used in Flink have been deprecated for 
> a long time, such as ActorSystem#shutdown() and 
> ActorSystem.awaitTermination(). It would be nice to update those to use 
> non-deprecated functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9240) Avoid deprecated akka methods

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560922#comment-16560922
 ] 

ASF GitHub Bot commented on FLINK-9240:
---

TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205956009
 
 

 ##
 File path: 
flink-runtime/src/main/scala/org/apache/flink/runtime/jobmanager/JobManager.scala
 ##
 @@ -2288,11 +2291,10 @@ object JobManager {
 catch {
   case t: Throwable =>
 LOG.error("Error while starting up JobManager", t)
-try {
-  jobManagerSystem.shutdown()
-} catch {
-  case tt: Throwable => LOG.warn("Could not cleanly shut down actor 
system", tt)
-}
+jobManagerSystem.terminate().onComplete {
+  case scala.util.Success(_) =>
+  case scala.util.Failure(tt) => LOG.warn("Could not cleanly shut down 
actor system", tt)
+
}(org.apache.flink.runtime.concurrent.Executors.directExecutionContext())
 
 Review comment:
   Also here: could we use `import static`? In Scala, I think it would just be a plain `import`.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Avoid deprecated akka methods
> -
>
> Key: FLINK-9240
> URL: https://issues.apache.org/jira/browse/FLINK-9240
> Project: Flink
>  Issue Type: Improvement
>  Components: Client, Local Runtime, Mesos, Tests, Web Client, YARN
>Affects Versions: 1.6.0
>Reporter: Arnout Engelen
>Priority: Minor
>  Labels: pull-request-available
>
> Several Akka functions that are widely-used in Flink have been deprecated for 
> a long time, such as ActorSystem#shutdown() and 
> ActorSystem.awaitTermination(). It would be nice to update those to use 
> non-deprecated functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid deprecated Akka methods

2018-07-28 Thread GitBox
TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205956013
 
 

 ##
 File path: 
flink-runtime/src/main/scala/org/apache/flink/runtime/jobmanager/JobManager.scala
 ##
 @@ -1867,7 +1867,10 @@ class JobManager(
   FiniteDuration(10, SECONDS)).start()
 
 // Shutdown and discard all queued messages
-context.system.shutdown()
+context.system.terminate().onComplete {
+  case scala.util.Success(_) =>
+  case scala.util.Failure(t) => log.warn("Could not cleanly shut down 
actor system", t)
+}(org.apache.flink.runtime.concurrent.Executors.directExecutionContext())
 
 Review comment:
   and here


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid deprecated Akka methods

2018-07-28 Thread GitBox
TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205955993
 
 

 ##
 File path: 
flink-yarn/src/main/java/org/apache/flink/yarn/YarnApplicationMasterRunner.java
 ##
 @@ -401,11 +406,14 @@ protected int runApplicationMaster(Configuration config) 
{
}
 
if (actorSystem != null) {
-   try {
-   actorSystem.shutdown();
-   } catch (Throwable tt) {
-   LOG.error("Error shutting down actor 
system", tt);
-   }
+   actorSystem.terminate().onComplete(
+   new OnComplete() {
+   public void 
onComplete(Throwable failure, Terminated result) {
+   if (failure != null) {
+   
LOG.error("Error shutting down actor system", failure);
+   }
+   }
+   }, 
org.apache.flink.runtime.concurrent.Executors.directExecutionContext());
 
 Review comment:
   This PR looks good to me. One small question: could this be simplified with `import static`?
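   For illustration, a sketch of what that simplification could look like (hypothetical; it only assumes a static import of `Executors.directExecutionContext`):
   
   ```
   import static org.apache.flink.runtime.concurrent.Executors.directExecutionContext;
   
   // ...
   actorSystem.terminate().onComplete(
       new OnComplete<Terminated>() {
           public void onComplete(Throwable failure, Terminated result) {
               if (failure != null) {
                   LOG.error("Error shutting down actor system", failure);
               }
           }
       },
       directExecutionContext());
   ```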


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid deprecated Akka methods

2018-07-28 Thread GitBox
TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205956009
 
 

 ##
 File path: 
flink-runtime/src/main/scala/org/apache/flink/runtime/jobmanager/JobManager.scala
 ##
 @@ -2288,11 +2291,10 @@ object JobManager {
 catch {
   case t: Throwable =>
 LOG.error("Error while starting up JobManager", t)
-try {
-  jobManagerSystem.shutdown()
-} catch {
-  case tt: Throwable => LOG.warn("Could not cleanly shut down actor 
system", tt)
-}
+jobManagerSystem.terminate().onComplete {
+  case scala.util.Success(_) =>
+  case scala.util.Failure(tt) => LOG.warn("Could not cleanly shut down 
actor system", tt)
+
}(org.apache.flink.runtime.concurrent.Executors.directExecutionContext())
 
 Review comment:
   Also here: could we use `import static`? In Scala, I think it would just be a plain `import`.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-9240) Avoid deprecated akka methods

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560924#comment-16560924
 ] 

ASF GitHub Bot commented on FLINK-9240:
---

TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205956054
 
 

 ##
 File path: 
flink-mesos/src/main/java/org/apache/flink/mesos/runtime/clusterframework/MesosApplicationMasterRunner.java
 ##
 @@ -374,11 +379,14 @@ protected int runPrivileged(Configuration config, 
Configuration dynamicPropertie
}
 
if (actorSystem != null) {
-   try {
-   actorSystem.shutdown();
-   } catch (Throwable tt) {
-   LOG.error("Error shutting down actor 
system", tt);
-   }
+   actorSystem.terminate().onComplete(
+   new OnComplete() {
+   public void 
onComplete(Throwable failure, Terminated result) {
+   if (failure != null) {
+   
LOG.error("Error shutting down actor system", failure);
+   }
+   }
+   }, 
org.apache.flink.runtime.concurrent.Executors.directExecutionContext());
 
 Review comment:
   also here


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Avoid deprecated akka methods
> -
>
> Key: FLINK-9240
> URL: https://issues.apache.org/jira/browse/FLINK-9240
> Project: Flink
>  Issue Type: Improvement
>  Components: Client, Local Runtime, Mesos, Tests, Web Client, YARN
>Affects Versions: 1.6.0
>Reporter: Arnout Engelen
>Priority: Minor
>  Labels: pull-request-available
>
> Several Akka functions that are widely-used in Flink have been deprecated for 
> a long time, such as ActorSystem#shutdown() and 
> ActorSystem.awaitTermination(). It would be nice to update those to use 
> non-deprecated functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9240) Avoid deprecated akka methods

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560921#comment-16560921
 ] 

ASF GitHub Bot commented on FLINK-9240:
---

TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205956013
 
 

 ##
 File path: 
flink-runtime/src/main/scala/org/apache/flink/runtime/jobmanager/JobManager.scala
 ##
 @@ -1867,7 +1867,10 @@ class JobManager(
   FiniteDuration(10, SECONDS)).start()
 
 // Shutdown and discard all queued messages
-context.system.shutdown()
+context.system.terminate().onComplete {
+  case scala.util.Success(_) =>
+  case scala.util.Failure(t) => log.warn("Could not cleanly shut down 
actor system", t)
+}(org.apache.flink.runtime.concurrent.Executors.directExecutionContext())
 
 Review comment:
   and here


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Avoid deprecated akka methods
> -
>
> Key: FLINK-9240
> URL: https://issues.apache.org/jira/browse/FLINK-9240
> Project: Flink
>  Issue Type: Improvement
>  Components: Client, Local Runtime, Mesos, Tests, Web Client, YARN
>Affects Versions: 1.6.0
>Reporter: Arnout Engelen
>Priority: Minor
>  Labels: pull-request-available
>
> Several Akka functions that are widely-used in Flink have been deprecated for 
> a long time, such as ActorSystem#shutdown() and 
> ActorSystem.awaitTermination(). It would be nice to update those to use 
> non-deprecated functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9240) Avoid deprecated akka methods

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560920#comment-16560920
 ] 

ASF GitHub Bot commented on FLINK-9240:
---

TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205955993
 
 

 ##
 File path: 
flink-yarn/src/main/java/org/apache/flink/yarn/YarnApplicationMasterRunner.java
 ##
 @@ -401,11 +406,14 @@ protected int runApplicationMaster(Configuration config) 
{
}
 
if (actorSystem != null) {
-   try {
-   actorSystem.shutdown();
-   } catch (Throwable tt) {
-   LOG.error("Error shutting down actor 
system", tt);
-   }
+   actorSystem.terminate().onComplete(
+   new OnComplete() {
+   public void 
onComplete(Throwable failure, Terminated result) {
+   if (failure != null) {
+   
LOG.error("Error shutting down actor system", failure);
+   }
+   }
+   }, 
org.apache.flink.runtime.concurrent.Executors.directExecutionContext());
 
 Review comment:
   This PR looks good to me. One small question: could this be simplified with `import static`?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Avoid deprecated akka methods
> -
>
> Key: FLINK-9240
> URL: https://issues.apache.org/jira/browse/FLINK-9240
> Project: Flink
>  Issue Type: Improvement
>  Components: Client, Local Runtime, Mesos, Tests, Web Client, YARN
>Affects Versions: 1.6.0
>Reporter: Arnout Engelen
>Priority: Minor
>  Labels: pull-request-available
>
> Several Akka functions that are widely-used in Flink have been deprecated for 
> a long time, such as ActorSystem#shutdown() and 
> ActorSystem.awaitTermination(). It would be nice to update those to use 
> non-deprecated functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid deprecated Akka methods

2018-07-28 Thread GitBox
TisonKun commented on a change in pull request #6446: [FLINK-9240] Avoid 
deprecated Akka methods
URL: https://github.com/apache/flink/pull/6446#discussion_r205956045
 
 

 ##
 File path: 
flink-runtime/src/main/java/org/apache/flink/runtime/akka/DefaultQuarantineHandler.java
 ##
 @@ -65,11 +66,13 @@ public void hasQuarantined(String remoteSystem, 
ActorSystem actorSystem) {
 
private void shutdownActorSystem(ActorSystem actorSystem) {
// shut the actor system down
-   actorSystem.shutdown();
+   actorSystem.terminate();
 
try {
// give it some time to complete the shutdown
-   actorSystem.awaitTermination(timeout);
+   Await.ready(actorSystem.whenTerminated(), timeout);
+   } catch (InterruptedException | 
java.util.concurrent.TimeoutException e) {
 
 Review comment:
   `java.util.concurrent.TimeoutException` could be simplified.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Comment Edited] (FLINK-5315) Support distinct aggregations in table api

2018-07-28 Thread Rong Rong (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-5315?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560803#comment-16560803
 ] 

Rong Rong edited comment on FLINK-5315 at 7/28/18 9:38 PM:
---

Hmm. Yes, that's true: we cannot have {{a.distinct}} by itself if {{a}} is a column expression. But if {{a}} is a table, for example {{select ( * ).distinct}}, then this is very much valid. (oops, (*) is not correct)

Yes, but I agree: the best way to express this would have been {{ agg.distinct('arg1[, 'arg2]...) }}, regardless of whether it is an expression or a function, since it is a modifier for the method. I think we can definitely do that for functions. I am actually OK with the {{'col.agg.distinct}} approach, despite a little confusion. What do you guys think, [~fhueske] [~twalthr]?


was (Author: walterddr):
Hmm. Yes that's true we cannot have {{a.distinct}} by itself if {{a}} is column 
expression. But if {{a}} is a table, for example: {{select(*).distinct}}, then 
this is very much valid. 

Yes but I agree, the best way to express this would've been {{ 
agg.distinct('arg1[, 'arg2]...) }} regardless whether this is an expression or 
function since it is a modifier for the method. I think we can definitely do 
that for function. I am ok with the {{'col.agg.distinct}} approach actually 
despite a little confusion. what do you guys think [~fhueske][~twalthr]?

> Support distinct aggregations in table api
> --
>
> Key: FLINK-5315
> URL: https://issues.apache.org/jira/browse/FLINK-5315
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API  SQL
>Reporter: Kurt Young
>Assignee: Rong Rong
>Priority: Major
>
> Such as 
> {code}
> t.select("count(distinct a), sum(b)")
> {code}
> or 
> {code}
> t.select('a.count.distinct), 'b.sum)
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9240) Avoid deprecated akka methods

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560853#comment-16560853
 ] 

ASF GitHub Bot commented on FLINK-9240:
---

chunjef opened a new pull request #6446: [FLINK-9240] Avoid deprecated Akka 
methods
URL: https://github.com/apache/flink/pull/6446
 
 
   ## What is the purpose of the change
   
   This PR replaces the Akka methods that were deprecated in Akka 2.4 with 
their respective non-deprecated alternatives. Since [Akka 2.5 is backwards 
binary compatible with 
2.4](https://doc.akka.io/docs/akka/2.5/common/binary-compatibility-rules.html) 
(except the deprecated methods, which were removed in 2.5), this change would 
allow Flink users to use Akka 2.5 without resorting to dependency shading to 
avoid classpath conflicts with Flink's use of Akka 2.4.
   
   
   ## Brief change log
   
   - Replaced the use of `ActorSystem.awaitTermination()` with 
`Await.ready(system.whenTerminated, Duration.Inf)`
   - Replaced the use of `ActorSystem.awaitTermination(timeout)` with 
`Await.ready(system.whenTerminated, timeout)`
   - Replaced the use of `ActorSystem.isTerminated` with 
`ActorSystem.whenTerminated.isCompleted`
   - Replaced the use of `ActorSystem.shutdown()` with `ActorSystem.terminate()`
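   
   Roughly, the migration pattern looks like this (a sketch only; `system` and `timeout` are placeholders):
   
   ```
   // before (deprecated since Akka 2.4, removed in Akka 2.5)
   system.shutdown();
   system.awaitTermination(timeout);
   
   // after
   system.terminate();
   Await.ready(system.whenTerminated(), timeout);
   ```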
   
   
   ## Verifying this change
   
   This change is already covered by existing tests.
   
   
   ## Does this pull request potentially affect one of the following parts:
   
 - Dependencies (does it add or upgrade a dependency): no
 - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: no
 - The serializers: no
 - The runtime per-record code paths (performance sensitive): no
 - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: yes
 - The S3 file system connector: no
   
   ## Documentation
   
 - Does this pull request introduce a new feature? no
 - If yes, how is the feature documented? not applicable
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Avoid deprecated akka methods
> -
>
> Key: FLINK-9240
> URL: https://issues.apache.org/jira/browse/FLINK-9240
> Project: Flink
>  Issue Type: Improvement
>  Components: Client, Local Runtime, Mesos, Tests, Web Client, YARN
>Affects Versions: 1.6.0
>Reporter: Arnout Engelen
>Priority: Minor
>  Labels: pull-request-available
>
> Several Akka functions that are widely-used in Flink have been deprecated for 
> a long time, such as ActorSystem#shutdown() and 
> ActorSystem.awaitTermination(). It would be nice to update those to use 
> non-deprecated functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (FLINK-9240) Avoid deprecated akka methods

2018-07-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-9240?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated FLINK-9240:
--
Labels: pull-request-available  (was: )

> Avoid deprecated akka methods
> -
>
> Key: FLINK-9240
> URL: https://issues.apache.org/jira/browse/FLINK-9240
> Project: Flink
>  Issue Type: Improvement
>  Components: Client, Local Runtime, Mesos, Tests, Web Client, YARN
>Affects Versions: 1.6.0
>Reporter: Arnout Engelen
>Priority: Minor
>  Labels: pull-request-available
>
> Several Akka functions that are widely-used in Flink have been deprecated for 
> a long time, such as ActorSystem#shutdown() and 
> ActorSystem.awaitTermination(). It would be nice to update those to use 
> non-deprecated functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] chunjef opened a new pull request #6446: [FLINK-9240] Avoid deprecated Akka methods

2018-07-28 Thread GitBox
chunjef opened a new pull request #6446: [FLINK-9240] Avoid deprecated Akka 
methods
URL: https://github.com/apache/flink/pull/6446
 
 
   ## What is the purpose of the change
   
   This PR replaces the Akka methods that were deprecated in Akka 2.4 with 
their respective non-deprecated alternatives. Since [Akka 2.5 is backwards 
binary compatible with 
2.4](https://doc.akka.io/docs/akka/2.5/common/binary-compatibility-rules.html) 
(except the deprecated methods, which were removed in 2.5), this change would 
allow Flink users to use Akka 2.5 without resorting to dependency shading to 
avoid classpath conflicts with Flink's use of Akka 2.4.
   
   
   ## Brief change log
   
   - Replaced the use of `ActorSystem.awaitTermination()` with 
`Await.ready(system.whenTerminated, Duration.Inf)`
   - Replaced the use of `ActorSystem.awaitTermination(timeout)` with 
`Await.ready(system.whenTerminated, timeout)`
   - Replaced the use of `ActorSystem.isTerminated` with 
`ActorSystem.whenTerminated.isCompleted`
   - Replaced the use of `ActorSystem.shutdown()` with `ActorSystem.terminate()`
   
   
   ## Verifying this change
   
   This change is already covered by existing tests.
   
   
   ## Does this pull request potentially affect one of the following parts:
   
 - Dependencies (does it add or upgrade a dependency): no
 - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: no
 - The serializers: no
 - The runtime per-record code paths (performance sensitive): no
 - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: yes
 - The S3 file system connector: no
   
   ## Documentation
   
 - Does this pull request introduce a new feature? no
 - If yes, how is the feature documented? not applicable
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-5315) Support distinct aggregations in table api

2018-07-28 Thread Rong Rong (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-5315?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560803#comment-16560803
 ] 

Rong Rong commented on FLINK-5315:
--

Hmm. Yes that's true we cannot have {{a.distinct}} by itself if {{a}} is column 
expression. But if {{a}} is a table, for example: {{select(*).distinct}}, then 
this is very much valid. 

Yes but I agree, the best way to express this would've been {{ 
agg.distinct('arg1[, 'arg2]...) }} regardless whether this is an expression or 
function since it is a modifier for the method. I think we can definitely do 
that for function. I am ok with the {{'col.agg.distinct}} approach actually 
despite a little confusion. what do you guys think [~fhueske][~twalthr]?

> Support distinct aggregations in table api
> --
>
> Key: FLINK-5315
> URL: https://issues.apache.org/jira/browse/FLINK-5315
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API  SQL
>Reporter: Kurt Young
>Assignee: Rong Rong
>Priority: Major
>
> Such as 
> {code}
> t.select("count(distinct a), sum(b)")
> {code}
> or 
> {code}
> t.select('a.count.distinct), 'b.sum)
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9984) Add a byte array table format factory

2018-07-28 Thread xueyu (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9984?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560779#comment-16560779
 ] 

xueyu commented on FLINK-9984:
--

I would like to work on this issue. It looks like I can't change the assignee to myself; could you please assign it to me, [~twalthr]? Thanks.

> Add a byte array table format factory
> -
>
> Key: FLINK-9984
> URL: https://issues.apache.org/jira/browse/FLINK-9984
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API  SQL
>Reporter: Timo Walther
>Priority: Major
>
> Sometimes it might be useful to just read or write a plain byte array into 
> Kafka or other connectors. We should add a simple byte array 
> SerializationSchemaFactory and DeserializationSchemaFactory.
> See {{org.apache.flink.formats.json.JsonRowFormatFactory}} for a reference.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9825) Upgrade checkstyle version to 8.6

2018-07-28 Thread dalongliu (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560778#comment-16560778
 ] 

dalongliu commented on FLINK-9825:
--

[~Zentol], [~mingleizhang], thank you very much. I will try my best to contribute to Flink.

> Upgrade checkstyle version to 8.6
> -
>
> Key: FLINK-9825
> URL: https://issues.apache.org/jira/browse/FLINK-9825
> Project: Flink
>  Issue Type: Improvement
>  Components: Build System
>Reporter: Ted Yu
>Assignee: dalongliu
>Priority: Minor
>
> We should upgrade checkstyle version to 8.6+ so that we can use the "match 
> violation message to this regex" feature for suppression. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-5315) Support distinct aggregations in table api

2018-07-28 Thread Hequn Cheng (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-5315?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560764#comment-16560764
 ] 

Hequn Cheng commented on FLINK-5315:


[~walterddr] Hi, I think the truth is we can never write an {{a.distinct}} expression. It is not valid, because no aggregation returns multiple rows. It is only valid to do {{a.count}}. And the difference between {{a.agg.distinct}} and {{a.agg}} is whether it is a distinct aggregation. I think the semantics are very clear. What do you think? :)

> Support distinct aggregations in table api
> --
>
> Key: FLINK-5315
> URL: https://issues.apache.org/jira/browse/FLINK-5315
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API  SQL
>Reporter: Kurt Young
>Assignee: Rong Rong
>Priority: Major
>
> Such as 
> {code}
> t.select("count(distinct a), sum(b)")
> {code}
> or 
> {code}
> t.select('a.count.distinct), 'b.sum)
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560751#comment-16560751
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

yanghua commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR 
function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205942899
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns string contains a character which converts from a ASCII integer.
+* If the ASCII less then 0 or greater than 255, return null.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   @hequn8128  maybe we do not need to add more functions to support the `byte/short/int/long` types. I just tried changing the specific `Integer` to `Long` for this function, then tested:
   
   ```
   testAllApis(
 'f36.chr(),
 "f36.chr()",
 "CHR(f36)",
 "A")
   
   testAllApis(
 'f36.chr(),
 "f36.chr()",
 "CHR(CAST(f36 AS SMALLINT))",
 "A")
   
   testAllApis(
 'f36.chr(),
 "f36.chr()",
 "CHR(CAST(f36 AS TINYINT))",
 "A")
   
   testAllApis(
 'f36.chr(),
 "f36.chr()",
 "CHR(CAST(f36 AS BIGINT))",
 "A")
   ```
   
   all tests passed


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API  SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> for ASCII function : 
> refer to : 
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> for CHR function : 
> This function convert ASCII code to a character,
> refer to : [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Considering "CHAR" always is a keyword in many database, so we use "CHR" 
> keyword.
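> As a quick reference, the expected behaviour in plain Java (a sketch of the semantics only, not the Table API code path):
> {code:java}
> public class AsciiChrSketch {
>   public static void main(String[] args) {
>     System.out.println((int) "foobar".charAt(0)); // ASCII: numeric value of the leftmost character -> 102
>     System.out.println((char) 65);                // CHR: the character for an ASCII code -> A
>   }
> }
> {code}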



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] yanghua commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR function for table/sql API

2018-07-28 Thread GitBox
yanghua commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR 
function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205942899
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns string contains a character which converts from a ASCII integer.
+* If the ASCII less then 0 or greater than 255, return null.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   @hequn8128  maybe we do not need to add more functions to support the `byte/short/int/long` types. I just tried changing the specific `Integer` to `Long` for this function, then tested:
   
   ```
   testAllApis(
 'f36.chr(),
 "f36.chr()",
 "CHR(f36)",
 "A")
   
   testAllApis(
 'f36.chr(),
 "f36.chr()",
 "CHR(CAST(f36 AS SMALLINT))",
 "A")
   
   testAllApis(
 'f36.chr(),
 "f36.chr()",
 "CHR(CAST(f36 AS TINYINT))",
 "A")
   
   testAllApis(
 'f36.chr(),
 "f36.chr()",
 "CHR(CAST(f36 AS BIGINT))",
 "A")
   ```
   
   all tests passed


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread Hequn Cheng (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hequn Cheng updated FLINK-9970:
---
Issue Type: Sub-task  (was: New Feature)
Parent: FLINK-6810

> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API  SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> for ASCII function : 
> refer to : 
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii]
> for CHR function : 
> This function convert ASCII code to a character,
> refer to : [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html]
> Considering "CHAR" always is a keyword in many database, so we use "CHR" 
> keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-8901) YARN application name for Flink (per-job) submissions claims it is using only 1 TaskManager

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-8901?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560739#comment-16560739
 ] 

ASF GitHub Bot commented on FLINK-8901:
---

GJL commented on issue #5754: [FLINK-8901] [yarn] Set proper Yarn application 
name
URL: https://github.com/apache/flink/pull/5754#issuecomment-408604547
 
 
   @arukavytsia Can you open a new issue in JIRA? FLINK-8901 is already 
resolved.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> YARN application name for Flink (per-job) submissions claims it is using only 
> 1 TaskManager
> ---
>
> Key: FLINK-8901
> URL: https://issues.apache.org/jira/browse/FLINK-8901
> Project: Flink
>  Issue Type: Bug
>  Components: YARN
>Affects Versions: 1.5.0, 1.6.0
>Reporter: Nico Kruber
>Assignee: Till Rohrmann
>Priority: Blocker
>  Labels: flip-6, pull-request-available
> Fix For: 1.5.0
>
>
> If (with FLIP-6) a per-job YARN session is created without specifying the 
> number of nodes, it will show up as "Flink session with 1 TaskManagers", e.g. 
> this job:
> {code}
> ./bin/flink run -m yarn-cluster -yjm 768 -ytm 3072 -ys 2 -p 20 -c 
> org.apache.flink.streaming.examples.wordcount.WordCount 
> ./examples/streaming/WordCount.jar --input /usr/share/doc/rsync-3.0.6/COPYING
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] GJL commented on issue #5754: [FLINK-8901] [yarn] Set proper Yarn application name

2018-07-28 Thread GitBox
GJL commented on issue #5754: [FLINK-8901] [yarn] Set proper Yarn application 
name
URL: https://github.com/apache/flink/pull/5754#issuecomment-408604547
 
 
   @arukavytsia Can you open a new issue in JIRA? FLINK-8901 is already 
resolved.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560683#comment-16560683
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205938013
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns string contains a character which converts from a ASCII integer.
+* If the ASCII less then 0 or greater than 255, return null.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   @yanghua Sorry, I didn't make that clear.
   Yes, what I mean is to add multiple functions. You probably have to make the following changes:
   1. add different functions in `ScalarFunctions.scala`.
   2. add more `addSqlFunctionMethod` calls in `FunctionGenerator`. 
   ```
 addSqlFunctionMethod(
   CHR,
   Seq(SHORT_TYPE_INFO),
   STRING_TYPE_INFO,
   BuiltInMethods.CHR
 )
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: New Feature
>  Components: Table API  SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, 
> refer to 
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii].
> For the CHR function, 
> which converts an ASCII code to a character,
> refer to [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html].
> Considering that "CHAR" is a keyword in many databases, we use the "CHR" 
> keyword.
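
As a hedged illustration of the intended semantics (the `tableEnv` environment and the "MyTable" name below are placeholders for this example only):
```scala
// ASCII returns the code of the leftmost character; CHR maps a code back to a one-character string.
val asciiOfA   = tableEnv.sqlQuery("SELECT ASCII('Apache') FROM MyTable") // 65, the code of 'A'
val charFrom65 = tableEnv.sqlQuery("SELECT CHR(65) FROM MyTable")         // "A"
val outOfRange = tableEnv.sqlQuery("SELECT CHR(300) FROM MyTable")        // NULL, since 300 > 255
```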



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR function for table/sql API

2018-07-28 Thread GitBox
hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205938013
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   @yanghua Sorry that I haven't made it clear.
   Yes, what I mean is adding multiple functions. You probably have to make the 
following changes:
   1. Add different functions in `ScalarFunctions.scala`.
   2. Add further `addSqlFunctionMethod` registrations in `FunctionGenerator`, for example: 
   ```
 addSqlFunctionMethod(
   CHR,
   Seq(SHORT_TYPE_INFO),
   STRING_TYPE_INFO,
   BuiltInMethods.CHR
 )
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560672#comment-16560672
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205938013
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   @yanghua Sorry that I haven't made it clear.
   Yes, what I mean is adding multiple functions. You probably have to make the 
following changes:
   1. Add different functions in `ScalarFunctions.scala`.
   2. Add further `addSqlFunctionMethod` registrations in `FunctionGenerator`, for example: 
   ```
 addSqlFunctionMethod(
   CHR,
   Seq(LONG_TYPE_INFO),
   STRING_TYPE_INFO,
   BuiltInMethods.CHR
 )
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: New Feature
>  Components: Table API  SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, 
> refer to 
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii].
> For the CHR function, 
> which converts an ASCII code to a character,
> refer to [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html].
> Considering that "CHAR" is a keyword in many databases, we use the "CHR" 
> keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR function for table/sql API

2018-07-28 Thread GitBox
hequn8128 commented on a change in pull request #6432: [FLINK-9970] Add 
ASCII/CHR function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205938013
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   @yanghua Sorry that I haven't made it clear.
   Yes, what I mean is adding multiple functions. You probably have to make the 
following changes:
   1. Add different functions in `ScalarFunctions.scala`.
   2. Add further `addSqlFunctionMethod` registrations in `FunctionGenerator`, for example: 
   ```
 addSqlFunctionMethod(
   CHR,
   Seq(LONG_TYPE_INFO),
   STRING_TYPE_INFO,
   BuiltInMethods.CHR
 )
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-8302) Support shift_left and shift_right in TableAPI

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-8302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560671#comment-16560671
 ] 

ASF GitHub Bot commented on FLINK-8302:
---

xueyumusic opened a new pull request #6445: [FLINK-8302][Table API & SQL] Add 
SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445
 
 
   ## What is the purpose of the change
   This PR is based on the previous closed and unfinished work 
https://github.com/apache/flink/pull/5202; it adds shift left and shift right as 
core scalar operators in the code generation.
   
   The Table API syntax is 21.shiftRight(1) and 21.shiftLeft(1).
   The SQL syntax is SHIFT_RIGHT(21, 1).
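   As a hedged illustration of this syntax (the import, the `tableEnv` environment and the "MyTable" name are assumptions for this example, not part of the PR):
   ```scala
   import org.apache.flink.table.api.scala._  // brings the expressionDsl implicits into scope

   // Table API: shift operators available directly on integer expressions.
   val viaTableApi = tableEnv
     .scan("MyTable")
     .select(21.shiftLeft(1), 21.shiftRight(1))  // 21 << 1 = 42, 21 >> 1 = 10

   // SQL: the built-in function form quoted above.
   val viaSql = tableEnv.sqlQuery("SELECT SHIFT_RIGHT(21, 1) FROM MyTable")  // 10
   ```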
   
   ## Brief change log
 - *FunctionCatalog, expressionDsl and SHIFT_RIGHT(21,1)*
 - *CodeGenerator*
   
   ## Verifying this change
   
   This change added tests in ScalarOperatorsTest
   
   ## Does this pull request potentially affect one of the following parts:
   
 - Dependencies (does it add or upgrade a dependency): ( no)
 - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: (no)
 - The serializers: (no)
 - The runtime per-record code paths (performance sensitive): (no)
 - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
 - The S3 file system connector: (no)
   
   ## Documentation
   
 - Does this pull request introduce a new feature? (yes)
 - If yes, how is the feature documented? (docs)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Support shift_left and shift_right in TableAPI
> --
>
> Key: FLINK-8302
> URL: https://issues.apache.org/jira/browse/FLINK-8302
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API  SQL
>Reporter: DuBin
>Priority: Major
>  Labels: features, pull-request-available
> Fix For: 1.6.0
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> Add shift_left and shift_right support in the Table API: shift_left(input, n) acts 
> as input << n, and shift_right(input, n) acts as input >> n.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (FLINK-8302) Support shift_left and shift_right in TableAPI

2018-07-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-8302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated FLINK-8302:
--
Labels: features pull-request-available  (was: features)

> Support shift_left and shift_right in TableAPI
> --
>
> Key: FLINK-8302
> URL: https://issues.apache.org/jira/browse/FLINK-8302
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API  SQL
>Reporter: DuBin
>Priority: Major
>  Labels: features, pull-request-available
> Fix For: 1.6.0
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> Add shift_left and shift_right support in the Table API: shift_left(input, n) acts 
> as input << n, and shift_right(input, n) acts as input >> n.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] xueyumusic opened a new pull request #6445: [FLINK-8302][Table API & SQL] Add SHIFT_LEFT and SHIFT_RIGHT

2018-07-28 Thread GitBox
xueyumusic opened a new pull request #6445: [FLINK-8302][Table API & SQL] Add 
SHIFT_LEFT and SHIFT_RIGHT
URL: https://github.com/apache/flink/pull/6445
 
 
   ## What is the purpose of the change
   This PR is based on the previous closed and unfinished work 
https://github.com/apache/flink/pull/5202; it adds shift left and shift right as 
core scalar operators in the code generation.
   
   The Table API syntax is 21.shiftRight(1) and 21.shiftLeft(1).
   The SQL syntax is SHIFT_RIGHT(21, 1).
   
   ## Brief change log
 - *FunctionCatalog, expressionDsl and SHIFT_RIGHT(21,1)*
 - *CodeGenerator*
   
   ## Verifying this change
   
   This change added tests in ScalarOperatorsTest
   
   ## Does this pull request potentially affect one of the following parts:
   
 - Dependencies (does it add or upgrade a dependency): ( no)
 - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: (no)
 - The serializers: (no)
 - The runtime per-record code paths (performance sensitive): (no)
 - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
 - The S3 file system connector: (no)
   
   ## Documentation
   
 - Does this pull request introduce a new feature? (yes)
 - If yes, how is the feature documented? (docs)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (FLINK-6810) Add Some built-in Scalar Function supported

2018-07-28 Thread Hequn Cheng (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-6810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560665#comment-16560665
 ] 

Hequn Cheng commented on FLINK-6810:


Hi all, I added instructions on how to contribute a built-in scalar 
function to this Jira description. Hope it helps when you are going to make a 
contribution.

> Add Some built-in Scalar Function supported
> ---
>
> Key: FLINK-6810
> URL: https://issues.apache.org/jira/browse/FLINK-6810
> Project: Flink
>  Issue Type: New Feature
>  Components: Table API  SQL
>Affects Versions: 1.4.0
>Reporter: sunjincheng
>Assignee: sunjincheng
>Priority: Major
>  Labels: starter
>
> In this JIRA we will create sub-tasks for adding specific scalar functions, 
> such as the mathematical function {{LOG}}, date functions such as
>  {{DATEADD}}, and string functions such as {{LPAD}}. 
> *How to contribute a built-in scalar function*
> Thank you very much for contributing a built-in function. To make sure your 
> contribution goes in a good direction, it is recommended to read the 
> following instructions.
> # Research the behavior of the function you are going to contribute in 
> major DBMSs. This is very important since we have to understand the exact 
> semantics of the function.
> # It is recommended to add the function to both the SQL and Table APIs.
> # Every scalar function should add Table API docs in 
> {{./docs/dev/table/tableApi.md#built-in-functions}} and SQL docs in 
> {{./docs/dev/table/sql.md#built-in-functions}}. When adding docs for the 
> Table API, add both the Scala and the Java docs. Make sure your description of 
> the function is accurate. Please do not copy documentation from other 
> projects, especially if those projects are not Apache licensed.
> # Take overflow, NullPointerException and other exceptional cases into consideration.
> Welcome anybody to add sub-tasks for standard database scalar functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (FLINK-6810) Add Some built-in Scalar Function supported

2018-07-28 Thread Hequn Cheng (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-6810?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hequn Cheng updated FLINK-6810:
---
Description: 
In this JIRA we will create sub-tasks for adding specific scalar functions, such 
as the mathematical function {{LOG}}, date functions such as
 {{DATEADD}}, and string functions such as {{LPAD}}. 

*How to contribute a built-in scalar function*
Thank you very much for contributing a built-in function. To make sure your 
contribution goes in a good direction, it is recommended to read the 
following instructions.
# Research the behavior of the function you are going to contribute in major 
DBMSs. This is very important since we have to understand the exact 
semantics of the function.
# It is recommended to add the function to both the SQL and Table APIs.
# Every scalar function should add Table API docs in 
{{./docs/dev/table/tableApi.md#built-in-functions}} and SQL docs in 
{{./docs/dev/table/sql.md#built-in-functions}}. When adding docs for the Table 
API, add both the Scala and the Java docs. Make sure your description of the 
function is accurate. Please do not copy documentation from other projects, 
especially if those projects are not Apache licensed.
# Take overflow, NullPointerException and other exceptional cases into consideration.

Welcome anybody to add sub-tasks for standard database scalar functions.




  was:
In this JIRA, will create some sub-task for add specific scalar function, such 
as mathematical-function {{LOG}}, date-functions
 {{DATEADD}},string-functions {{LPAD}}, etc. 

I think is good way to let SQL work, and then add TableAPI to supported. So I 
suggest one scalar function create two sub-task, one is for SQL. another for 
TableAPI.

*Note:*
Every scalar function should add TableAPI doc in  
{{./docs/dev/table/tableApi.md#built-in-functions}}. 
Add SQL doc in {{./docs/dev/table/sql.md#built-in-functions}}.

Welcome anybody to add the sub-task about standard database scalar function.





> Add Some built-in Scalar Function supported
> ---
>
> Key: FLINK-6810
> URL: https://issues.apache.org/jira/browse/FLINK-6810
> Project: Flink
>  Issue Type: New Feature
>  Components: Table API  SQL
>Affects Versions: 1.4.0
>Reporter: sunjincheng
>Assignee: sunjincheng
>Priority: Major
>  Labels: starter
>
> In this JIRA we will create sub-tasks for adding specific scalar functions, 
> such as the mathematical function {{LOG}}, date functions such as
>  {{DATEADD}}, and string functions such as {{LPAD}}. 
> *How to contribute a built-in scalar function*
> Thank you very much for contributing a built-in function. To make sure your 
> contribution goes in a good direction, it is recommended to read the 
> following instructions.
> # Research the behavior of the function you are going to contribute in 
> major DBMSs. This is very important since we have to understand the exact 
> semantics of the function.
> # It is recommended to add the function to both the SQL and Table APIs.
> # Every scalar function should add Table API docs in 
> {{./docs/dev/table/tableApi.md#built-in-functions}} and SQL docs in 
> {{./docs/dev/table/sql.md#built-in-functions}}. When adding docs for the 
> Table API, add both the Scala and the Java docs. Make sure your description of 
> the function is accurate. Please do not copy documentation from other 
> projects, especially if those projects are not Apache licensed.
> # Take overflow, NullPointerException and other exceptional cases into consideration.
> Welcome anybody to add sub-tasks for standard database scalar functions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9970) Add ASCII/CHR function for table/sql API

2018-07-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-9970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16560643#comment-16560643
 ] 

ASF GitHub Bot commented on FLINK-9970:
---

yanghua commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR 
function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205934831
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   hi @hequn8128 I have accepted and fixed all the other points except this one. 
Can you give a hint? Do you mean:
   
   * one function whose parameter supports multiple types? If so, I could use 
Either[Type1, Type2], but how do I handle more than two types? I am not very 
familiar with Scala.
   
   * multiple functions, each supporting a single type?
   
   * assuming these types are supported, can `OperandTypes.NUMERIC` represent 
them?
   
   ```scala
   val CHR = new SqlFunction(
     "CHR",
     SqlKind.OTHER_FUNCTION,
     ReturnTypes.cascade(
       ReturnTypes.explicit(SqlTypeName.VARCHAR),
       SqlTypeTransforms.TO_NULLABLE),
     null,
     OperandTypes.NUMERIC,
     SqlFunctionCategory.STRING
   )
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ASCII/CHR function for table/sql API
> 
>
> Key: FLINK-9970
> URL: https://issues.apache.org/jira/browse/FLINK-9970
> Project: Flink
>  Issue Type: New Feature
>  Components: Table API  SQL
>Reporter: vinoyang
>Assignee: vinoyang
>Priority: Minor
>  Labels: pull-request-available
>
> For the ASCII function, 
> refer to 
> [https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ascii].
> For the CHR function, 
> which converts an ASCII code to a character,
> refer to [https://doc.ispirer.com/sqlways/Output/SQLWays-1-071.html].
> Considering that "CHAR" is a keyword in many databases, we use the "CHR" 
> keyword.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] yanghua commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR function for table/sql API

2018-07-28 Thread GitBox
yanghua commented on a change in pull request #6432: [FLINK-9970] Add ASCII/CHR 
function for table/sql API
URL: https://github.com/apache/flink/pull/6432#discussion_r205934831
 
 

 ##
 File path: 
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/runtime/functions/ScalarFunctions.scala
 ##
 @@ -196,9 +196,35 @@ object ScalarFunctions {
 new String(data)
   }
 
+  /**
+* Returns the numeric value of the leftmost character of the string str.
+* Returns 0 if str is the empty string. Returns NULL if str is NULL.
+*/
+  def ascii(str: String): String = {
+if (str == null) {
+  null
+} else if (str.equals("")) {
+  ""
+} else {
+  str.charAt(0).toByte.toString
+}
+  }
+
   /**
 * Returns the base string decoded with base64.
 */
   def fromBase64(str: String): String = new String(Base64.decodeBase64(str))
 
+  /**
+* Returns the string containing the character converted from the given ASCII integer.
+* Returns null if the ASCII value is less than 0 or greater than 255.
+*/
+  def chr(ascii: Integer): String = {
 
 Review comment:
   hi @hequn8128 I have accepted and fixed all the other points except this one. 
Can you give a hint? Do you mean:
   
   * one function whose parameter supports multiple types? If so, I could use 
Either[Type1, Type2], but how do I handle more than two types? I am not very 
familiar with Scala.
   
   * multiple functions, each supporting a single type?
   
   * assuming these types are supported, can `OperandTypes.NUMERIC` represent 
them?
   
   ```scala
   val CHR = new SqlFunction(
     "CHR",
     SqlKind.OTHER_FUNCTION,
     ReturnTypes.cascade(
       ReturnTypes.explicit(SqlTypeName.VARCHAR),
       SqlTypeTransforms.TO_NULLABLE),
     null,
     OperandTypes.NUMERIC,
     SqlFunctionCategory.STRING
   )
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services