[jira] [Commented] (FLINK-4609) Remove redundant check for null in CrossOperator

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15481091#comment-15481091
 ] 

ASF GitHub Bot commented on FLINK-4609:
---

Github user apivovarov commented on a diff in the pull request:

https://github.com/apache/flink/pull/2490#discussion_r78288200
  
--- Diff: 
flink-java/src/main/java/org/apache/flink/api/java/operators/CrossOperator.java 
---
@@ -129,14 +129,11 @@ private String getDefaultName() {
 
public DefaultCross(DataSet input1, DataSet input2, CrossHint hint, String defaultName) {

-   super(input1, input2, new DefaultCrossFunction(),
+   super(Preconditions.checkNotNull(input1, "input1 is null"),
--- End diff --

I also added a null check to TwoInputOperator, and removed the `input1` and 
`input2` fields from DefaultCross because TwoInputOperator already has them.


> Remove redundant check for null in CrossOperator
> 
>
> Key: FLINK-4609
> URL: https://issues.apache.org/jira/browse/FLINK-4609
> Project: Flink
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Trivial
>
> CrossOperator checks input1 and input2 for null after they were dereferenced



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (FLINK-4611) Make "AUTO" credential provider as default for Kinesis Connector

2016-09-10 Thread Tzu-Li (Gordon) Tai (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-4611?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tzu-Li (Gordon) Tai updated FLINK-4611:
---
Priority: Major  (was: Minor)

> Make "AUTO" credential provider as default for Kinesis Connector
> 
>
> Key: FLINK-4611
> URL: https://issues.apache.org/jira/browse/FLINK-4611
> Project: Flink
>  Issue Type: Improvement
>  Components: Kinesis Connector
>Reporter: Tzu-Li (Gordon) Tai
> Fix For: 1.2.0
>
>
> Right now, the Kinesis Consumer / Producer by default expects the access key 
> id and secret access key to be given directly in the config properties.
> This isn't good practice for accessing AWS services; Kinesis users are most 
> likely running their Flink applications on AWS instances with embedded 
> credentials that can be accessed via the default credential provider chain. 
> Therefore, it makes sense to change the default 
> {{AWS_CREDENTIALS_PROVIDER}} to {{AUTO}} instead of {{BASIC}}.
> To avoid breaking user code, directly supplied AWS credentials are only used 
> if both the access key and secret key are given through {{AWS_ACCESS_KEY}} and 
> {{AWS_SECRET_KEY}}. Otherwise, the default credential provider chain is used.
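The fallback rule described above can be sketched as a small helper. The class and property names below are illustrative stand-ins, not the connector's actual API:

```java
import java.util.Properties;

public class CredentialsResolver {
    // Property names here are illustrative, not the connector's actual keys.
    static final String ACCESS_KEY = "aws.access.key";
    static final String SECRET_KEY = "aws.secret.key";

    // Use the supplied key pair only when BOTH keys are present ("BASIC");
    // otherwise fall back to the default credential provider chain ("AUTO").
    static String resolveMode(Properties config) {
        boolean hasBoth = config.getProperty(ACCESS_KEY) != null
                && config.getProperty(SECRET_KEY) != null;
        return hasBoth ? "BASIC" : "AUTO";
    }

    public static void main(String[] args) {
        Properties both = new Properties();
        both.setProperty(ACCESS_KEY, "id");
        both.setProperty(SECRET_KEY, "secret");
        System.out.println(resolveMode(both));             // BASIC
        System.out.println(resolveMode(new Properties())); // AUTO
    }
}
```

This keeps existing configurations with both keys working unchanged, while anything else silently picks up the provider chain.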



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (FLINK-4611) Make "AUTO" credential provider as default for Kinesis Connector

2016-09-10 Thread Tzu-Li (Gordon) Tai (JIRA)
Tzu-Li (Gordon) Tai created FLINK-4611:
--

 Summary: Make "AUTO" credential provider as default for Kinesis 
Connector
 Key: FLINK-4611
 URL: https://issues.apache.org/jira/browse/FLINK-4611
 Project: Flink
  Issue Type: Improvement
  Components: Kinesis Connector
Reporter: Tzu-Li (Gordon) Tai
Priority: Minor
 Fix For: 1.2.0


Right now, the Kinesis Consumer / Producer by default expects the access key 
id and secret access key to be given directly in the config properties.

This isn't good practice for accessing AWS services; Kinesis users are most 
likely running their Flink applications on AWS instances with embedded 
credentials that can be accessed via the default credential provider chain. 
Therefore, it makes sense to change the default 
{{AWS_CREDENTIALS_PROVIDER}} to {{AUTO}} instead of {{BASIC}}.

To avoid breaking user code, directly supplied AWS credentials are only used 
if both the access key and secret key are given through {{AWS_ACCESS_KEY}} and 
{{AWS_SECRET_KEY}}. Otherwise, the default credential provider chain is used.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4602) Move RocksDB backed to proper package

2016-09-10 Thread Tzu-Li (Gordon) Tai (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4602?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15481024#comment-15481024
 ] 

Tzu-Li (Gordon) Tai commented on FLINK-4602:


Does this also mean we will be moving the RocksDB state backend out of 
{{flink-contrib}} and into a separate new module, perhaps {{flink-statebackends}}, 
with a structure like the Flink connectors?

> Move RocksDB backed to proper package
> -
>
> Key: FLINK-4602
> URL: https://issues.apache.org/jira/browse/FLINK-4602
> Project: Flink
>  Issue Type: Sub-task
>Reporter: Aljoscha Krettek
> Fix For: 2.0.0
>
>
> Right now the package is {{org.apache.flink.contrib.streaming.state}}, it 
> should probably be in {{org.apache.flink.runtime.state.rocksdb}}.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] flink pull request #2490: [FLINK-4609] Remove redundant check for null in Cr...

2016-09-10 Thread apivovarov
Github user apivovarov commented on a diff in the pull request:

https://github.com/apache/flink/pull/2490#discussion_r78287335
  
--- Diff: 
flink-java/src/main/java/org/apache/flink/api/java/operators/CrossOperator.java 
---
@@ -129,14 +129,11 @@ private String getDefaultName() {
 
public DefaultCross(DataSet input1, DataSet input2, CrossHint hint, String defaultName) {

-   super(input1, input2, new DefaultCrossFunction(),
+   super(Preconditions.checkNotNull(input1, "input1 is null"),
--- End diff --

DefaultCross calls `input1.getType()` and `input2.getType()` before calling 
super() on line 134. So, if we add the null check to the super class (e.g. 
TwoInputOperator), it will not work for DefaultCross.
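The reason an inline check works is Java's evaluation order: the arguments of `super(...)` are evaluated before any superclass constructor body runs. A minimal sketch with simplified stand-in classes (not Flink's actual types):

```java
public class SuperCallOrder {
    static class TwoInput {
        TwoInput(Object in1, Object in2) {
            // A null check placed here runs only AFTER the subclass has
            // already evaluated its super(...) arguments -- too late to
            // protect a dereference like in1.getType() made in those arguments.
            if (in1 == null || in2 == null) throw new NullPointerException();
        }
    }

    static class DefaultCross extends TwoInput {
        DefaultCross(Object in1, Object in2) {
            // checkNotNull evaluates inline, so an NPE with a useful message
            // fires before in1 would ever be dereferenced.
            super(checkNotNull(in1, "input1 is null"),
                  checkNotNull(in2, "input2 is null"));
        }
    }

    // Simplified stand-in for org.apache.flink.util.Preconditions.checkNotNull
    static <T> T checkNotNull(T ref, String msg) {
        if (ref == null) throw new NullPointerException(msg);
        return ref;
    }

    public static void main(String[] args) {
        try {
            new DefaultCross(null, new Object());
        } catch (NullPointerException e) {
            System.out.println(e.getMessage()); // input1 is null
        }
    }
}
```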


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (FLINK-4609) Remove redundant check for null in CrossOperator

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15480755#comment-15480755
 ] 

ASF GitHub Bot commented on FLINK-4609:
---

Github user greghogan commented on a diff in the pull request:

https://github.com/apache/flink/pull/2490#discussion_r78286377
  
--- Diff: 
flink-java/src/main/java/org/apache/flink/api/java/operators/CrossOperator.java 
---
@@ -129,14 +129,11 @@ private String getDefaultName() {
 
public DefaultCross(DataSet input1, DataSet input2, CrossHint hint, String defaultName) {

-   super(input1, input2, new DefaultCrossFunction(),
+   super(Preconditions.checkNotNull(input1, "input1 is null"),
--- End diff --

Can we do the preconditions check in `TwoInputOperator` rather than the 
subclasses?


> Remove redundant check for null in CrossOperator
> 
>
> Key: FLINK-4609
> URL: https://issues.apache.org/jira/browse/FLINK-4609
> Project: Flink
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Trivial
>
> CrossOperator checks input1 and input2 for null after they were dereferenced



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4609) Remove redundant check for null in CrossOperator

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15480602#comment-15480602
 ] 

ASF GitHub Bot commented on FLINK-4609:
---

Github user apivovarov commented on a diff in the pull request:

https://github.com/apache/flink/pull/2490#discussion_r78285322
  
--- Diff: 
flink-java/src/main/java/org/apache/flink/api/java/operators/CrossOperator.java 
---
@@ -133,10 +133,6 @@ public DefaultCross(DataSet input1, DataSet input2, CrossHint hint, Stri
new TupleTypeInfo>(input1.getType(), input2.getType()),
hint, defaultName);
 
-   if (input1 == null || input2 == null) {
-   throw new NullPointerException();
-   }
-
--- End diff --

OK, I added a null check for input1 and input2 with a message, inline with the 
super call in DefaultCross.


> Remove redundant check for null in CrossOperator
> 
>
> Key: FLINK-4609
> URL: https://issues.apache.org/jira/browse/FLINK-4609
> Project: Flink
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Trivial
>
> CrossOperator checks input1 and input2 for null after they were dereferenced



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4609) Remove redundant check for null in CrossOperator

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15480599#comment-15480599
 ] 

ASF GitHub Bot commented on FLINK-4609:
---

Github user apivovarov commented on the issue:

https://github.com/apache/flink/pull/2490
  
OK, I added a null check for input1 and input2 with a message before calling 
super in DefaultCross.



> Remove redundant check for null in CrossOperator
> 
>
> Key: FLINK-4609
> URL: https://issues.apache.org/jira/browse/FLINK-4609
> Project: Flink
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Trivial
>
> CrossOperator checks input1 and input2 for null after they were dereferenced



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4610) Replace keySet/getValue with entrySet in UdfAnalyzerUtils

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4610?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15480465#comment-15480465
 ] 

ASF GitHub Bot commented on FLINK-4610:
---

Github user apivovarov commented on the issue:

https://github.com/apache/flink/pull/2491
  
@twalthr Can you look at this PR?


> Replace keySet/getValue with entrySet in UdfAnalyzerUtils
> -
>
> Key: FLINK-4610
> URL: https://issues.apache.org/jira/browse/FLINK-4610
> Project: Flink
>  Issue Type: Improvement
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Minor
>
> The Map.keySet + getValue combination is not very efficient.
> It can be replaced with Map.entrySet to avoid the value lookups.
> Also, Map.Entry allows modifying the entry's value while iterating through the 
> map using entrySet.iterator
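The difference can be sketched in plain Java (a minimal illustration, not UdfAnalyzerUtils itself):

```java
import java.util.HashMap;
import java.util.Map;

public class EntrySetDemo {
    // Multiply every value in place. entrySet() yields key and value together
    // (no extra get() per key), and Entry.setValue() is the supported way to
    // update a mapping while iterating.
    static void scaleValues(Map<String, Integer> counts, int factor) {
        for (Map.Entry<String, Integer> e : counts.entrySet()) {
            e.setValue(e.getValue() * factor);
        }
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = new HashMap<>();
        counts.put("a", 1);
        counts.put("b", 2);

        // keySet + get: one redundant hash lookup per key
        int sum = 0;
        for (String k : counts.keySet()) {
            sum += counts.get(k); // extra lookup the entrySet loop avoids
        }
        System.out.println(sum); // 3

        scaleValues(counts, 10);
        System.out.println(counts.get("a")); // 10
    }
}
```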



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4609) Remove redundant check for null in CrossOperator

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15480437#comment-15480437
 ] 

ASF GitHub Bot commented on FLINK-4609:
---

Github user zentol commented on a diff in the pull request:

https://github.com/apache/flink/pull/2490#discussion_r78283727
  
--- Diff: 
flink-java/src/main/java/org/apache/flink/api/java/operators/CrossOperator.java 
---
@@ -133,10 +133,6 @@ public DefaultCross(DataSet input1, DataSet input2, CrossHint hint, Stri
new TupleTypeInfo>(input1.getType(), input2.getType()),
hint, defaultName);
 
-   if (input1 == null || input2 == null) {
-   throw new NullPointerException();
-   }
-
--- End diff --

Yes, but if you used the preconditions check you could supply a useful 
error message, for example stating which input was actually null.


> Remove redundant check for null in CrossOperator
> 
>
> Key: FLINK-4609
> URL: https://issues.apache.org/jira/browse/FLINK-4609
> Project: Flink
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Trivial
>
> CrossOperator checks input1 and input2 for null after they were dereferenced



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4609) Remove redundant check for null in CrossOperator

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15480384#comment-15480384
 ] 

ASF GitHub Bot commented on FLINK-4609:
---

Github user apivovarov commented on a diff in the pull request:

https://github.com/apache/flink/pull/2490#discussion_r78283317
  
--- Diff: 
flink-java/src/main/java/org/apache/flink/api/java/operators/CrossOperator.java 
---
@@ -133,10 +133,6 @@ public DefaultCross(DataSet input1, DataSet input2, CrossHint hint, Stri
new TupleTypeInfo>(input1.getType(), input2.getType()),
hint, defaultName);
 
-   if (input1 == null || input2 == null) {
-   throw new NullPointerException();
-   }
-
--- End diff --

If input1 and/or input2 are null, CrossOperator will throw an NPE on line 133.

I also added a null check to TwoInputOperator.



> Remove redundant check for null in CrossOperator
> 
>
> Key: FLINK-4609
> URL: https://issues.apache.org/jira/browse/FLINK-4609
> Project: Flink
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Trivial
>
> CrossOperator checks input1 and input2 for null after they were dereferenced



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4148) incorrect calculation distance in QuadTree

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4148?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15479369#comment-15479369
 ] 

ASF GitHub Bot commented on FLINK-4148:
---

Github user xhumanoid commented on the issue:

https://github.com/apache/flink/pull/2442
  
The build slave was killed without any notification:

ERROR: Maven JVM terminated unexpectedly with exit code 137
Putting comment on the pull request
Finished: FAILURE


> incorrect calculation distance in QuadTree
> --
>
> Key: FLINK-4148
> URL: https://issues.apache.org/jira/browse/FLINK-4148
> Project: Flink
>  Issue Type: Bug
>Reporter: Alexey Diomin
>Priority: Trivial
> Attachments: 
> 0001-FLINK-4148-incorrect-calculation-minDist-distance-in.patch
>
>
> https://github.com/apache/flink/blob/master/flink-libraries/flink-ml/src/main/scala/org/apache/flink/ml/nn/QuadTree.scala#L105
> Because EuclideanDistanceMetric extends SquaredEuclideanDistanceMetric, we 
> always fall into the first case and never reach the case that applies 
> math.sqrt(minDist).
> The correct order is to match EuclideanDistanceMetric first and 
> SquaredEuclideanDistanceMetric after it.
> P.S. Because EuclideanDistanceMetric is more compute-expensive and is the 
> default DistanceMetric, this can cause some performance degradation for KNN 
> with default parameters.
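The pitfall generalizes beyond Scala pattern matching: a type test against a superclass also matches every subclass, so the more specific test must come first. A Java sketch of the same shape (class names shortened, logic simplified):

```java
public class MatchOrderDemo {
    static class SquaredEuclidean { }
    static class Euclidean extends SquaredEuclidean { }

    // BUG: the superclass is tested first, so a Euclidean metric always
    // falls into the squared branch and Math.sqrt is never applied.
    static double distBuggy(SquaredEuclidean metric, double squared) {
        if (metric instanceof SquaredEuclidean) return squared;
        if (metric instanceof Euclidean) return Math.sqrt(squared); // never reached
        return squared;
    }

    // FIX: test the more specific subclass first.
    static double distFixed(SquaredEuclidean metric, double squared) {
        if (metric instanceof Euclidean) return Math.sqrt(squared);
        return squared;
    }

    public static void main(String[] args) {
        System.out.println(distBuggy(new Euclidean(), 16.0)); // 16.0
        System.out.println(distFixed(new Euclidean(), 16.0)); // 4.0
    }
}
```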



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] flink pull request #2490: [FLINK-4609] Remove redundant check for null in Cr...

2016-09-10 Thread zentol
Github user zentol commented on a diff in the pull request:

https://github.com/apache/flink/pull/2490#discussion_r78273103
  
--- Diff: 
flink-java/src/main/java/org/apache/flink/api/java/operators/CrossOperator.java 
---
@@ -133,10 +133,6 @@ public DefaultCross(DataSet input1, DataSet input2, CrossHint hint, Stri
new TupleTypeInfo>(input1.getType(), input2.getType()),
hint, defaultName);
 
-   if (input1 == null || input2 == null) {
-   throw new NullPointerException();
-   }
-
--- End diff --

We could instead move the null check into the `super()` call using 
`Preconditions.checkNotNull()`.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (FLINK-4608) Use short-circuit AND in Max/Min AggregationFunction

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4608?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15479354#comment-15479354
 ] 

ASF GitHub Bot commented on FLINK-4608:
---

Github user zentol commented on the issue:

https://github.com/apache/flink/pull/2489
  
+1 to merge


> Use short-circuit AND in Max/Min AggregationFunction
> 
>
> Key: FLINK-4608
> URL: https://issues.apache.org/jira/browse/FLINK-4608
> Project: Flink
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Trivial
>
> Max/Min AggregationFunction use & instead of &&. Short-circuit logic is the 
> usual choice for boolean conditions in Java.
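A minimal illustration of the difference: with `&` both operands are always evaluated, while `&&` skips the right side once the left side is false:

```java
public class ShortCircuitDemo {
    static int calls = 0;

    // Side-effecting operand so we can observe whether it was evaluated.
    static boolean touched() {
        calls++;
        return true;
    }

    public static void main(String[] args) {
        calls = 0;
        boolean a = false & touched();  // non-short-circuit: right side still evaluated
        System.out.println(calls);      // 1

        calls = 0;
        boolean b = false && touched(); // short-circuit: right side skipped
        System.out.println(calls);      // 0
    }
}
```

Besides skipping needless work, `&&` also guards patterns like `x != null && x.isValid()`, which `&` would break.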



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4607) Close FileInputStream in ParameterTool and other

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15479351#comment-15479351
 ] 

ASF GitHub Bot commented on FLINK-4607:
---

Github user zentol commented on the issue:

https://github.com/apache/flink/pull/2488
  
+1 to merge


> Close FileInputStream in ParameterTool and other
> 
>
> Key: FLINK-4607
> URL: https://issues.apache.org/jira/browse/FLINK-4607
> Project: Flink
>  Issue Type: Bug
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Trivial
>
> ParameterTool and some tests do not close FileInputStream
> {code}
> flink-core/src/test/java/org/apache/flink/core/fs/local/LocalFileSystemTest.java
> flink-java/src/main/java/org/apache/flink/api/java/utils/ParameterTool.java
> flink-java/src/test/java/org/apache/flink/api/java/utils/ParameterToolTest.java
> flink-java8/src/test/java/org/apache/flink/runtime/util/JarFileCreatorLambdaTest.java
> flink-runtime/src/test/java/org/apache/flink/runtime/util/JarFileCreatorTest.java
> {code}
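A sketch of the fix using try-with-resources, which guarantees the stream is closed even if load() throws (an illustration, not the actual ParameterTool code):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class CloseStreamDemo {
    // Load a properties file; the try-with-resources block closes the
    // FileInputStream automatically, even when load() throws.
    static Properties load(Path file) throws IOException {
        Properties props = new Properties();
        try (InputStream in = new FileInputStream(file.toFile())) {
            props.load(in);
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".properties");
        Files.write(tmp, "key=value".getBytes());
        System.out.println(load(tmp).getProperty("key")); // value
        Files.delete(tmp);
    }
}
```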



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4610) Replace keySet/getValue with entrySet in UdfAnalyzerUtils

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4610?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15479259#comment-15479259
 ] 

ASF GitHub Bot commented on FLINK-4610:
---

GitHub user apivovarov opened a pull request:

https://github.com/apache/flink/pull/2491

[FLINK-4610] Replace keySet/getValue with entrySet in UdfAnalyzerUtils

https://issues.apache.org/jira/browse/FLINK-4610

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/apivovarov/flink FLINK-4610

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/2491.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2491


commit 7440989c09fae5325dbb3cebf0cf9d10f59dcbdd
Author: Alexander Pivovarov 
Date:   2016-09-10T06:10:12Z

[FLINK-4610] Replace keySet/getValue with entrySet in UdfAnalyzerUtils




> Replace keySet/getValue with entrySet in UdfAnalyzerUtils
> -
>
> Key: FLINK-4610
> URL: https://issues.apache.org/jira/browse/FLINK-4610
> Project: Flink
>  Issue Type: Improvement
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Minor
>
> The Map.keySet + getValue combination is not very efficient.
> It can be replaced with Map.entrySet to avoid the value lookups.
> Also, Map.Entry allows modifying the entry's value while iterating through the 
> map using entrySet.iterator



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (FLINK-4610) Replace keySet/getValue with entrySet in UdfAnalyzerUtils

2016-09-10 Thread Alexander Pivovarov (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-4610?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Pivovarov updated FLINK-4610:
---
Description: 
The Map.keySet + getValue combination is not very efficient.
It can be replaced with Map.entrySet to avoid the extra value lookup.

Also, Map.Entry allows modifying the entry value while iterating through
the map using the entrySet iterator.

  was:
The Map.keySet + getValue combination is not very efficient.
It can be replaced with Map.entrySet to avoid the extra value lookup.

Also, Map.Entry allows modifying Map values while iterating through the
map using the entrySet iterator.


> Replace keySet/getValue with entrySet in UdfAnalyzerUtils
> -
>
> Key: FLINK-4610
> URL: https://issues.apache.org/jira/browse/FLINK-4610
> Project: Flink
>  Issue Type: Improvement
>  Components: Java API
>Affects Versions: 1.1.2
>Reporter: Alexander Pivovarov
>Priority: Minor
>
> The Map.keySet + getValue combination is not very efficient.
> It can be replaced with Map.entrySet to avoid the extra value lookup.
> Also, Map.Entry allows modifying the entry value while iterating through
> the map using the entrySet iterator.





[jira] [Created] (FLINK-4610) Replace keySet/getValue with entrySet in UdfAnalyzerUtils

2016-09-10 Thread Alexander Pivovarov (JIRA)
Alexander Pivovarov created FLINK-4610:
--

 Summary: Replace keySet/getValue with entrySet in UdfAnalyzerUtils
 Key: FLINK-4610
 URL: https://issues.apache.org/jira/browse/FLINK-4610
 Project: Flink
  Issue Type: Improvement
  Components: Java API
Affects Versions: 1.1.2
Reporter: Alexander Pivovarov
Priority: Minor


The Map.keySet + getValue combination is not very efficient.
It can be replaced with Map.entrySet to avoid the extra value lookup.

Also, Map.Entry allows modifying Map values while iterating through the
map using the entrySet iterator.





[GitHub] flink issue #2487: [FLINK-4520][flink-siddhi] Integrate Siddhi as a light-we...

2016-09-10 Thread haoch
Github user haoch commented on the issue:

https://github.com/apache/flink/pull/2487
  
@apivovarov thanks very much for the comments. I have formatted all the code as 
required. Please kindly continue the review.




[jira] [Commented] (FLINK-4520) Integrate Siddhi as a lightweight CEP Library

2016-09-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4520?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15479245#comment-15479245
 ] 

ASF GitHub Bot commented on FLINK-4520:
---

Github user haoch commented on the issue:

https://github.com/apache/flink/pull/2487
  
@apivovarov thanks very much for the comments. I have formatted all the code as 
required. Please kindly continue the review.


> Integrate Siddhi as a lightweight CEP Library
> -
>
> Key: FLINK-4520
> URL: https://issues.apache.org/jira/browse/FLINK-4520
> Project: Flink
>  Issue Type: New Feature
>  Components: CEP
>Affects Versions: 1.2.0
>Reporter: Hao Chen
>  Labels: cep, library, patch-available
> Fix For: 1.2.0
>
>
> h1. flink-siddhi proposal
> h2. Abstraction
> Siddhi CEP is a lightweight and easy-to-use open-source Complex Event 
> Processing (CEP) engine, released as a Java library under the `Apache Software 
> License v2.0`. Siddhi CEP processes events generated by various 
> event sources, analyses them, and notifies about matching complex events 
> according to user-specified queries. 
> It would be very helpful for Flink users (especially streaming application 
> developers) to provide a library that runs Siddhi CEP queries directly in Flink 
> streaming applications.
> * http://wso2.com/products/complex-event-processor/
> * https://github.com/wso2/siddhi
> h2. Features
> * Integrate Siddhi CEP as a stream operator (i.e. 
> `TupleStreamSiddhiOperator`), supporting rich CEP features such as
> * Filter
> * Join
> * Aggregation
> * Group by
> * Having
> * Window
> * Conditions and Expressions
> * Pattern processing
> * Sequence processing
> * Event Tables
> ...
> * Provide an easy-to-use Siddhi CEP API to integrate with the Flink DataStream 
> API (see `SiddhiCEP` and `SiddhiStream`)
> * Register a Flink DataStream, associating native type information with a 
> Siddhi Stream Schema, supporting POJOs, Tuples, primitive types, etc.
> * Connect one or more Flink DataStreams with a Siddhi CEP 
> execution plan
> * Return the output stream as a DataStream with its type intelligently inferred 
> from the Siddhi Stream Schema
> * Integrate Siddhi runtime state management with Flink state (see 
> `AbstractSiddhiOperator`)
> * Support Siddhi plugin management to extend CEP functions (see 
> `SiddhiCEP#registerExtension`)
> h2. Test Cases 
> * org.apache.flink.contrib.siddhi.SiddhiCEPITCase: 
> https://github.com/haoch/flink/blob/FLINK-4520/flink-contrib/flink-siddhi/src/test/java/org/apache/flink/contrib/siddhi/SiddhiCEPITCase.java
> h2. Example
> {code}
> StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
> SiddhiCEP cep = SiddhiCEP.getSiddhiEnvironment(env);
> cep.registerExtension("custom:plus", CustomPlusFunctionExtension.class);
> cep.registerStream("inputStream1", input1, "id", "name", "price", "timestamp");
> cep.registerStream("inputStream2", input2, "id", "name", "price", "timestamp");
> DataStream> output = cep
>   .from("inputStream1").union("inputStream2")
>   .sql(
>     "from every s1 = inputStream1[id == 2] "
>     + " -> s2 = inputStream2[id == 3] "
>     + "select s1.id as id_1, s1.name as name_1, s2.id as id_2, "
>     + "s2.name as name_2, custom:plus(s1.price, s2.price) as price "
>     + "insert into outputStream"
>   )
>   .returns("outputStream");
> env.execute();
> {code}


