[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-03-08 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/flink/pull/3338




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-20 Thread haohui
Github user haohui commented on a diff in the pull request:

https://github.com/apache/flink/pull/3338#discussion_r102125031
  
--- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
@@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
   "DataSetCalc",
   batchTableNode(0),
   term("select",
-"13 AS _c0",
+"CAST(13) AS _c0",
--- End diff --

Just played around a little bit. I think the problem is that the advanced 
types are not properly canonized. With the following diff, all tests in 
`ArrayTypeTest` pass:

```
--- a/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala
+++ b/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala
@@ -133,12 +133,18 @@ class FlinkTypeFactory(typeSystem: RelDataTypeSystem) extends JavaTypeFactoryImpl
   override def createTypeWithNullability(
     relDataType: RelDataType,
     nullable: Boolean)
-  : RelDataType = relDataType match {
-    case composite: CompositeRelDataType =>
-      // at the moment we do not care about nullability
-      composite
-    case _ =>
-      super.createTypeWithNullability(relDataType, nullable)
+  : RelDataType = {
+    val t = relDataType match {
+      case composite: CompositeRelDataType =>
+        // at the moment we do not care about nullability
+        composite
+      case array: ArrayRelDataType =>
+        val elementType = canonize(createTypeWithNullability(array.getComponentType, nullable))
+        new ArrayRelDataType(array.typeInfo, elementType, nullable)
+      case _ =>
+        super.createTypeWithNullability(relDataType, nullable)
+    }
+    canonize(t)
   }
 }
```
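
For context on why canonization matters: below is a minimal standalone sketch (plain Calcite APIs, not the PR's code; the object name `CanonizeSketch` is made up) showing that the type factory interns the types it creates, so repeated requests for an equal type return the identical instance and planner code can compare types by identity. Types constructed with `new` and returned without passing through `canonize` bypass that cache, which is what the diff above works around for array types.

```scala
import org.apache.calcite.jdbc.JavaTypeFactoryImpl
import org.apache.calcite.rel.`type`.RelDataTypeSystem
import org.apache.calcite.sql.`type`.SqlTypeName

object CanonizeSketch {
  def main(args: Array[String]): Unit = {
    val typeFactory = new JavaTypeFactoryImpl(RelDataTypeSystem.DEFAULT)
    val intType = typeFactory.createSqlType(SqlTypeName.INTEGER)

    // Both calls build an equal array type; the factory canonizes them,
    // so the very same instance is expected to come back twice.
    val a = typeFactory.createArrayType(intType, -1L)
    val b = typeFactory.createArrayType(intType, -1L)
    println(a eq b) // expected: true
  }
}
```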

GroupWindowTest is still failing as it misses an identity projection. I'm 
wondering why `ProjectRemoveRule.INSTANCE` did not kick in...
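
As a side note on the last point, one way to check in isolation whether the extra projection is trivial and removable is to run just that rule through a `HepPlanner`. A hedged sketch (assuming a `RelNode` plan is already at hand; this is not Flink's actual rule-set wiring):

```scala
import org.apache.calcite.plan.hep.{HepPlanner, HepProgramBuilder}
import org.apache.calcite.rel.RelNode
import org.apache.calcite.rel.rules.ProjectRemoveRule

// Hypothetical debugging helper: apply only ProjectRemoveRule to a plan
// and return the result, so one can see whether the identity projection
// survives on its own.
def removeTrivialProjects(plan: RelNode): RelNode = {
  val program = new HepProgramBuilder()
    .addRuleInstance(ProjectRemoveRule.INSTANCE)
    .build()
  val planner = new HepPlanner(program)
  planner.setRoot(plan)
  planner.findBestExp()
}
```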




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-20 Thread haohui
Github user haohui commented on a diff in the pull request:

https://github.com/apache/flink/pull/3338#discussion_r102117236
  
--- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
@@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
   "DataSetCalc",
   batchTableNode(0),
   term("select",
-"13 AS _c0",
+"CAST(13) AS _c0",
--- End diff --

Is it possible not to change the default nullability while adopting 
Calcite 1.11? Let me try it out as well.




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-20 Thread twalthr
Github user twalthr commented on a diff in the pull request:

https://github.com/apache/flink/pull/3338#discussion_r102046284
  
--- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
@@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
   "DataSetCalc",
   batchTableNode(0),
   term("select",
-"13 AS _c0",
+"CAST(13) AS _c0",
--- End diff --

I will have a look at it again. In general, the only real solution is to 
finally fix FLINK-5177.




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-20 Thread wuchong
Github user wuchong commented on a diff in the pull request:

https://github.com/apache/flink/pull/3338#discussion_r102040448
  
--- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
@@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
   "DataSetCalc",
   batchTableNode(0),
   term("select",
-"13 AS _c0",
+"CAST(13) AS _c0",
--- End diff --

Yes. It is because I changed the default nullability to `true`, but the reduced 
constant is `NOT NULL`, so a `CAST` is inserted here. Do you have any ideas how to fix this?

I changed the default nullability to `true` because 
`UserDefinedScalarFunctionTest.testResults` and 
`ArrayTypeTest.testArrayLiterals` fail otherwise.
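
For readers of the archive, a small standalone sketch (plain Calcite APIs, not the PR's code; the object name `NullabilitySketch` is made up) of why nullability alone produces casts: nullability is part of a Calcite type, so a `NOT NULL` reduced literal no longer matches the nullable type the expression is now declared with, and a `CAST` is kept to reconcile the two.

```scala
import org.apache.calcite.jdbc.JavaTypeFactoryImpl
import org.apache.calcite.rel.`type`.RelDataTypeSystem
import org.apache.calcite.sql.`type`.SqlTypeName

object NullabilitySketch {
  def main(args: Array[String]): Unit = {
    val typeFactory = new JavaTypeFactoryImpl(RelDataTypeSystem.DEFAULT)
    // createSqlType yields a NOT NULL type; the nullable variant is a distinct type.
    val notNullInt = typeFactory.createSqlType(SqlTypeName.INTEGER)
    val nullableInt = typeFactory.createTypeWithNullability(notNullInt, true)

    println(notNullInt == nullableInt)     // false: nullability is part of the type
    println(notNullInt.getFullTypeString)  // e.g. "INTEGER NOT NULL"
    println(nullableInt.getFullTypeString) // e.g. "INTEGER"
  }
}
```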




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-20 Thread twalthr
Github user twalthr commented on a diff in the pull request:

https://github.com/apache/flink/pull/3338#discussion_r101977305
  
--- Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/ProjectionTranslator.scala ---
@@ -108,62 +109,73 @@ object ProjectionTranslator {
       tableEnv: TableEnvironment,
       aggNames: Map[Expression, String],
       propNames: Map[Expression, String]): Seq[NamedExpression] = {
-    exprs.map(replaceAggregationsAndProperties(_, tableEnv, aggNames, propNames))
-      .map(UnresolvedAlias)
-  }
 
-  private def replaceAggregationsAndProperties(
+    val projectNames: mutable.HashSet[String] = new mutable.HashSet[String]
+
+    def replaceAggregationsAndProperties(
--- End diff --

Can you rename this or the outer function? Having two functions with the 
same names is confusing.




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-20 Thread twalthr
Github user twalthr commented on a diff in the pull request:

https://github.com/apache/flink/pull/3338#discussion_r101977849
  
--- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
@@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
   "DataSetCalc",
   batchTableNode(0),
   term("select",
-"13 AS _c0",
+"CAST(13) AS _c0",
--- End diff --

Do you know why there are so many unnecessary casts? Is it because of the 
different nullability?




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-17 Thread wuchong
Github user wuchong commented on a diff in the pull request:

https://github.com/apache/flink/pull/3338#discussion_r101750708
  
--- Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala ---
@@ -106,8 +106,16 @@ class ExpressionReducer(config: TableConfig)
     case SqlTypeName.ANY | SqlTypeName.ROW | SqlTypeName.ARRAY =>
       reducedValues.add(unreduced)
     case _ =>
+      val reducedValue = reduced.getField(reducedIdx)
+      // RexBuilder handle boolean literal incorrectly, convert it into BigDecimal manually
--- End diff --

Yes, it should be `double`
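
To make the conversion being discussed concrete, a minimal hypothetical helper (names and structure are illustrative, not the PR's exact code): the reducer gets the constant back as a plain Java value, while Calcite's `RexBuilder`/`RexLiteral` represent numeric literals as `java.math.BigDecimal`, so approximate values have to be converted before the literal is rebuilt.

```scala
import java.math.BigDecimal

// Hypothetical helper: convert a reduced runtime value into the representation
// RexBuilder expects for numeric literals. Doubles/Floats go through their
// string form to avoid binary floating-point artifacts; NaN/Infinity would
// need separate handling; everything else is passed through unchanged.
def toLiteralValue(reducedValue: Any): Any = reducedValue match {
  case d: java.lang.Double => new BigDecimal(d.toString)
  case f: java.lang.Float  => new BigDecimal(f.toString)
  case other               => other
}
```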




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-17 Thread fhueske
Github user fhueske commented on a diff in the pull request:

https://github.com/apache/flink/pull/3338#discussion_r101728056
  
--- Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala ---
@@ -106,8 +106,16 @@ class ExpressionReducer(config: TableConfig)
     case SqlTypeName.ANY | SqlTypeName.ROW | SqlTypeName.ARRAY =>
       reducedValues.add(unreduced)
    case _ =>
+      val reducedValue = reduced.getField(reducedIdx)
+      // RexBuilder handle boolean literal incorrectly, convert it into BigDecimal manually
--- End diff --

Can you check the comment? Shouldn't "boolean" be "double"?




[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

2017-02-17 Thread wuchong
GitHub user wuchong opened a pull request:

https://github.com/apache/flink/pull/3338

[FLINK-5414] [table] Bump up Calcite version to 1.11

Thanks for contributing to Apache Flink. Before you open your pull request, 
please take the following check list into consideration.
If your changes take all of the items into account, feel free to open your 
pull request. For more information and/or questions please refer to the [How To 
Contribute guide](http://flink.apache.org/how-to-contribute.html).
In addition to going through the list, please provide a meaningful 
description of your changes.

- [ ] General
  - The pull request references the related JIRA issue ("[FLINK-XXX] Jira 
title text")
  - The pull request addresses only one issue
  - Each commit in the PR has a meaningful commit message (including the 
JIRA id)

- [ ] Documentation
  - Documentation has been added for new functionality
  - Old documentation affected by the pull request has been updated
  - JavaDoc for public methods has been added

- [ ] Tests & Build
  - Functionality added by the pull request is covered by tests
  - `mvn clean verify` has been executed successfully locally or a Travis 
build has passed


This PR upgrades Calcite to version 1.11. There were a lot of 
compatibility issues, which I fixed. Please correct me if the way of fixing them is wrong.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wuchong/flink calcite-FLINK-5414

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/3338.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3338


commit d3d4c914325c1591c1bd5b0e5081df020cda0507
Author: Jark Wu 
Date:   2017-02-17T05:17:43Z

[FLINK-5414] [table] Bump up Calcite version to 1.11



