[jira] [Commented] (FLINK-3656) Rework Table API tests

2017-11-16 Thread Fabian Hueske (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16255075#comment-16255075
 ] 

Fabian Hueske commented on FLINK-3656:
--

There's still a bit of duplication, but I agree. We can close this parent issue 
for now.

> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The {{flink-table}} component consists of several APIs:
>   * Scala-embedded Table API
>   * String-based Table API (for Java)
>   * SQL 
> and compiles to two execution backends:
>   * DataStream API
>   * DataSet API
> There are many different translation paths involved until a query is executed:
>   # Table API String -> Table API logical plan
>   # Table API Scala-expressions -> Table API logical plan
>   # Table API logical plan -> Calcite RelNode plans
>   # SQL -> Calcite RelNode plans (done exclusively via Calcite)
>   # Calcite RelNodes -> DataSet RelNodes
>   # DataSet RelNodes -> DataSet program
>   # Calcite RelNodes -> DataStream RelNodes
>   # DataStream RelNodes -> DataStream program
>   # Calcite RexNode expressions -> generated code
> all of which need to be thoroughly tested.
> Initially, many tests were done as end-to-end integration tests with high 
> overhead.
> However, due to the combinations of APIs and execution backends, this 
> approach causes many redundant tests and long build times.
> Therefore, I propose the following testing scheme:
> 1. Table API String -> Table API expression: 
> The String-based Table API is tested by comparing the resulting logical plan 
> (Table.logicalPlan) to the logical plan of an equivalent Table program that 
> uses the Scala-embedded syntax. The logical plan is the Table API internal 
> representation which is later converted into a Calcite RelNode plan.
> All existing integration tests that check the "Java" Table API should be 
> ported to unit tests. This will also remove duplicated tests, because the Java 
> Table API is currently tested for both batch and streaming, which is no longer 
> necessary.
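> A minimal sketch of such a plan comparison (hypothetical test; {{batchTestUtil}} 
> and the assertion style are assumptions borrowed from the unit test utilities 
> discussed in this thread):
> {code}
> @Test
> def testSelectEquivalence(): Unit = {
>   val util = batchTestUtil()
>   val t = util.addTable[(Int, Long, String)]("T", 'a, 'b, 'c)
>
>   // the same query, once via the String-based API and once Scala-embedded
>   val stringBased = t.select("a, b as d")
>   val scalaBased = t.select('a, 'b as 'd)
>
>   // both variants must produce the identical Table API logical plan
>   assertEquals(scalaBased.logicalPlan, stringBased.logicalPlan)
> }
> {code}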
> 2. Table API Scala-expressions -> Table API logical plan -> Calcite RelNodes 
> -> DataSet RelNodes / DataStream RelNodes
> These tests cover the translation and optimization of Table API queries and 
> verify the Calcite optimized plan. We need distinct tests for DataSet and 
> DataStream environments since features and translation rules vary. These tests 
> will also identify whether added or modified rules or cost functions result in 
> different plans. These should be the main tests for the Table API and should be 
> very extensive.
> These tests should be implemented by extending the {{TableTestBase}}, which is 
> a base class for unit tests and hence very lightweight.
> 3. SQL -> Calcite RelNodes -> DataSet RelNodes / DataStream RelNodes
> These are the same tests as described for 2. (Table API Scala-expressions -> 
> DataSet / DataStream RelNodes) but just for SQL.
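> A sketch of the SQL variant (hypothetical; {{verifySql}} and the exact node and 
> term names are illustrative and follow the style of the {{TableTestBase}} 
> utilities):
> {code}
> @Test
> def testSqlJoin(): Unit = {
>   val util = batchTestUtil()
>   util.addTable[(Int, Long, String)]("A", 'a, 'b, 'c)
>   util.addTable[(Int, Long, String)]("B", 'd, 'e, 'f)
>
>   // the optimized Calcite plan of the SQL query is compared
>   // to the expected DataSet RelNode tree
>   val sql = "SELECT c, f FROM A, B WHERE b = e"
>
>   val expected = unaryNode(
>     "DataSetCalc",
>     binaryNode(
>       "DataSetJoin",
>       batchTableNode(0),
>       batchTableNode(1),
>       term("where", "=(b, e)"),
>       term("join", "a", "b", "c", "d", "e", "f")
>     ),
>     term("select", "c", "f")
>   )
>
>   util.verifySql(sql, expected)
> }
> {code}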
> 4. DataSet RelNode -> DataSet program
> Unfortunately, the DataSet API lacks a good mechanism to test generated 
> programs, i.e., obtain a traversable plan of all operators with access to all 
> user-defined functions. Until such a testing utility is available, I propose 
> to test the translation to DataSet programs as end-to-end integration tests. 
> However, I think we can run most tests on a Collection ExecutionEnvironment, 
> which does not start a Flink cluster but runs all code on Java collections. 
> This makes these tests much more lightweight than cluster-based ITCases. The 
> goal of these tests should be to cover all translation paths from DataSetRel 
> to DataSet program, i.e., all DataSetRel nodes and their translation logic. 
> These tests should be implemented by extending the 
> {{TableProgramsCollectionTestBase}} (see FLINK-5268).
> Moreover, we should have very few cluster-based ITCases in place that check 
> the execution path with the actual operators, serializers, and comparators. 
> However, we should limit these tests to the minimum to keep build time low. 
> These tests should be implemented by extending the 
> {{TableProgramsClusterTestBase}} (FLINK-5268) and all be located in the same 
> class to avoid repeated instantiation of the Flink MiniCluster.
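> A sketch of such a collection-based ITCase (hypothetical; the test base from 
> FLINK-5268 and its {{config}} helper are assumed, imports omitted for brevity):
> {code}
> class CalcITCase extends TableProgramsCollectionTestBase {
>
>   @Test
>   def testSimpleSelect(): Unit = {
>     val env = ExecutionEnvironment.getExecutionEnvironment
>     val tEnv = TableEnvironment.getTableEnvironment(env, config)
>
>     val t = env.fromElements((1, "Hi"), (2, "Hello")).toTable(tEnv, 'a, 'b)
>     val results = t.select('b).toDataSet[Row].collect()
>
>     // executed on Java collections, no Flink MiniCluster is started
>     TestBaseUtils.compareResultAsText(results.asJava, "Hi\nHello")
>   }
> }
> {code}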
> 5. DataStream RelNode -> DataStream program
> Here basically the same applies as for the DataSet programs. I'm not aware of 
> a good way to test generated DataStream programs without executing them. A 
> testing utility would be great for all libraries that are built on top of the 
> API. Until then, I propose to use end-to-end integration tests. 
> Unfortunately, the DataStream API does not feature a collection execution 
> mode, so 

[jira] [Commented] (FLINK-3656) Rework Table API tests

2017-11-16 Thread Timo Walther (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16255058#comment-16255058
 ] 

Timo Walther commented on FLINK-3656:
-

[~fhueske] I think we can close this issue. What do you think? Most of the 
problems that we had in the past have been addressed in FLINK-6617. 

> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The {{flink-table}} component consists of several APIs:
>   * Scala-embedded Table API
>   * String-based Table API (for Java)
>   * SQL 
> and compiles to two execution backends:
>   * DataStream API
>   * DataSet API
> There are many different translation paths involved until a query is executed:
>   # Table API String -> Table API logical plan
>   # Table API Scala-expressions -> Table API logical plan
>   # Table API logical plan -> Calcite RelNode plans
>   # SQL -> Calcite RelNode plans (done exclusively via Calcite)
>   # Calcite RelNodes -> DataSet RelNodes
>   # DataSet RelNodes -> DataSet program
>   # Calcite RelNodes -> DataStream RelNodes
>   # DataStream RelNodes -> DataStream program
>   # Calcite RexNode expressions -> generated code
> all of which need to be thoroughly tested.
> Initially, many tests were done as end-to-end integration tests with high 
> overhead.
> However, due to the combinations of APIs and execution backends, this 
> approach causes many redundant tests and long build times.
> Therefore, I propose the following testing scheme:
> 1. Table API String -> Table API expression: 
> The String-based Table API is tested by comparing the resulting logical plan 
> (Table.logicalPlan) to the logical plan of an equivalent Table program that 
> uses the Scala-embedded syntax. The logical plan is the Table API internal 
> representation which is later converted into a Calcite RelNode plan.
> All existing integration tests that check the "Java" Table API should be 
> ported to unit tests. This will also remove duplicated tests, because the Java 
> Table API is currently tested for both batch and streaming, which is no longer 
> necessary.
> 2. Table API Scala-expressions -> Table API logical plan -> Calcite RelNodes 
> -> DataSet RelNodes / DataStream RelNodes
> These tests cover the translation and optimization of Table API queries and 
> verify the Calcite optimized plan. We need distinct tests for DataSet and 
> DataStream environments since features and translation rules vary. These tests 
> will also identify whether added or modified rules or cost functions result in 
> different plans. These should be the main tests for the Table API and should be 
> very extensive.
> These tests should be implemented by extending the {{TableTestBase}}, which is 
> a base class for unit tests and hence very lightweight.
> 3. SQL -> Calcite RelNodes -> DataSet RelNodes / DataStream RelNodes
> These are the same tests as described for 2. (Table API Scala-expressions -> 
> DataSet / DataStream RelNodes) but just for SQL.
> 4. DataSet RelNode -> DataSet program
> Unfortunately, the DataSet API lacks a good mechanism to test generated 
> programs, i.e., obtain a traversable plan of all operators with access to all 
> user-defined functions. Until such a testing utility is available, I propose 
> to test the translation to DataSet programs as end-to-end integration tests. 
> However, I think we can run most tests on a Collection ExecutionEnvironment, 
> which does not start a Flink cluster but runs all code on Java collections. 
> This makes these tests much more lightweight than cluster-based ITCases. The 
> goal of these tests should be to cover all translation paths from DataSetRel 
> to DataSet program, i.e., all DataSetRel nodes and their translation logic. 
> These tests should be implemented by extending the 
> {{TableProgramsCollectionTestBase}} (see FLINK-5268).
> Moreover, we should have very few cluster-based ITCases in place that check 
> the execution path with the actual operators, serializers, and comparators. 
> However, we should limit these tests to the minimum to keep build time low. 
> These tests should be implemented by extending the 
> {{TableProgramsClusterTestBase}} (FLINK-5268) and all be located in the same 
> class to avoid repeated instantiation of the Flink MiniCluster.
> 5. DataStream RelNode -> DataStream program
> Here basically the same applies as for the DataSet programs. I'm not aware of 
> a good way to test generated DataStream programs without executing them. A 
> testing utility would be great for all libraries that are built on top of the 
> API. Until then, I propose to use end-to-end integration tests. 
> Unfortunately, the DataStream API 

[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-10-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15563130#comment-15563130
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

Github user asfgit closed the pull request at:

https://github.com/apache/flink/pull/2595


> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-10-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15562200#comment-15562200
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

Github user fhueske commented on the issue:

https://github.com/apache/flink/pull/2595
  
Merging


> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-10-10 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15561907#comment-15561907
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

Github user fhueske commented on the issue:

https://github.com/apache/flink/pull/2595
  
Looks good @twalthr.
+1 to merge


> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-10-05 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15548219#comment-15548219
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

GitHub user twalthr opened a pull request:

https://github.com/apache/flink/pull/2595

[FLINK-3656] [table] Test base for logical unit testing

This PR introduces a test base for logical unit testing. It could replace 
most of the current ITCases, as it checks whether the API is correctly translated 
into `DataSet`/`DataStream` RelNodes. The translation to DataSet/DataStream 
operators should be tested separately by unit tests for those (e.g. 
`DataSetCalcTest`, `DataSetJoinTest`, etc.).

Here is an example of how a JoinITCase could be converted:
```scala
@Test
def testJoin(): Unit = {
  val util = batchTestUtil()
  val in1 = util.addTable[(Int, Long, String)]("Left", 'a, 'b, 'c)
  val in2 = util.addTable[(Int, Long, Int, String, Long)]("Right", 'd, 'e, 'f, 'g, 'h)

  val table = in1.join(in2).where("b === e").select("c, g")

  val expected = unaryNode(
    "DataSetCalc",
    binaryNode(
      "DataSetJoin",
      batchTableNode(0),
      batchTableNode(1),
      term("where", "=(b, e)"),
      term("join", "a", "b", "c", "d", "e", "f", "g", "h")
    ),
    term("joinType", "Join")
  )

  util.verifyTable(table, expected)
}
```



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/twalthr/flink FLINK-3656_UNIT_TEST_UTILS

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/2595.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2595


commit c384628cd1f48f993973bb2e831720a98d55ca16
Author: twalthr 
Date:   2016-10-04T16:32:52Z

[FLINK-3656] [table] Test base for logical unit testing




> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-10-04 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15544954#comment-15544954
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

Github user asfgit closed the pull request at:

https://github.com/apache/flink/pull/2567


> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-09-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15532882#comment-15532882
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

GitHub user twalthr opened a pull request:

https://github.com/apache/flink/pull/2567

[FLINK-3656] [table] Convert expression tests to unit tests

This PR replaces 6 ITCases with unit tests in `ScalarOperatorsTest`.

It reduces the build time by about 10 seconds.
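A sketch of what such a converted expression test can look like (hypothetical 
values; `testAllApis` and the two overrides are assumptions based on the 
`ExpressionTestBase` referenced elsewhere in this thread, imports omitted):

```scala
class ScalarOperatorsTest extends ExpressionTestBase {

  @Test
  def testArithmetic(): Unit = {
    // Table API expression, its String-API form, the SQL form, and the
    // expected result; each variant is evaluated against the test row below
    testAllApis('f0 + 42, "f0 + 42", "f0 + 42", "43")
  }

  // single test row and its type, as required by the test base
  override def testData: Any = Row.of(Int.box(1))

  override def typeInfo: TypeInformation[Any] =
    new RowTypeInfo(Types.INT).asInstanceOf[TypeInformation[Any]]
}
```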

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/twalthr/flink FLINK-3656_STEP_3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/2567.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2567


commit 851ae1189252bf359bdf4e0177c3f8968543a87a
Author: twalthr 
Date:   2016-09-29T12:54:36Z

[FLINK-3656] [table] Convert expression tests to unit tests




> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-09-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15532818#comment-15532818
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

Github user asfgit closed the pull request at:

https://github.com/apache/flink/pull/2566


> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-09-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15532332#comment-15532332
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

GitHub user twalthr opened a pull request:

https://github.com/apache/flink/pull/2566

[FLINK-3656] [table] Consolidate ITCases

This PR merges ITCases that belong together logically. It also ensures 
that all tests extend from TableProgramsTestBase.

It reduces the build time by 20 seconds.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/twalthr/flink FLINK-3656_STEP_2

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/2566.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2566


commit afe77695e852171bdff7f238e7bb5951eda5a8ac
Author: twalthr 
Date:   2016-09-29T08:20:35Z

Merge FilterIT/SelectIT to CalcITCases

commit 0f7f080dd058cf0fe2298d34510c6cf2a6bca09b
Author: twalthr 
Date:   2016-09-29T09:04:34Z

Merge FromDataSet/ToTable to TableEnvironmentITCases

commit 97b553a76f444db17dcbf734cbc043d46b1d9f9c
Author: twalthr 
Date:   2016-09-29T09:22:16Z

Merge aggregating ITCases

commit ed6a67058aa32d621266952f61383d61446464ed
Author: twalthr 
Date:   2016-09-29T09:29:49Z

All batch ITCases use TableProgramsTestBase




> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-09-29 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15532117#comment-15532117
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

Github user asfgit closed the pull request at:

https://github.com/apache/flink/pull/2563


> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-09-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15530087#comment-15530087
 ] 

ASF GitHub Bot commented on FLINK-3656:
---

GitHub user twalthr opened a pull request:

https://github.com/apache/flink/pull/2563

[FLINK-3656] [table] Reduce number of ITCases

This is the first step of reducing the number of ITCases of the Table API. 
This PR modifies the TableProgramsTestBase to test only the default 
configuration in the collection execution environment. Other configurations will 
be enabled individually per test.

It reduces the number of tests from 955 to 403.
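A rough sketch of the idea (hypothetical and simplified; the parameter plumbing 
of the actual test base differs in detail, imports omitted):

```scala
// By default an ITCase runs only once: collection execution with the
// default TableConfig. Tests needing more coverage add combinations.
class CalcITCase(mode: TestExecutionMode, configMode: TableConfigMode)
  extends TableProgramsTestBase(mode, configMode) {
  // test methods ...
}

object CalcITCase {
  @Parameterized.Parameters(name = "execution = {0}, config = {1}")
  def parameters(): java.util.Collection[Array[AnyRef]] =
    java.util.Arrays.asList(
      Array[AnyRef](TestExecutionMode.COLLECTION, TableConfigMode.DEFAULT))
}
```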

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/twalthr/flink FLINK-3656

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/2563.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2563


commit e13354a03fd6a1207a53894f2f58739326d804e6
Author: twalthr 
Date:   2016-09-28T15:23:18Z

[FLINK-3656] [table] Reduce number of ITCases




> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>  Labels: starter
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3656) Rework Table API tests

2016-06-23 Thread Timo Walther (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15346211#comment-15346211
 ] 

Timo Walther commented on FLINK-3656:
-

Many tests can be converted into tests extending {{ExpressionTestBase}}.

> Rework Table API tests
> --
>
> Key: FLINK-3656
> URL: https://issues.apache.org/jira/browse/FLINK-3656
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>
> The Table API tests are very inefficient. At the moment, they are mostly 
> end-to-end integration tests, often testing the same functionality several 
> times (Java/Scala, DataSet/DataStream).
> We should look into how we can rework the Table API tests such that:
> - long-running integration tests are converted into faster unit tests
> - common parts of DataSet and DataStream are only tested once
> - common parts of Java and Scala Table APIs are only tested once
> - duplicate tests are completely removed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)