[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15814723#comment-15814723 ]

ASF GitHub Bot commented on FLINK-5084:

Github user asfgit closed the pull request at: https://github.com/apache/flink/pull/2977

> Replace Java Table API integration tests by unit tests
> ------------------------------------------------------
>
> Key: FLINK-5084
> URL: https://issues.apache.org/jira/browse/FLINK-5084
> Project: Flink
> Issue Type: Task
> Components: Table API & SQL
> Reporter: Fabian Hueske
> Priority: Minor
>
> The Java Table API is a wrapper on top of the Scala Table API. Instead of operating directly with Expressions like the Scala API, the Java API accepts a String parameter which is parsed into Expressions.
> We could therefore replace the Java Table API ITCases by tests that check that the parsing step produces a valid logical plan.
> This could be done by creating two {{Table}} objects for an identical query, once with the Scala Expression API and once with the Java String API, and comparing the logical plans of both {{Table}} objects. Basically something like the following:
> {code}
> val ds1 = CollectionDataSets.getSmall3TupleDataSet(env).toTable(tEnv, 'a, 'b, 'c)
> val ds2 = CollectionDataSets.get5TupleDataSet(env).toTable(tEnv, 'd, 'e, 'f, 'g, 'h)
> val joinT1 = ds1.join(ds2).where('b === 'e).select('c, 'g)
> val joinT2 = ds1.join(ds2).where("b = e").select("c, g")
> val lPlan1 = joinT1.logicalPlan
> val lPlan2 = joinT2.logicalPlan
> Assert.assertEquals("Logical Plans do not match", lPlan1, lPlan2)
> {code}

-- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15814641#comment-15814641 ]

ASF GitHub Bot commented on FLINK-5084:

Github user twalthr commented on the issue: https://github.com/apache/flink/pull/2977

I will have a final look over the code and merge this.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15804244#comment-15804244 ]

ASF GitHub Bot commented on FLINK-5084:

Github user mtunique commented on the issue: https://github.com/apache/flink/pull/2977

@fhueske Done. Thanks.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15804138#comment-15804138 ]

ASF GitHub Bot commented on FLINK-5084:

Github user fhueske commented on the issue: https://github.com/apache/flink/pull/2977

Thanks for the update and rebasing @mtunique! The PR is good to merge (one file should be moved, see comment above).
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15804130#comment-15804130 ]

ASF GitHub Bot commented on FLINK-5084:

Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r94921927

Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/api/scala/batch/table/SetOperatorsValidationTest.scala

{code}
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.api.scala.batch.table
+
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.table.api.scala._
+import org.apache.flink.table.api.{TableEnvironment, ValidationException}
+import org.junit._
+
+class SetOperatorsValidationTest {
{code}

Review comment: This class should be moved into the `validation` package. Can be done before merging.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15803228#comment-15803228 ]

ASF GitHub Bot commented on FLINK-5084:

Github user mtunique commented on the issue: https://github.com/apache/flink/pull/2977

@fhueske I have addressed all of the comments and rebased.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15802831#comment-15802831 ]

ASF GitHub Bot commented on FLINK-5084:

Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r94871520

Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/SortValidationTest.scala

{code}
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.scala.{ExecutionEnvironment, _}
+import org.apache.flink.api.table.{Row, TableEnvironment, ValidationException}
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+
+class SortValidationTest(
+    mode: TestExecutionMode,
+    configMode: TableConfigMode)
+  extends TableProgramsTestBase(mode, configMode) {
{code}

Review comment: `TableProgramsTestBase` should be removed.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15802830#comment-15802830 ]

ASF GitHub Bot commented on FLINK-5084:

Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r94872519

Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/SortValidationTest.scala

{code}
+class SortValidationTest(
+    mode: TestExecutionMode,
+    configMode: TableConfigMode)
+  extends TableProgramsTestBase(mode, configMode) {
{code}

Review comment: `TableProgramsTestBase` should be removed.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15802829#comment-15802829 ]

ASF GitHub Bot commented on FLINK-5084:

Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r94872463

Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/CalcITCase.scala

{code}
@@ -109,36 +110,6 @@ class CalcITCase(
     TestBaseUtils.compareResultAsText(results.asJava, expected)
   }

-  @Test(expected = classOf[ValidationException])
{code}

Review comment: The test method `testAliasStarException()` should be moved to `CalcValidationTest` and be split up into individual methods.
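For illustration, a split-up along the lines suggested here could look like the sketch below. This is only a sketch: the method name and the specific invalid expression are assumptions for illustration, not the actual contents of the PR.

{code}
// Hypothetical sketch: one invalid case per test method, so a single
// expected exception cannot mask failures of the other cases.
class CalcValidationTest {

  @Test(expected = classOf[ValidationException])
  def testAliasWithStar(): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    // '* is assumed not to be a valid field alias here, so validation
    // should fail while the logical plan is built; no job is executed.
    CollectionDataSets.get3TupleDataSet(env).toTable(tEnv, '*, 'b, 'c)
  }
}
{code}

Because each case lives in its own method, a regression in one validation rule shows up as exactly one failing test instead of being hidden behind the first expected exception.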
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15772501#comment-15772501 ]

ASF GitHub Bot commented on FLINK-5084:

Github user mtunique commented on the issue: https://github.com/apache/flink/pull/2977

Hi @fhueske, I have fixed the comments. Maybe you can review the changes first; then I will rebase onto the master branch and refactor the directory structure. Splitting it into two steps might make the review easier.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767311#comment-15767311 ]

ASF GitHub Bot commented on FLINK-5084:

Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93437895

Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/AggregationsPlanTest.scala

{code}
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.common.operators.Order
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.batch.utils.{LogicalPlanFormatUtils, TableProgramsTestBase}
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.table.TableEnvironment
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
{code}

Review comment: Remove `@RunWith` annotation.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767308#comment-15767308 ]

ASF GitHub Bot commented on FLINK-5084:

Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93440307

Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/SortValidationTest.scala

{code}
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.scala.{ExecutionEnvironment, _}
+import org.apache.flink.api.table.{Row, TableEnvironment, ValidationException}
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
+class SortValidationTest(
+    mode: TestExecutionMode,
+    configMode: TableConfigMode)
+  extends TableProgramsTestBase(mode, configMode) {
{code}

Review comment: Do not extend `TableProgramsTestBase`. This is only necessary for ITCases.
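A validation test of the shape this review asks for needs neither the parameterized runner nor the ITCase base class, because validation fails while the logical plan is built and nothing is ever executed on a cluster. A minimal sketch follows; the test method and the assumed-invalid query are illustrative assumptions, not the actual PR content.

{code}
// Hypothetical sketch: plain JUnit test class with no @RunWith and
// no TableProgramsTestBase, since no Flink program is actually run.
class SortValidationTest {

  @Test(expected = classOf[ValidationException])
  def testOrderByOnNonExistingField(): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    val ds = CollectionDataSets.get3TupleDataSet(env).toTable(tEnv, 'a, 'b, 'c)

    // 'x is not a field of the table, so validation is expected to
    // throw here, before any job could be submitted.
    ds.orderBy('x.asc)
  }
}
{code}

Dropping the runner and the base class keeps these tests fast: they exercise only the parsing and validation step, which is exactly what the issue proposes to test instead of full integration runs.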
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767309#comment-15767309 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93440355 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/CalcValidationTest.scala --- @@ -0,0 +1,111 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import org.apache.flink.api.scala._ +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.scala.util.CollectionDataSets +import org.apache.flink.api.table.{Row, TableEnvironment, ValidationException} +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) +class CalcValidationTest( +mode: TestExecutionMode, +configMode: TableConfigMode) + extends TableProgramsTestBase(mode, configMode) { --- End diff -- Do not extend `TableProgramsTestBase`. This is only necessary for ITCases. > Replace Java Table API integration tests by unit tests > -- > > Key: FLINK-5084 > URL: https://issues.apache.org/jira/browse/FLINK-5084 > Project: Flink > Issue Type: Task > Components: Table API & SQL >Reporter: Fabian Hueske >Priority: Minor > > The Java Table API is a wrapper on top of the Scala Table API. > Instead of operating directly with Expressions like the Scala API, the Java > API accepts a String parameter which is parsed into Expressions. > We could therefore replace the Java Table API ITCases by tests that check > that the parsing step produces a valid logical plan. > This could be done by creating two {{Table}} objects for an identical query > once with the Scala Expression API and one with the Java String API and > comparing the logical plans of both {{Table}} objects. 
Basically something like the following:
> {code}
> val ds1 = CollectionDataSets.getSmall3TupleDataSet(env).toTable(tEnv, 'a, 'b, 'c)
> val ds2 = CollectionDataSets.get5TupleDataSet(env).toTable(tEnv, 'd, 'e, 'f, 'g, 'h)
> val joinT1 = ds1.join(ds2).where('b === 'e).select('c, 'g)
> val joinT2 = ds1.join(ds2).where("b = e").select("c, g")
> val lPlan1 = joinT1.logicalPlan
> val lPlan2 = joinT2.logicalPlan
> Assert.assertEquals("Logical Plans do not match", lPlan1, lPlan2)
> {code} -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767300#comment-15767300 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93437942 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/AggregationsValidationTest.scala --- @@ -0,0 +1,148 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import org.apache.flink.api.scala._ +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.scala.util.CollectionDataSets +import org.apache.flink.api.table.{TableEnvironment, ValidationException} +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) --- End diff -- Remove `@RunWith` annotation
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767307#comment-15767307 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93436751 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/CalcPlanTest.scala --- @@ -0,0 +1,394 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import java.sql.{Date, Time, Timestamp} + +import org.apache.flink.api.scala._ +import org.apache.flink.api.scala.batch.utils.{LogicalPlanFormatUtils, TableProgramsTestBase} +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.scala.util.CollectionDataSets +import org.apache.flink.api.table.expressions.Literal +import org.apache.flink.api.table._ +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) +class CalcPlanTest( +mode: TestExecutionMode, +configMode: TableConfigMode) + extends TableProgramsTestBase(mode, configMode) { --- End diff -- `CalcPlanTest` should not extend a class. `TableProgramsTestBase` starts a Flink Minicluster, which is quite expensive and only required for ITCases.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767306#comment-15767306 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93440247 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/JoinPlanTest.scala --- @@ -0,0 +1,283 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import org.apache.flink.api.scala._ +import org.apache.flink.api.scala.batch.utils.{LogicalPlanFormatUtils, TableProgramsTestBase} +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.scala.util.CollectionDataSets +import org.apache.flink.api.table.expressions.Literal +import org.apache.flink.api.table.TableEnvironment +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) +class JoinPlanTest( +mode: TestExecutionMode, +configMode: TableConfigMode) + extends TableProgramsTestBase(mode, configMode) { --- End diff -- Do not extend `TableProgramsTestBase`. This is only necessary for ITCases.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767297#comment-15767297 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93438092 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/JoinPlanTest.scala --- @@ -0,0 +1,283 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import org.apache.flink.api.scala._ +import org.apache.flink.api.scala.batch.utils.{LogicalPlanFormatUtils, TableProgramsTestBase} +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.scala.util.CollectionDataSets +import org.apache.flink.api.table.expressions.Literal +import org.apache.flink.api.table.TableEnvironment +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) --- End diff -- Remove `@RunWith` annotation
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767315#comment-15767315 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93438204 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/SortValidationTest.scala --- @@ -0,0 +1,55 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.scala.util.CollectionDataSets +import org.apache.flink.api.scala.{ExecutionEnvironment, _} +import org.apache.flink.api.table.{Row, TableEnvironment, ValidationException} +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) --- End diff -- Remove `@RunWith` annotation
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767303#comment-15767303 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93440220 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/CastingPlanTest.scala --- @@ -0,0 +1,130 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import org.apache.flink.api.scala._ +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.table.TableEnvironment +import org.apache.flink.api.table.Types._ +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) +class CastingPlanTest( +mode: TestExecutionMode, +configMode: TableConfigMode) + extends TableProgramsTestBase(mode, configMode) { --- End diff -- Do not extend `TableProgramsTestBase`. This is only necessary for ITCases.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767305#comment-15767305 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93438663 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/CalcITCase.scala --- @@ -424,6 +425,19 @@ class CalcITCase( val results = t.toDataSet[Row].collect() TestBaseUtils.compareResultAsText(results.asJava, expected) } + --- End diff -- Please remove the tests which have been moved to `CalcValidationTest` from this file.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767301#comment-15767301 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93437996 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/CalcValidationTest.scala --- @@ -0,0 +1,111 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import org.apache.flink.api.scala._ +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.scala.util.CollectionDataSets +import org.apache.flink.api.table.{Row, TableEnvironment, ValidationException} +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) --- End diff -- Remove `@RunWith` annotation
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767314#comment-15767314 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93439887 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/AggregationsPlanTest.scala --- @@ -0,0 +1,350 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.api.scala.batch.table + +import org.apache.flink.api.common.operators.Order +import org.apache.flink.api.scala._ +import org.apache.flink.api.scala.batch.utils.{LogicalPlanFormatUtils, TableProgramsTestBase} +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode +import org.apache.flink.api.scala.table._ +import org.apache.flink.api.scala.util.CollectionDataSets +import org.apache.flink.api.table.TableEnvironment +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode +import org.junit._ +import org.junit.runner.RunWith +import org.junit.runners.Parameterized + +@RunWith(classOf[Parameterized]) +class AggregationsPlanTest( +mode: TestExecutionMode, +configMode: TableConfigMode) + extends TableProgramsTestBase(mode, configMode) { --- End diff -- Do not extend `TableProgramsTestBase` or any other class. The class also does not need any constructor parameters. You can create a `TableEnvironment` also without a `TableConfig`: `TableEnvironment.getTableEnvironment(env)`.
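Taken together, the review comments in this thread suggest a plan test that is a plain JUnit class: no `TableProgramsTestBase` parent, no `@RunWith(classOf[Parameterized])`, no constructor parameters, and a `TableEnvironment` created via `TableEnvironment.getTableEnvironment(env)`. A minimal sketch, assuming the pre-1.x Flink Table API quoted throughout this thread (the class name `JoinStringExpressionTest` is illustrative, and the snippet needs flink-table and its test utilities on the classpath):

```scala
package org.apache.flink.api.scala.batch.table

import org.apache.flink.api.scala._
import org.apache.flink.api.scala.table._
import org.apache.flink.api.scala.util.CollectionDataSets
import org.apache.flink.api.table.TableEnvironment
import org.junit.{Assert, Test}

// No base class, no @RunWith, no constructor parameters: the test only
// builds and compares logical plans, so no Flink minicluster is needed.
class JoinStringExpressionTest {

  @Test
  def testJoinPlansMatch(): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    // A TableEnvironment can be created without a TableConfig.
    val tEnv = TableEnvironment.getTableEnvironment(env)

    val ds1 = CollectionDataSets.getSmall3TupleDataSet(env).toTable(tEnv, 'a, 'b, 'c)
    val ds2 = CollectionDataSets.get5TupleDataSet(env).toTable(tEnv, 'd, 'e, 'f, 'g, 'h)

    // The same query, once via Scala expressions and once via Java-style strings.
    val joinT1 = ds1.join(ds2).where('b === 'e).select('c, 'g)
    val joinT2 = ds1.join(ds2).where("b = e").select("c, g")

    Assert.assertEquals("Logical Plans do not match",
      joinT1.logicalPlan, joinT2.logicalPlan)
  }
}
```

Because the test never calls an execute method, it avoids the expensive minicluster startup that the reviewer flags as ITCase-only overhead.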
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767310#comment-15767310 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93440071 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/AggregationsValidationTest.scala ---
@@ -0,0 +1,148 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.table.{TableEnvironment, ValidationException}
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
+class AggregationsValidationTest(
+    mode: TestExecutionMode,
+    configMode: TableConfigMode)
  extends TableProgramsTestBase(mode, configMode) {
--- End diff --

Do not extend `TableProgramsTestBase` or any other class. The class also does not need any constructor parameters. You can create a `TableEnvironment` without a `TableConfig`: `TableEnvironment.getTableEnvironment(env)`.
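For the validation side, a sketch under the same assumptions (the failing expression here, a non-existent field {{'foo}}, is a hypothetical example, not taken from the PR):

{code}
// Sketch only; 'foo is a hypothetical non-existent field used to trigger validation.
package org.apache.flink.api.scala.batch.table

import org.apache.flink.api.scala._
import org.apache.flink.api.scala.table._
import org.apache.flink.api.scala.util.CollectionDataSets
import org.apache.flink.api.table.{TableEnvironment, ValidationException}
import org.junit.Test

class AggregationsValidationTest { // plain unit test, no parameterized base

  @Test(expected = classOf[ValidationException])
  def testNonExistingField(): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    CollectionDataSets.getSmall3TupleDataSet(env)
      .toTable(tEnv, 'a, 'b, 'c)
      .select('foo.sum) // 'foo does not exist, so validation must fail
  }
}
{code}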
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767312#comment-15767312 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93438060 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/CastingPlanTest.scala ---
@@ -0,0 +1,130 @@
+/* ... (standard Apache License, Version 2.0 header) ... */
+
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.table.TableEnvironment
+import org.apache.flink.api.table.Types._
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
--- End diff --

Remove the `@RunWith` annotation.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767298#comment-15767298 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93438117 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/JoinValidationTest.scala ---
@@ -0,0 +1,197 @@
+/* ... (standard Apache License, Version 2.0 header) ... */
+
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.table.{TableEnvironment, TableException, ValidationException}
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
--- End diff --

Remove the `@RunWith` annotation.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767313#comment-15767313 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93438154 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/SetOperatorsValidationTest.scala ---
@@ -0,0 +1,128 @@
+/* ... (standard Apache License, Version 2.0 header) ... */
+
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.table.{TableEnvironment, ValidationException}
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
--- End diff --

Remove the `@RunWith` annotation.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767299#comment-15767299 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93448930 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/AggregationsITCase.scala ---
@@ -400,5 +342,23 @@ class AggregationsITCase(
     TestBaseUtils.compareResultAsText(results.asJava, expected)
   }
+
+  @Test
+  def testPojoGrouping() {
--- End diff --

This test does not test a Table API aggregation or grouping. I think it can be removed, and the `MyPojo` class as well.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767316#comment-15767316 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93436582 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/CalcPlanTest.scala ---
@@ -0,0 +1,394 @@
+/* ... (standard Apache License, Version 2.0 header) ... */
+
+package org.apache.flink.api.scala.batch.table
+
+import java.sql.{Date, Time, Timestamp}
+
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.batch.utils.{LogicalPlanFormatUtils, TableProgramsTestBase}
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.table.expressions.Literal
+import org.apache.flink.api.table._
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
--- End diff --

Remove the `@RunWith` annotation.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767302#comment-15767302 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93440266 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/JoinValidationTest.scala ---
@@ -0,0 +1,197 @@
+/* ... (standard Apache License, Version 2.0 header) ... */
+
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.table.{TableEnvironment, TableException, ValidationException}
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
+class JoinValidationTest(
+    mode: TestExecutionMode,
+    configMode: TableConfigMode)
+  extends TableProgramsTestBase(mode, configMode) {
--- End diff --

Do not extend `TableProgramsTestBase`. This is only necessary for ITCases.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15767304#comment-15767304 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on a diff in the pull request: https://github.com/apache/flink/pull/2977#discussion_r93440286 --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/SetOperatorsValidationTest.scala ---
@@ -0,0 +1,128 @@
+/* ... (standard Apache License, Version 2.0 header) ... */
+
+package org.apache.flink.api.scala.batch.table
+
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
+import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
+import org.apache.flink.api.scala.table._
+import org.apache.flink.api.scala.util.CollectionDataSets
+import org.apache.flink.api.table.{TableEnvironment, ValidationException}
+import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
+import org.junit._
+import org.junit.runner.RunWith
+import org.junit.runners.Parameterized
+
+@RunWith(classOf[Parameterized])
+class SetOperatorsValidationTest(
+    mode: TestExecutionMode,
+    configMode: TableConfigMode)
+  extends TableProgramsTestBase(mode, configMode) {
--- End diff --

Do not extend `TableProgramsTestBase`. This is only necessary for ITCases.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15741703#comment-15741703 ] ASF GitHub Bot commented on FLINK-5084: --- Github user mtunique commented on the issue: https://github.com/apache/flink/pull/2977 @fhueske I have finished these tasks:
- Removed `./test/java/org/apache/flink/api/java/batch/ExplainTest`.
- Reimplemented all tests in `test/java/org/apache/flink/api/java/batch/table` in Scala and moved them to the `./test/scala/` directory.
- Moved the validation tests into new files.
- Added plan tests (some ported from the Java version, some new).
The code is ready for review.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15737298#comment-15737298 ] ASF GitHub Bot commented on FLINK-5084: --- Github user mtunique commented on the issue: https://github.com/apache/flink/pull/2977 Thanks @fhueske. I fully agree: it is necessary to split the tests into a validation and a plan test class. I will follow the suggestions.
[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests
[ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15735776#comment-15735776 ] ASF GitHub Bot commented on FLINK-5084: --- Github user fhueske commented on the issue: https://github.com/apache/flink/pull/2977 Thanks for working on this @mtunique! I would like to suggest the following. All test classes in `./test/java/org/apache/flink/api/java/batch/table` are split into a validation and a plan test class. All tests are implemented in Scala and moved to the `./test/scala/` directory.
- The validation tests contain the test methods that check for failing validation. These tests are all unit tests and are named like `CalcValidationTest`.
- The plan tests compare the logical plans of the string Table API and the expression Table API. These are also unit tests and are named like `CalcPlanTest`. I would not merge these tests with the execution tests of the expression Table API but keep them separate.
The file `./test/java/org/apache/flink/api/java/batch/ExplainTest` can be removed. It checks exactly the same as the Scala version of this test. What do you think? Best, Fabian
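The proposed split can be sketched as a pair of skeleton classes (illustrative only: the method names and bodies are assumptions, and only the naming convention comes from the comment above):

{code}
// Validation tests: assert that invalid string-API queries fail validation.
class CalcValidationTest {
  @Test(expected = classOf[ValidationException])
  def testSelectInvalidField(): Unit = {
    // build a string-API query that must throw ValidationException
  }
}

// Plan tests: assert that the string API and the expression API
// produce the same logical plan for the same query.
class CalcPlanTest {
  @Test
  def testSelect(): Unit = {
    // build the same query with both APIs and assertEquals on logicalPlan
  }
}
{code}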