[ https://issues.apache.org/jira/browse/FLINK-2828?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15258266#comment-15258266 ]
ASF GitHub Bot commented on FLINK-2828:
---------------------------------------
Github user vasia commented on a diff in the pull request:
https://github.com/apache/flink/pull/1939#discussion_r61105940
--- Diff: flink-libraries/flink-table/src/test/java/org/apache/flink/api/java/table/test/TableSourceITCase.java ---
@@ -0,0 +1,120 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.table.test;
+
+import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.java.DataSet;
+import org.apache.flink.api.java.ExecutionEnvironment;
+import org.apache.flink.api.java.table.BatchTableEnvironment;
+import org.apache.flink.api.scala.table.test.GeneratingInputFormat;
+import org.apache.flink.api.table.Row;
+import org.apache.flink.api.table.Table;
+import org.apache.flink.api.table.TableEnvironment;
+import org.apache.flink.api.table.sources.BatchTableSource;
+import org.apache.flink.api.table.test.utils.TableProgramsTestBase;
+import org.apache.flink.api.table.typeutils.RowTypeInfo;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+
+import java.util.List;
+
+
+@RunWith(Parameterized.class)
+public class TableSourceITCase extends TableProgramsTestBase {
+
+ public TableSourceITCase(TestExecutionMode mode, TableConfigMode configMode) {
+ super(mode, configMode);
+ }
+
+ @Test
+ public void testStreamTableSourceTableAPI() throws Exception {
+ ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
+ BatchTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env, config());
+
+ tableEnv.registerTableSource("MyTable", new TestBatchTableSource());
+
+ Table result = tableEnv.scan("MyTable")
+ .where("amount < 4")
+ .select("amount * id, name");
+
+ DataSet<Row> resultSet = tableEnv.toDataSet(result, Row.class);
+ List<Row> results = resultSet.collect();
+
+ String expected = "0,Record_0\n" + "0,Record_16\n" +
"0,Record_32\n" + "1,Record_1\n" +
+ "17,Record_17\n" + "36,Record_18\n" + "4,Record_2\n" +
"57,Record_19\n" + "9,Record_3\n";
+
+ compareResultAsText(results, expected);
+ }
+
+ @Test
+ public void testStreamTableSourceSQL() throws Exception {
--- End diff --
-> test*Batch*TableSourceSQL?
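
For context on the diff above: the test registers a TestBatchTableSource that is not visible in this part of the hunk. Below is a minimal sketch of what such a source might look like. The field names, the 33-record count, and the amount = id % 16 pattern are inferred from the expected results in the test; the TableSource method names (getDataSet, getReturnType, getFieldsNames, getFieldTypes, getNumberOfFields), the Row(arity)/setField calls, and the array-based RowTypeInfo constructor are assumptions about the flink-table API introduced by this PR, not the actual helper class.

{code}
// Hypothetical sketch only -- every name below that does not appear in the diff
// is an assumption, not the real TestBatchTableSource.
// (Would additionally need: import java.util.ArrayList;)
public static class TestBatchTableSource implements BatchTableSource<Row> {

	private final TypeInformation<?>[] fieldTypes = new TypeInformation<?>[] {
		BasicTypeInfo.STRING_TYPE_INFO,
		BasicTypeInfo.LONG_TYPE_INFO,
		BasicTypeInfo.INT_TYPE_INFO
	};

	@Override
	public DataSet<Row> getDataSet(ExecutionEnvironment execEnv) {
		// Generate 33 records: name = "Record_<id>", id = 0..32, amount = id % 16,
		// which matches the expected output of the Table API test above.
		List<Row> rows = new ArrayList<>();
		for (int i = 0; i < 33; i++) {
			Row row = new Row(3);            // assumption: Row(arity) + setField(...)
			row.setField(0, "Record_" + i);
			row.setField(1, (long) i);
			row.setField(2, i % 16);
			rows.add(row);
		}
		return execEnv.fromCollection(rows, getReturnType());
	}

	@Override
	public int getNumberOfFields() {
		return 3;
	}

	@Override
	public String[] getFieldsNames() {
		return new String[] {"name", "id", "amount"};
	}

	@Override
	public TypeInformation<?>[] getFieldTypes() {
		return fieldTypes;
	}

	@Override
	public TypeInformation<Row> getReturnType() {
		return new RowTypeInfo(fieldTypes);  // assumption: array-based constructor
	}
}
{code}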
> Add interfaces for Table API input formats
> ------------------------------------------
>
> Key: FLINK-2828
> URL: https://issues.apache.org/jira/browse/FLINK-2828
> Project: Flink
> Issue Type: New Feature
> Components: Table API
> Reporter: Timo Walther
> Assignee: Fabian Hueske
>
> In order to support input formats for the Table API, interfaces are
> necessary. I propose two types of TableSources:
> - AdaptiveTableSources can adapt their output to the requirements of the
> plan. Although the output schema stays the same, the TableSource can react
> to field resolution and/or predicates internally and can return adapted
> DataSet/DataStream versions in the "translate" step.
> - StaticTableSources are an easy way to provide the Table API with additional
> input formats without much implementation effort (e.g. for fromCsvFile()).
> TableSources need to be deeply integrated into the Table API.
> The TableEnvironment requires a newly introduced AbstractExecutionEnvironment
> (common super class of all ExecutionEnvironments for DataSets and
> DataStreams).
> Here's what a TableSource can see from more complicated queries:
> {code}
> getTableJava(tableSource1)
> .filter("a===5 || a===6")
> .select("a as a4, b as b4, c as c4")
> .filter("b4===7")
> .join(getTableJava(tableSource2))
> .where("a===a4 && c==='Test' && c4==='Test2'")
> // Result predicates for tableSource1:
> // List("a===5 || a===6", "b===7", "c==='Test2'")
> // Result predicates for tableSource2:
> // List("c==='Test'")
> // Result resolved fields for tableSource1 (true = filtering, false = selection):
> // Set(("a", true), ("a", false), ("b", true), ("b", false), ("c", false), ("c", true))
> // Result resolved fields for tableSource2 (true = filtering, false = selection):
> // Set(("a", true), ("c", true))
> {code}
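
To make the quoted description more concrete, here is a purely hypothetical sketch of what an AdaptiveTableSource contract along these lines could look like. The issue proposes the behaviour but does not define the interface, so every method name below is an assumption used only to illustrate the kind of information (predicate strings and (field, isFiltering) pairs) the example above shows being pushed to a source.

{code}
// Hypothetical illustration only; none of these names are defined by the issue.
public interface AdaptiveTableSource<T> {

	// Invoked during plan translation with the predicates that can be pushed
	// into this source, e.g. ["a===5 || a===6", "b===7", "c==='Test2'"].
	void notifyPredicates(List<String> predicates);

	// Invoked for every field the plan resolves against this source; the flag
	// marks whether the field is used for filtering (true) or selection (false),
	// e.g. ("a", true), ("a", false), ("b", true), ...
	void notifyResolvedField(String fieldName, boolean usedForFiltering);

	// Returns a DataSet that may already exploit the pushed-down information
	// (the "translate" step mentioned above).
	DataSet<T> translate(ExecutionEnvironment execEnv);
}
{code}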
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)