Github user alpinegizmo commented on a diff in the pull request:
https://github.com/apache/flink/pull/4013#discussion_r119360512
--- Diff: docs/dev/tableApi.md ---
@@ -25,32 +25,16 @@ specific language governing permissions and limitations
under the License.
-->
-**Table API and SQL are experimental features**
+Apache Flink features two relational APIs - the Table API and SQL - for
unified stream and batch processing. The Table API is a language-integrated
query API for Scala and Java that allows composing queries from relational
operators such as selection, filter, and join in a very intuitive way. Flink's
SQL support is based on [Apache Calcite](https://calcite.apache.org), which
implements the SQL standard. Queries specified in either interface have the
same semantics and specify the same result regardless of whether the input is a
batch input (DataSet) or a stream input (DataStream).
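To make the claim above concrete, here is a minimal sketch of the same query expressed once with the Table API and once with SQL. It is not part of this diff; it assumes the Scala batch Table API as of the Flink 1.3 line (method names such as `registerDataSet` and `sql` reflect that API version), and the `WordCount` table and sample data are invented for illustration.

```scala
import org.apache.flink.api.scala._
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.api.scala._

object WordCountTable {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    // a small batch DataSet of (word, frequency) pairs
    val input = env.fromElements(("hello", 1L), ("world", 1L), ("hello", 1L))

    // register the DataSet as table "WordCount" with fields 'word and 'frequency
    tEnv.registerDataSet("WordCount", input, 'word, 'frequency)

    // the query expressed with the Table API ...
    val tableResult = tEnv.scan("WordCount")
      .groupBy('word)
      .select('word, 'frequency.sum as 'frequency)

    // ... and the same query expressed in SQL
    val sqlResult = tEnv.sql(
      "SELECT word, SUM(frequency) AS frequency FROM WordCount GROUP BY word")

    // both describe the same logical result
    tableResult.toDataSet[(String, Long)].print()
    sqlResult.toDataSet[(String, Long)].print()
  }
}
```

Both queries go through the same Calcite-based optimization and produce the same result, which is what "same semantics" means in the paragraph above.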
-The Table API is a SQL-like expression language for relational stream and
batch processing that can be easily embedded in Flink's DataSet and DataStream
APIs (Java and Scala).
-The Table API and SQL interface operate on a relational `Table`
abstraction, which can be created from external data sources, or existing
DataSets and DataStreams. With the Table API, you can apply relational
operators such as selection, aggregation, and joins on `Table`s.
+The Table API and the SQL interfaces are tightly integrated with each
other as well as with Flink's DataStream and DataSet APIs. You can easily switch
between all APIs and libraries which build upon the APIs. For instance, you can
extract patterns from a DataStream using the [CEP library]({{ site.baseurl
}}/dev/libs/cep.html) and later use the Table API to analyze the patterns, or
you can scan, filter, and aggregate a batch table using a SQL query before running
a [Gelly graph algorithm]({{ site.baseurl }}/dev/libs/gelly) on the
preprocessed data.
-`Table`s can also be queried with regular SQL, as long as they are
registered (see [Registering Tables](#registering-tables)). The Table API and
SQL offer equivalent functionality and can be mixed in the same program. When a
`Table` is converted back into a `DataSet` or `DataStream`, the logical plan,
which was defined by relational operators and SQL queries, is optimized using
[Apache Calcite](https://calcite.apache.org/) and transformed into a `DataSet`
or `DataStream` program.
-
-**TODO: Check, update, and add**
-
-* What are the Table API / SQL
- * Relational APIs
- * Unified APIs for batch and streaming
- * Semantics are the same
- * But not all operations can be efficiently mapped to streams
- * Table API: language-integrated queries (LINQ) in Scala and Java
- * SQL: Standard SQL
-
-**Please notice: Not all operations are supported by all four combinations
of Stream/Batch and TableAPI/SQL.**
-
-* This will be replaced by the TOC
-{:toc}
+**Please note that the Table API and SQL are not feature complete yet and
are under active development. Not all operations are supported by each combination of
\[Table API, SQL\] and \[stream, batch\] input.**
Setup
-----
-The Table API and SQL are part of the *flink-table* Maven project.
+The Table API and SQL are bundled the `flink-table` Maven artifact.
--- End diff ---
... bundled in ...