fhueske commented on a change in pull request #10078: [FLINK-14486][table-api, docs] Update documentation regarding Temporary Objects
URL: https://github.com/apache/flink/pull/10078#discussion_r343197807
 
 

 ##########
 File path: docs/dev/table/common.md
 ##########
 @@ -292,25 +292,50 @@ b_b_t_env = BatchTableEnvironment.create(environment_settings=b_b_settings)
 
 **Note:** If there is only one planner jar in `/lib` directory, you can use `useAnyPlanner` (`use_any_planner` for python) to create specific `EnvironmentSettings`.
 
-
 {% top %}
 
-Register Tables in the Catalog
+Create Tables in the Catalog
 -------------------------------
 
-A `TableEnvironment` maintains a catalog of tables which are registered by name. There are two types of tables, *input tables* and *output tables*. Input tables can be referenced in Table API and SQL queries and provide input data. Output tables can be used to emit the result of a Table API or SQL query to an external system.
+A `TableEnvironment` maintains a map of catalogs of tables which are created with an identifier. Each
+identifier consists of 3 parts: catalog name, database name and object name. If a catalog or database is not
+specified, the current default value will be used (see examples in the [Table identifier expanding](#table-identifier-expanding) section).
+
+Tables can be either virtual (`VIEWS`) or regular (`TABLES`). `VIEWS` can be created from an
+existing `Table` object, usually the result of a Table API or SQL query. `TABLES` describe
+external data, such as a file, database, or messaging system.
+
+### Temporary vs permanent tables
+
 
-An input table can be registered from various sources:
 
-* an existing `Table` object, usually the result of a Table API or SQL query.
-* a `TableSource`, which accesses external data, such as a file, database, or messaging system.
-* a `DataStream` or `DataSet` from a DataStream (only for stream job) or DataSet (only for batch job translated from old planner) program. Registering a `DataStream` or `DataSet` is discussed in the [Integration with DataStream and DataSet API](#integration-with-datastream-and-dataset-api) section.
+Tables may either be temporary, and tied to the lifecycle of a single Flink session, or permanent,
+and visible across multiple Flink sessions and clusters.
 
-An output table can be registered using a `TableSink`.
+Permanent tables require a [catalog]({{ site.baseurl }}/dev/table/catalogs.html) (such as Hive)
 
 Review comment:
   "Hive" -> "Hive Metastore"
