[beam-site] 01/02: Update SQL doc to match new APIs

2018-03-20 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 7bc10c7e44469d97bd0b1df25ff851882df03c77
Author: akedin 
AuthorDate: Fri Mar 2 11:16:18 2018 -0800

Update SQL doc to match new APIs

BeamSql.querySimple() and queryMulti() were combined into query().
BeamRecord was renamed to Row. Factory methods and builders were added to 
it.
---
 src/documentation/dsls/sql.md | 337 --
 1 file changed, 226 insertions(+), 111 deletions(-)

diff --git a/src/documentation/dsls/sql.md b/src/documentation/dsls/sql.md
index 2f6fafc..a6289dc 100644
--- a/src/documentation/dsls/sql.md
+++ b/src/documentation/dsls/sql.md
@@ -7,104 +7,145 @@ permalink: /documentation/dsls/sql/
 
 # Beam SQL
 
-* TOC
-{:toc}
-
 This page describes the implementation of Beam SQL, and how to simplify a Beam 
pipeline with DSL APIs.
 
 ## 1. Overview {#overview}
 
-SQL is a well-adopted standard to process data with concise syntax. With DSL 
APIs (currently available only in Java), now `PCollection`s can be queried with 
standard SQL statements, like a regular table. The DSL APIs leverage [Apache 
Calcite](http://calcite.apache.org/) to parse and optimize SQL queries, then 
translate into a composite Beam `PTransform`. In this way, both SQL and normal 
Beam `PTransform`s can be mixed in the same pipeline.
+SQL is a well-adopted standard for processing data with concise syntax. With 
the DSL APIs (currently available only in Java), `PCollections` can now be 
queried with standard SQL statements, much like regular tables. The DSL APIs 
leverage [Apache Calcite](http://calcite.apache.org/) to parse and optimize SQL 
queries, then translate them into a composite Beam `PTransform`. In this way, 
SQL and regular Beam `PTransforms` can be mixed in the same pipeline.
 
 There are two main pieces to the SQL DSL API:
 
-* [BeamRecord]({{ site.baseurl }}/documentation/sdks/javadoc/{{ 
site.release_latest }}/index.html?org/apache/beam/sdk/values/BeamRecord.html): 
a new data type used to define composite records (i.e., rows) that consist of 
multiple, named columns of primitive data types. All SQL DSL queries must be 
made against collections of type `PCollection<BeamRecord>`. Note that 
`BeamRecord` itself is not SQL-specific, however, and may also be used in 
pipelines that do not utilize SQL.
-* [BeamSql]({{ site.baseurl }}/documentation/sdks/javadoc/{{ 
site.release_latest 
}}/index.html?org/apache/beam/sdk/extensions/sql/BeamSql.html): the interface 
for creating `PTransforms` from SQL queries.
+* [BeamSql]({{ site.baseurl }}/documentation/sdks/javadoc/{{ 
site.release_latest 
}}/index.html?org/apache/beam/sdk/extensions/sql/BeamSql.html): the interface 
for creating `PTransforms` from SQL queries;
+* [Row]({{ site.baseurl }}/documentation/sdks/javadoc/{{ site.release_latest 
}}/index.html?org/apache/beam/sdk/values/Row.html): a data type containing 
named columns with corresponding data types. Beam SQL queries can be made only 
against collections of type `PCollection<Row>`.
 
 We'll look at each of these below.
 
 ## 2. Usage of DSL APIs {#usage}
 
-### BeamRecord
+### Row
 
-Before applying a SQL query to a `PCollection`, the data in the collection 
must be in `BeamRecord` format. A `BeamRecord` represents a single, immutable 
row in a Beam SQL `PCollection`. The names and types of the fields/columns in 
the record are defined by its associated [BeamRecordType]({{ site.baseurl 
}}/documentation/sdks/javadoc/{{ site.release_latest 
}}/index.html?org/apache/beam/sdk/values/BeamRecordType.html); for SQL queries, 
you should use the [BeamRecordSqlType]({{ site.baseurl [...]
+Before applying a SQL query to a `PCollection`, the data in the collection 
must be in `Row` format. A `Row` represents a single, immutable record in a 
Beam SQL `PCollection`. The names and types of the fields/columns in the row 
are defined by its associated [RowType]({{ site.baseurl 
}}/documentation/sdks/javadoc/{{ site.release_latest 
}}/index.html?org/apache/beam/sdk/values/RowType.html).
+For SQL queries, you should use [RowSqlType.builder()]({{ site.baseurl 
}}/documentation/sdks/javadoc/{{ site.release_latest 
}}/index.html?org/apache/beam/sdk/extensions/sql/RowSqlType.html) to create 
`RowTypes`; it allows creating schemas with all supported SQL types (see [Data 
Types](#data-types) for more details on supported primitive data types).
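As a sketch of the builder pattern just described (the field names and the specific `with...Field` builder methods below are illustrative assumptions, not taken verbatim from the Javadoc):

```java
import org.apache.beam.sdk.extensions.sql.RowSqlType;
import org.apache.beam.sdk.values.RowType;

// Hypothetical schema: an "app" record with three SQL-typed columns.
// Builder method names are assumed from the RowSqlType API referenced above.
RowType appType = RowSqlType
    .builder()
    .withIntegerField("appId")
    .withVarcharField("description")
    .withTimestampField("rowtime")
    .build();
```

The resulting `RowType` carries both the column names and their SQL types, and is what the SQL planner uses to validate queries against the collection.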
 
 
-A `PCollection<BeamRecord>` can be created explicitly or implicitly:
+A `PCollection<Row>` can be obtained in multiple ways, for example:
 
-Explicitly:
-  * **From in-memory data** (typically for unit testing). In this case, the 
record type and coder must be specified explicitly:
-```
+  * **From in-memory data** (typically for unit testing).
+
+**Note:** you have to explicitly specify the `Row` coder. In this example 
we're doing it by calling `Create.of(..).withCoder()`:
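The code example that followed at this point is truncated in this message; a sketch of what it likely showed, assuming the `Row` and `Create` APIs described above (`appType` is a `RowType` built elsewhere, and `p` is an existing `Pipeline`, both assumptions):

```java
import java.util.Date;

import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PBegin;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// Build a single concrete Row conforming to the hypothetical appType schema.
Row row = Row
    .withRowType(appType)
    .addValues(1, "Some cool app", new Date())
    .build();

// Create a source PCollection containing only that row, explicitly
// specifying the Row coder via withCoder() as the note above requires.
PCollection<Row> testApps = PBegin
    .in(p)
    .apply(Create
        .of(row)
        .withCoder(appType.getRowCoder()));
```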
+
+
