Github user rmetzger commented on a diff in the pull request:

    https://github.com/apache/flink/pull/503#discussion_r26975057
  
    --- Diff: docs/linq.md ---
    @@ -23,58 +23,91 @@ under the License.
     * This will be replaced by the TOC
     {:toc}
     
    -**Language-Integrated Queries are an experimental feature and can currently only be used with
    -the Scala API**
    +**Language-Integrated Queries are an experimental feature**
     
    -Flink provides an API that allows specifying operations using SQL-like expressions.
    -This Expression API can be enabled by importing
    -`org.apache.flink.api.scala.expressions._`.  This enables implicit conversions that allow
    -converting a `DataSet` or `DataStream` to an `ExpressionOperation` on which relational queries
    -can be specified. This example shows how a `DataSet` can be converted, how expression operations
    -can be specified and how an expression operation can be converted back to a `DataSet`:
    +Flink provides an API that allows specifying operations using SQL-like expressions. Instead of
    +manipulating a `DataSet` or `DataStream` you work with a `Table` on which relational operations
    +can be performed.
    +
    +The following dependency must be added to your project when using the Table API:
    +
    +{% highlight xml %}
    +<dependency>
    +  <groupId>org.apache.flink</groupId>
    +  <artifactId>flink-table</artifactId>
    +  <version>{{ site.FLINK_VERSION_SHORT }}</version>
    +</dependency>
    +{% endhighlight %}
    +
    +## Scala Table API
    +
    +The Table API can be enabled by importing `org.apache.flink.api.scala.table._`. This enables
    +implicit conversions that allow converting a `DataSet` or `DataStream` to a `Table`. This example
    +shows how a `DataSet` can be converted, how relational queries can be specified and how a `Table`
    +can be converted back to a `DataSet`:
     
     {% highlight scala %}
     import org.apache.flink.api.scala._
    -import org.apache.flink.api.scala.expressions._ 
    +import org.apache.flink.api.scala.table._ 
     
     case class WC(word: String, count: Int)
     val input = env.fromElements(WC("hello", 1), WC("hello", 1), WC("ciao", 1))
    -val expr = input.toExpression
    -val result = expr.groupBy('word).select('word, 'count.sum).as[WC]
    +val expr = input.toTable
    +val result = expr.groupBy('word).select('word, 'count.sum).toSet[WC]
     {% endhighlight %}
     
     The expression DSL uses Scala symbols to refer to field names and we use code generation to
     transform expressions to efficient runtime code. Please note that the conversion to and from
    -expression operations only works when using Scala case classes or Flink POJOs. Please check out
    +Tables only works when using Scala case classes or Flink POJOs. Please check out
     the [programming guide](programming_guide.html) to learn the requirements for a class to be
     considered a POJO.
      
     This is another example that shows how you
    -can join to operations:
    +can join two Tables:
     
     {% highlight scala %}
     case class MyResult(a: String, b: Int)
     
     val input1 = env.fromElements(...).as('a, 'b)
     val input2 = env.fromElements(...).as('c, 'd)
    -val joined = input1.join(input2).where('b == 'a && 'd > 42).select('a, 'd).as[MyResult]
    +val joined = input1.join(input2).where("b = a && d > 42").select("a, d").as[MyResult]
     {% endhighlight %}
     
    -Notice, how a `DataSet` can be converted to an expression operation by using `as` and specifying
    -new names for the fields. This can also be used to disambiguate fields before a join operation.
    +Notice how a `DataSet` can be converted to a `Table` by using `as` and specifying new
    +names for the fields. This can also be used to disambiguate fields before a join operation.
    +This example also shows that you can use Strings to specify relational expressions.
     
    -The Expression API can be used with the Streaming API, since we also have implicit conversions
    -to and from `DataStream`.
    +Please refer to the Scaladoc (and Javadoc) for a full list of supported operations and a
    +description of the expression syntax.
     
    -The following dependency must be added to your project when using the Expression API:
    +## Java Table API
     
    -{% highlight xml %}
    -<dependency>
    -  <groupId>org.apache.flink</groupId>
    -  <artifactId>flink-expressions</artifactId>
    -  <version>{{site.FLINK_VERSION_SHORT }}</version>
    -</dependency>
    +When using Java, Tables can be converted to and from DataSet and DataStream using class
    --- End diff --
    
    "using the class"

