[ 
https://issues.apache.org/jira/browse/FLINK-3226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15160358#comment-15160358
 ] 

ASF GitHub Bot commented on FLINK-3226:
---------------------------------------

Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/1679#discussion_r53905333
  
    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/table/test/ScalarFunctionsTest.scala ---
    @@ -0,0 +1,96 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one
    + * or more contributor license agreements.  See the NOTICE file
    + * distributed with this work for additional information
    + * regarding copyright ownership.  The ASF licenses this file
    + * to you under the Apache License, Version 2.0 (the
    + * "License"); you may not use this file except in compliance
    + * with the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.flink.api.table.test
    +
    +import org.apache.flink.api.common.typeinfo.BasicTypeInfo._
    +import org.apache.flink.api.common.typeinfo.TypeInformation
    +import org.apache.flink.api.scala.table._
    +import org.apache.flink.api.table.Row
    +import org.apache.flink.api.table.expressions.Expression
    +import org.apache.flink.api.table.parser.ExpressionParser
    +import org.apache.flink.api.table.test.utils.ExpressionEvaluator
    +import org.apache.flink.api.table.typeinfo.RowTypeInfo
    +import org.junit.Assert.assertEquals
    +import org.junit.Test
    +
    +class ScalarFunctionsTest {
    +
    +  @Test
    +  def testSubstring(): Unit = {
    +    testFunction(
    +      'f0.substring(2),
    +      "f0.substring(2)",
    +      "SUBSTRING(f0, 2)",
    +      "his is a test String.")
    +
    +    testFunction(
    +      'f0.substring(2, 5),
    +      "f0.substring(2, 5)",
    +      "SUBSTRING(f0, 2, 5)",
    +      "his i")
    +
    +    testFunction(
    +      'f0.substring(1, 'f7),
    +      "f0.substring(1, f7)",
    +      "SUBSTRING(f0, 1, f7)",
    +      "Thi")
    +  }
    +
    +  // ----------------------------------------------------------------------------------------------
    +
    +  def testFunction(
    +      expr: Expression,
    +      exprString: String,
    +      sqlExpr: String,
    +      expected: String): Unit = {
    +    val testData = new Row(8)
    +    testData.setField(0, "This is a test String.")
    +    testData.setField(1, true)
    +    testData.setField(2, 42.toByte)
    +    testData.setField(3, 43.toShort)
    +    testData.setField(4, 44.toLong)
    +    testData.setField(5, 4.5.toFloat)
    +    testData.setField(6, 4.6)
    +    testData.setField(7, 3)
    +
    +    val typeInfo = new RowTypeInfo(Seq(
    +      STRING_TYPE_INFO,
    +      BOOLEAN_TYPE_INFO,
    +      BYTE_TYPE_INFO,
    +      SHORT_TYPE_INFO,
    +      LONG_TYPE_INFO,
    +      FLOAT_TYPE_INFO,
    +      DOUBLE_TYPE_INFO,
    +      INT_TYPE_INFO)).asInstanceOf[TypeInformation[Any]]
    +
    +    val exprResult = ExpressionEvaluator.evaluate(testData, typeInfo, expr)
    +    assertEquals(expected, exprResult)
    +
    +    val exprStringResult = ExpressionEvaluator.evaluate(
    +      testData,
    +      typeInfo,
    +      ExpressionParser.parseExpression(exprString))
    +    assertEquals(expected, exprStringResult)
    +
    +    // TODO test SQL expression
    --- End diff --
    
    Once we have a SQL parser ready, I will resolve this TODO ;-)


> Translate optimized logical Table API plans into physical plans representing 
> DataSet programs
> ---------------------------------------------------------------------------------------------
>
>                 Key: FLINK-3226
>                 URL: https://issues.apache.org/jira/browse/FLINK-3226
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API
>            Reporter: Fabian Hueske
>            Assignee: Chengxiang Li
>
> This issue is about translating an (optimized) logical Table API (see 
> FLINK-3225) query plan into a physical plan. The physical plan is a 1-to-1 
> representation of the DataSet program that will be executed. This means:
> - Each Flink RelNode refers to exactly one Flink DataSet or DataStream 
> operator.
> - All (join and grouping) keys of Flink operators are correctly specified.
> - The expressions which are to be executed in user-code are identified.
> - All fields are referenced with their physical execution-time index.
> - Flink type information is available.
> - Optional: Add physical execution hints for joins
> The translation should be the final part of Calcite's optimization process.
> For this task we need to:
> - implement a set of Flink DataSet RelNodes. Each RelNode corresponds to one 
> Flink DataSet operator (Map, Reduce, Join, ...). The RelNodes must hold all 
> relevant operator information (keys, user-code expression, strategy hints, 
> parallelism).
> - implement rules to translate optimized Calcite RelNodes into Flink 
> RelNodes. We start with a straightforward mapping and later add rules that 
> merge several relational operators into a single Flink operator, e.g., merge 
> a join followed by a filter. Timo implemented some rules for the first SQL 
> implementation which can be used as a starting point.
> - Integrate the translation rules into the Calcite optimization process
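
The translation step described above can be illustrated with a deliberately simplified model. The following sketch is hypothetical and does not use the real Calcite or Flink APIs (the actual implementation extends Calcite's `RelNode` classes); all type names here (`LogicalNode`, `DataSetNode`, etc.) are invented for illustration. It shows the key invariant from the description: each logical node maps to exactly one physical DataSet operator, and physical nodes carry the operator information (keys, predicates) needed for execution.

```scala
// Hypothetical, simplified model of the logical-to-physical translation.
// NOT the real Calcite/Flink API; names are illustrative only.

// Optimized logical plan nodes (stand-ins for Calcite RelNodes).
sealed trait LogicalNode
case class LogicalScan(table: String) extends LogicalNode
case class LogicalFilter(input: LogicalNode, predicate: String) extends LogicalNode
case class LogicalJoin(left: LogicalNode, right: LogicalNode,
                       keys: Seq[Int]) extends LogicalNode

// Physical plan nodes: each corresponds to exactly one DataSet operator
// and holds all relevant operator information (keys, user-code expressions).
sealed trait DataSetNode
case class DataSetSource(table: String) extends DataSetNode
case class DataSetFilter(input: DataSetNode, predicate: String) extends DataSetNode
case class DataSetJoin(left: DataSetNode, right: DataSetNode,
                       keys: Seq[Int]) extends DataSetNode

// A 1-to-1 translation; in the real system this is a set of Calcite
// rules applied at the end of the optimization process.
def translate(node: LogicalNode): DataSetNode = node match {
  case LogicalScan(t)          => DataSetSource(t)
  case LogicalFilter(in, p)    => DataSetFilter(translate(in), p)
  case LogicalJoin(l, r, keys) => DataSetJoin(translate(l), translate(r), keys)
}
```

Later rules could then match on patterns such as a `DataSetJoin` whose input is a `DataSetFilter` and merge them into a single operator, as the description suggests for a join followed by a filter.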



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
