[ https://issues.apache.org/jira/browse/FLINK-1996?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15268940#comment-15268940 ]

ASF GitHub Bot commented on FLINK-1996:
---------------------------------------

Github user yjshen commented on the pull request:

    https://github.com/apache/flink/pull/1961#issuecomment-216575391
  
    Hi @fhueske, I've read through this PR and find the current API design a 
little weird.
    
    Please correct me if I've got something wrong: since we are outputting 
`Table`s, the schema is known at runtime, so why should we first create a 
type-agnostic `TableSink` and then configure it with specific names and types? 
What about
    ``` scala
    val t: Table = ...
    t.write().format("csv").option("delim", "|").option("path","/path/to/file")
    env.execute()
    ```
    and construct the `TableSink` when we are about to `execute()`? :)
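A minimal, Flink-free sketch of the builder-style write API suggested in the comment above; the names `TableWriter` and `SinkSpec` are hypothetical illustrations (not actual Flink classes), and the sink description is only materialized when `build()` is called, mirroring the idea of constructing the `TableSink` lazily at `execute()` time:

``` scala
// Hypothetical sink description, built only when the plan is finalized.
case class SinkSpec(format: String, options: Map[String, String])

// Fluent builder collecting the format and options, as in
// t.write().format("csv").option("delim", "|")...
class TableWriter {
  private var fmt: String = ""
  private val opts = scala.collection.mutable.Map[String, String]()

  def format(f: String): TableWriter = { fmt = f; this }
  def option(key: String, value: String): TableWriter = { opts(key) = value; this }

  // The concrete sink is constructed here, not up front, so the
  // schema known at this point could be used to pick names and types.
  def build(): SinkSpec = SinkSpec(fmt, opts.toMap)
}

val spec = new TableWriter()
  .format("csv")
  .option("delim", "|")
  .option("path", "/path/to/file")
  .build()
```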


> Add output methods to Table API
> -------------------------------
>
>                 Key: FLINK-1996
>                 URL: https://issues.apache.org/jira/browse/FLINK-1996
>             Project: Flink
>          Issue Type: Improvement
>          Components: Table API
>    Affects Versions: 0.9
>            Reporter: Fabian Hueske
>            Assignee: Fabian Hueske
>
> Tables need to be converted to DataSets (or DataStreams) to write them out. 
> It would be good to have a way to emit Table results directly for example to 
> print, CSV, JDBC, HBase, etc.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
