[ https://issues.apache.org/jira/browse/BAHIR-99?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16097676#comment-16097676 ]

ASF GitHub Bot commented on BAHIR-99:
-------------------------------------

Github user rmetzger commented on a diff in the pull request:

    https://github.com/apache/bahir-flink/pull/17#discussion_r128921875
  
    --- Diff: flink-connector-kudu/src/main/java/es/accenture/flink/Utils/CreateKuduTable.java ---
    @@ -0,0 +1,54 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package es.accenture.flink.Utils;
    +
    +import org.apache.kudu.ColumnSchema;
    +import org.apache.kudu.Schema;
    +import org.apache.kudu.Type;
    +import org.apache.kudu.client.CreateTableOptions;
    +import org.apache.kudu.client.KuduClient;
    +
    +import java.util.ArrayList;
    +import java.util.List;
    +
    +
    +public class CreateKuduTable {
    +    public static void main(String[] args) {
    +
    +        String tableName = ""; // TODO insert table name
    +        String host = "localhost";
    +
    +        KuduClient client = new KuduClient.KuduClientBuilder(host).build();
    +        try {
    +            List<ColumnSchema> columns = new ArrayList<>(2);
    +            columns.add(new ColumnSchema.ColumnSchemaBuilder("valueInt", Type.INT32)
    +                    .key(true)
    +                    .build());
    +            columns.add(new ColumnSchema.ColumnSchemaBuilder("valueString", Type.STRING)
    +                    .build());
    +            List<String> rangeKeys = new ArrayList<>();
    +            rangeKeys.add("valueInt");
    +            Schema schema = new Schema(columns);
    +            client.createTable(tableName, schema,
    +                    new CreateTableOptions().setRangePartitionColumns(rangeKeys).addHashPartitions(rangeKeys, 4));
    +            System.out.println("Table \"" + tableName + "\" created successfully");
    +        } catch (Exception e) {
    --- End diff --
    
    I would suggest just letting the main method throw exceptions.
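
    For illustration, a minimal sketch of that suggestion, reusing only the calls already present in the diff; it additionally needs an import of org.apache.kudu.client.KuduException, and the table name stays a placeholder:

        public static void main(String[] args) throws KuduException {

            String tableName = ""; // TODO insert table name
            String host = "localhost";

            KuduClient client = new KuduClient.KuduClientBuilder(host).build();
            try {
                List<ColumnSchema> columns = new ArrayList<>(2);
                columns.add(new ColumnSchema.ColumnSchemaBuilder("valueInt", Type.INT32)
                        .key(true)
                        .build());
                columns.add(new ColumnSchema.ColumnSchemaBuilder("valueString", Type.STRING)
                        .build());
                List<String> rangeKeys = new ArrayList<>();
                rangeKeys.add("valueInt");
                Schema schema = new Schema(columns);
                client.createTable(tableName, schema,
                        new CreateTableOptions().setRangePartitionColumns(rangeKeys).addHashPartitions(rangeKeys, 4));
                System.out.println("Table \"" + tableName + "\" created successfully");
            } finally {
                // close() can itself throw KuduException; main simply propagates it
                client.close();
            }
        }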


> Kudu connector to read/write from/to Kudu
> -----------------------------------------
>
>                 Key: BAHIR-99
>                 URL: https://issues.apache.org/jira/browse/BAHIR-99
>             Project: Bahir
>          Issue Type: New Feature
>          Components: Flink Streaming Connectors
>    Affects Versions: Flink-1.0
>            Reporter: Rubén Casado
>            Assignee: Rubén Casado
>             Fix For: Flink-Next
>
>
> Java library to integrate Apache Kudu and Apache Flink. The main goal is to
> be able to read/write data from/to Kudu using Flink's DataSet and DataStream
> APIs.
> Data flow patterns:
> Batch
>  - Kudu -> DataSet<RowSerializable> -> Kudu
>  - Kudu -> DataSet<RowSerializable> -> other source
>  - Other source -> DataSet<RowSerializable> -> other source
> Stream
>  - Other source -> DataStream <RowSerializable> -> Kudu
> Code is available at https://github.com/rubencasado/Flink-Kudu
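
For the streaming pattern above (other source -> DataStream<RowSerializable> -> Kudu), a minimal, hedged sketch of what such a flow can look like when written directly against the Flink and Kudu client APIs; SimpleKuduSink, the Tuple2 element type, and the master/table names are illustrative stand-ins, not the connector's actual RowSerializable-based classes:

    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
    import org.apache.kudu.client.Insert;
    import org.apache.kudu.client.KuduClient;
    import org.apache.kudu.client.KuduSession;
    import org.apache.kudu.client.KuduTable;
    import org.apache.kudu.client.PartialRow;

    public class StreamToKuduSketch {

        // Minimal sink writing (Integer, String) pairs into the two-column table from the diff above.
        // The actual connector exposes RowSerializable instead of Tuple2; this stub only illustrates the flow.
        public static class SimpleKuduSink extends RichSinkFunction<Tuple2<Integer, String>> {
            private final String masterHost;
            private final String tableName;
            private transient KuduClient client;
            private transient KuduSession session;
            private transient KuduTable table;

            public SimpleKuduSink(String masterHost, String tableName) {
                this.masterHost = masterHost;
                this.tableName = tableName;
            }

            @Override
            public void open(Configuration parameters) throws Exception {
                client = new KuduClient.KuduClientBuilder(masterHost).build();
                table = client.openTable(tableName);
                session = client.newSession();
            }

            @Override
            public void invoke(Tuple2<Integer, String> value) throws Exception {
                Insert insert = table.newInsert();
                PartialRow row = insert.getRow();
                row.addInt("valueInt", value.f0);
                row.addString("valueString", value.f1);
                session.apply(insert);
            }

            @Override
            public void close() throws Exception {
                if (session != null) session.close();
                if (client != null) client.close();
            }
        }

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            DataStream<Tuple2<Integer, String>> rows = env.fromElements(
                    Tuple2.of(1, "one"), Tuple2.of(2, "two"));
            rows.addSink(new SimpleKuduSink("localhost", "my_table")); // placeholder master address and table name
            env.execute("Stream to Kudu sketch");
        }
    }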



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
