Anandonzy edited a comment on pull request #87: URL: https://github.com/apache/bahir-flink/pull/87#issuecomment-673237162
Hello. I am currently using this GAV to connect to Kudu:

<!-- https://mvnrepository.com/artifact/org.apache.bahir/flink-connector-kudu -->
<dependency>
  <groupId>org.apache.bahir</groupId>
  <artifactId>flink-connector-kudu_2.11</artifactId>
  <version>1.0-csa1.2.0.0</version>
  <scope>compile</scope>
</dependency>

My SQL DDL:

-- source
CREATE TABLE logs_resource (
  content VARCHAR
) WITH (
  'connector' = 'kafka',
  'topic' = 'test',
  'properties.bootstrap.servers' = '0.0.0.0:9092',
  'properties.group.id' = 'test_ziyu_flink_sql1',
  'scan.startup.mode' = 'latest-offset',
  'format' = 'csv',
  'csv.ignore-parse-errors' = 'true',
  'csv.allow-comments' = 'true',
  'csv.field-delimiter' = ','
);

-- sink
CREATE TABLE kudu_sink (
  create_day INT,
  itime VARCHAR,
  ltype VARCHAR,
  dvid VARCHAR,
  server_ip VARCHAR
) WITH (
  'connector.type' = 'kudu',
  'kudu.masters' = '0.0.0.0',
  'kudu.table' = 'impala::kudu_flux.ziyu_test',
  'kudu.hash-columns' = 'itime',
  'kudu.primary-key-columns' = 'itime,create_day'
);

INSERT INTO kudu_sink (create_day, itime, ltype, dvid, server_ip)
SELECT create_day, itime, ltype, dvid, server_ip
FROM logs_resource,
  LATERAL TABLE(udtfOneColumnToMultiColumn(ParseLogUDF(content))) AS T(create_day, itime, ltype, dvid, server_ip)
GROUP BY itime, ltype, dvid, server_ip, create_day;

udtfOneColumnToMultiColumn and ParseLogUDF are my UDFs for parsing the log. My Flink version is 1.11.1. When I run this SQL there is no exception, but no data is inserted into my Kudu table. What should I do? Do you have an example of inserting into Kudu? Thanks.

----------------------------------------------------------------
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: [email protected]
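[Editor's note: not part of the original comment.] One way to narrow down where the rows are being lost is to bypass the UDFs and the GROUP BY and write straight from the source to the sink. This is a minimal debugging sketch reusing only the tables defined above; it assumes the Kafka messages are single-column CSV lines, and it does not attempt to parse them:

-- Sanity check (hypothetical): does the Kudu sink receive anything at all?
-- Uses 'itime' to hold the raw message and constant dummy values for the
-- remaining columns, since no parsing is done here.
INSERT INTO kudu_sink (create_day, itime, ltype, dvid, server_ip)
SELECT
  0,          -- dummy create_day
  content,    -- raw Kafka message stored in the primary-key column
  '', '', ''  -- dummy ltype, dvid, server_ip
FROM logs_resource;

If this variant does insert rows, the problem is likely in the UDF/GROUP BY pipeline (e.g. the UDTF producing no rows, or CSV parse errors being silently dropped because of 'csv.ignore-parse-errors'); if it inserts nothing either, the issue is in the sink configuration itself.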
