[ https://issues.apache.org/jira/browse/FLINK-26437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17500151#comment-17500151 ]

Arindam Bhattacharjee commented on FLINK-26437:
-----------------------------------------------

[~straw] I have added the jar, but now I am getting the below exception:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
    at org.apache.flink.runtime.security.modules.HadoopModule.install(HadoopModule.java:67)
    at org.apache.flink.runtime.security.SecurityUtils.installModules(SecurityUtils.java:76)
    at org.apache.flink.runtime.security.SecurityUtils.install(SecurityUtils.java:57)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1131)

 

Also, I have already added the below jars to the Flink library:

 

commons-compiler-3.1.1.jar
flink-connector-jdbc_2.12-1.13.6.jar
flink-csv-1.13.6.jar
flink-dist_2.12-1.13.6.jar
flink-json-1.13.6.jar
flink-orc_2.12-1.13.6.jar
flink-parquet_2.12-1.13.6.jar
flink-shaded-zookeeper-3.4.14.jar
flink-sql-connector-kafka_2.12-1.13.6.jar
flink-streaming-scala_2.12-1.13.6.jar
flink-table-api-scala-bridge_2.12-1.13.6.jar
flink-table-blink_2.12-1.13.6.jar
flink-table-common-1.13.6.jar
flink-table-planner-blink_2.12-1.13.6.jar
flink-table-planner_2.12-1.13.6.jar
flink-table_2.12-1.13.6.jar
guava-30.1-jre.jar
hadoop-client-3.3.0.jar
hadoop-client-runtime-3.3.0.jar
hadoop-common-3.3.0.jar
hadoop-hdfs-3.3.0.jar
hadoop-hdfs-client-3.3.0.jar
kafka-clients-3.1.0.jar
log4j-1.2-api-2.17.1.jar
log4j-api-2.17.1.jar
log4j-core-2.17.1.jar
log4j-slf4j-impl-2.17.1.jar
mysql-connector-java-8.0.28.jar
parquet-column-1.12.2.jar
parquet-common-1.12.2.jar
parquet-format-2.9.0.jar
parquet-hadoop-1.12.2.jar
stax2-api-4.0.0.jar
woodstox-core-6.2.8.jar
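
Since the message says "Could not initialize class" rather than a plain class-not-found, I suspect the static initializer of UserGroupInformation failed once (probably because of a missing or conflicting transitive Hadoop dependency among the jars above) and the original cause is being swallowed. Below is a minimal probe I sketched to surface that root cause (my own hypothetical UgiProbe class, not part of Flink or Hadoop), to be compiled and run with the same jars on the classpath:

// Hypothetical diagnostic sketch (not part of Flink or Hadoop): force-load
// UserGroupInformation on the same classpath as the Flink library so the
// first failure, which carries the real root cause, gets printed instead of
// the bare "Could not initialize class" message.
public class UgiProbe {
    public static void main(String[] args) {
        try {
            Class.forName("org.apache.hadoop.security.UserGroupInformation");
            System.out.println("UserGroupInformation loaded and initialized OK");
        } catch (Throwable t) {
            // Expected to be an ExceptionInInitializerError (or NoClassDefFoundError)
            // whose cause names whatever is actually missing or conflicting.
            t.printStackTrace();
        }
    }
}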

 

Please help me. Thanks in advance.
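
One more note in case it is useful: for the original "Cannot discover a connector using option: 'connector'='jdbc'" error quoted below, the list of available factory identifiers comes from Java SPI discovery of org.apache.flink.table.factories.Factory implementations. A quick way to see which factories a given classpath actually exposes is a sketch like the following (my own hypothetical ListFactories class, not part of Flink):

import java.util.ServiceLoader;

import org.apache.flink.table.factories.Factory;

// Hypothetical diagnostic sketch: enumerate every table factory visible on the
// current classpath via the same ServiceLoader mechanism that Flink's FactoryUtil
// relies on, so a missing 'jdbc' entry points to a jar that is not really there.
public class ListFactories {
    public static void main(String[] args) {
        for (Factory factory : ServiceLoader.load(Factory.class)) {
            System.out.println(factory.factoryIdentifier() + " -> " + factory.getClass().getName());
        }
    }
}

If 'jdbc' does not show up in that output, the flink-connector-jdbc jar (or its META-INF/services entry) is not on the classpath the SQL client is actually using.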

> Cannot discover a connector using option: 'connector'='jdbc'
> ------------------------------------------------------------
>
>                 Key: FLINK-26437
>                 URL: https://issues.apache.org/jira/browse/FLINK-26437
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API
>    Affects Versions: 1.13.6
>            Reporter: Arindam Bhattacharjee
>            Priority: Major
>              Labels: sql-api, table-api
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Hi Team,
> When I was running SQL in the Flink SQL API, I was getting the below error:
> Caused by: org.apache.flink.table.api.ValidationException: Cannot discover a connector using option: 'connector'='jdbc'
>         at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:467)
>         at org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:441)
>         at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:167)
>         ... 32 more
> Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'jdbc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
> Available factory identifiers are:
> blackhole
> datagen
> filesystem
> kafka
> print
> upsert-kafka
>         at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:319)
>         at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:463)
>         ... 34 more
> ------------------------
>  
> SQL I was using - 
> CREATE TABLE pvuv_sink (
>   dt varchar PRIMARY KEY,
>   pv BIGINT,
>   uv BIGINT
> ) WITH (
>   'connector' = 'jdbc',
>   'url' = 'jdbc:mysql://localhost:3306/flinksql_test',
>   'table-name' = 'pvuv_sink',
>   'username' = 'root',
>   'password' = 'xxxxxx',
>   'sink.buffer-flush.max-rows' = '1'
> );



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
