Hi, I think Chesnay's suggestion to double-check the bundle makes sense. Additionally, I'd try flink-connector-jdbc_2.12 instead of flink-connector-jdbc_2.11.
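For reference, the corresponding dependency block would look like this (a sketch, assuming you stay on 1.14.2; the `_2.12` suffix must match the Scala version of your Flink distribution):

```xml
<!-- pom.xml fragment: same connector, built against Scala 2.12.
     The artifact suffix must match the Scala version of the Flink cluster. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-jdbc_2.12</artifactId>
  <version>1.14.2</version>
</dependency>
```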
Regards,
Roman

On Wed, Jan 12, 2022 at 12:23 PM Chesnay Schepler <[email protected]> wrote:
>
> I would try double-checking whether the jdbc connector was truly bundled
> in your jar, specifically whether
> org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory is.
>
> I can't think of a reason why this shouldn't work for the JDBC connector.
>
> On 12/01/2022 06:34, Ronak Beejawat (rbeejawa) wrote:
> > Hi Chesnay,
> >
> > > How do you ensure that the connector is actually available at runtime?
> >
> > We provide the dependency below in pom.xml with compile scope, so it is on
> > the classpath and bundled in my Flink job jar. We do the same for other
> > connectors, e.g. Kafka, and it works for those.
> >
> > <dependency>
> >   <groupId>org.apache.flink</groupId>
> >   <artifactId>flink-connector-jdbc_2.11</artifactId>
> >   <version>1.14.2</version>
> > </dependency>
> > <dependency>
> >   <groupId>mysql</groupId>
> >   <artifactId>mysql-connector-java</artifactId>
> >   <version>5.1.41</version>
> > </dependency>
> >
> > > Are you bundling it in a jar or putting it into Flink's lib directory?
> >
> > Yes, we are bundling it in the job jar, but we still saw this error. So we
> > tried the workaround mentioned in this article, which is to put the jar
> > inside the Flink lib directory, and after restarting the cluster it worked:
> > https://blog.csdn.net/weixin_44056920/article/details/118110949
> > So this is extra work we have to do to make it function.
> >
> > But the question is: why did it work for Kafka and not for JDBC?
> > I didn't put the Kafka jar explicitly into the Flink lib folder.
> >
> > Note: I am using the Flink 1.14 release (a stable version, I believe) for
> > all my job implementation and execution.
> >
> > Thanks
> > Ronak Beejawat
> >
> > From: Chesnay Schepler <[email protected]>
> > Date: Tuesday, 11 January 2022 at 7:45 PM
> > To: Ronak Beejawat (rbeejawa) <[email protected]>,
> > [email protected] <[email protected]>
> > Cc: Hang Ruan <[email protected]>, Shrinath Shenoy K (sshenoyk)
> > <[email protected]>, Karthikeyan Muthusamy (karmuthu)
> > <[email protected]>, Krishna Singitam (ksingita) <[email protected]>,
> > Arun Yadav (aruny) <[email protected]>, Jayaprakash Kuravatti (jkuravat)
> > <[email protected]>, Avi Sanwal (asanwal) <[email protected]>
> > Subject: Re: Could not find any factory for identifier 'jdbc'
> >
> > How do you ensure that the connector is actually available at runtime?
> > Are you bundling it in a jar or putting it into Flink's lib directory?
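One way to answer Chesnay's "was it truly bundled?" question mechanically is to list what the job jar advertises through Java SPI: Flink discovers table connectors via the META-INF/services/org.apache.flink.table.factories.Factory file inside the jar. Below is a minimal sketch; SpiCheck is a hypothetical helper class (not part of Flink), and the jar path is passed as the first argument:

```java
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// Hypothetical diagnostic: print the table factories a job jar advertises
// via Java SPI. If the jdbc factory class is missing from the services
// file, Flink cannot find the 'jdbc' identifier at runtime.
public class SpiCheck {
    public static final String SERVICES =
        "META-INF/services/org.apache.flink.table.factories.Factory";

    // Returns the contents of the SPI services file, or "" if absent.
    public static String readServices(File jar) throws IOException {
        try (JarFile jf = new JarFile(jar)) {
            JarEntry e = jf.getJarEntry(SERVICES);
            if (e == null) {
                return "";
            }
            try (InputStream in = jf.getInputStream(e)) {
                return new String(in.readAllBytes(), StandardCharsets.UTF_8);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        if (args.length == 0) {
            System.out.println("usage: java SpiCheck <path/to/job.jar>");
            return;
        }
        String factories = readServices(new File(args[0]));
        System.out.println(factories.isEmpty()
            ? "no table Factory services file found"
            : factories);
        // A correctly bundled jdbc connector should list
        // org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory here.
    }
}
```

If the printed list contains the Kafka factory but not JdbcDynamicTableFactory, the jar was assembled in a way that dropped or overwrote the jdbc connector's SPI entry.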
> >
> > On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
> >> Correcting subject -> Could not find any factory for identifier 'jdbc'
> >>
> >> From: Ronak Beejawat (rbeejawa)
> >> Sent: Tuesday, January 11, 2022 6:43 PM
> >> To: '[email protected]' <[email protected]>;
> >> '[email protected]' <[email protected]>;
> >> '[email protected]' <[email protected]>
> >> Cc: 'Hang Ruan' <[email protected]>; Shrinath Shenoy K (sshenoyk)
> >> <[email protected]>; Karthikeyan Muthusamy (karmuthu)
> >> <[email protected]>; Krishna Singitam (ksingita) <[email protected]>;
> >> Arun Yadav (aruny) <[email protected]>; Jayaprakash Kuravatti (jkuravat)
> >> <[email protected]>; Avi Sanwal (asanwal) <[email protected]>
> >> Subject: what is efficient way to write Left join in flink
> >>
> >> Hi Team,
> >>
> >> I am getting the below exception while using the jdbc connector:
> >>
> >> Caused by: org.apache.flink.table.api.ValidationException: Could not find
> >> any factory for identifier 'jdbc' that implements
> >> 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
> >>
> >> Available factory identifiers are:
> >>
> >> blackhole
> >> datagen
> >> filesystem
> >> kafka
> >> print
> >> upsert-kafka
> >>
> >> I have already added the dependency for the jdbc connector in pom.xml, as
> >> mentioned below:
> >>
> >> <dependency>
> >>   <groupId>org.apache.flink</groupId>
> >>   <artifactId>flink-connector-jdbc_2.11</artifactId>
> >>   <version>1.14.2</version>
> >> </dependency>
> >> <dependency>
> >>   <groupId>mysql</groupId>
> >>   <artifactId>mysql-connector-java</artifactId>
> >>   <version>5.1.41</version>
> >> </dependency>
> >>
> >> I referred to the release docs for this:
> >> https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
> >>
> >> Please help me with this and suggest a solution!
> >>
> >> Thanks
> >> Ronak Beejawat
> >
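A likely explanation for the Kafka-vs-JDBC asymmetry in this thread: when several connector jars are merged into one fat jar, their identically named META-INF/services files can overwrite each other, so only the connectors listed in the surviving file remain discoverable. If the job jar is built with maven-shade-plugin, the ServicesResourceTransformer concatenates those files instead. A sketch of the relevant plugin configuration, under the assumption that maven-shade-plugin is used for packaging:

```xml
<!-- Sketch: merge META-INF/services entries while shading, so every bundled
     connector's table factory remains visible to Flink's SPI lookup. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```

This would also explain why dropping the connector jar into Flink's lib directory works: there the services file is read from the connector jar directly, with no merging step involved.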
