Hi Roman, Chesnay,

PFB screenshots showing the jdbc connector's availability in the bundled jar. As I mentioned earlier, it still didn't work even then, so I tried putting the connector inside the Flink lib directory, as mentioned in the article linked below in the thread, and that resolved the issue.



[screenshots attached: contents of the bundled job jar showing the jdbc connector classes]
@Roman - I also tried with flink-connector-jdbc_2.12 and it didn't work.
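
One possibility I have not verified yet (purely an assumption on my side): if the bundle jar is built with the Maven shade plugin, the META-INF/services files of the different connectors can overwrite each other unless they are explicitly merged, which would also explain why the Kafka factory is found while the jdbc one is not. A minimal sketch of the shade configuration that merges them, assuming maven-shade-plugin is already used for packaging:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- Merge META-INF/services files so every connector's
                         table factory stays registered in the uber jar -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>

If a different packaging plugin is used, the equivalent merging of the service files would be needed there instead.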

Thanks
Ronak Beejawat
From: Roman Khachatryan <ro...@apache.org>
Date: Wednesday, 12 January 2022 at 6:57 PM
To: commun...@flink.apache.org
Cc: dev <dev@flink.apache.org>, Ronak Beejawat (rbeejawa) <rbeej...@cisco.com.invalid>, u...@flink.apache.org, Hang Ruan <ruanhang1...@gmail.com>, Shrinath Shenoy K (sshenoyk) <sshen...@cisco.com>, Karthikeyan Muthusamy (karmuthu) <karmu...@cisco.com>, Krishna Singitam (ksingita) <ksing...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jkura...@cisco.com>, Avi Sanwal (asanwal) <asan...@cisco.com>
Subject: Re: Could not find any factory for identifier 'jdbc'
Hi,

I think Chesnay's suggestion to double-check the bundle makes sense.
Additionally, I'd try flink-connector-jdbc_2.12 instead of
flink-connector-jdbc_2.11.
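
E.g., something along these lines (just a sketch; keeping the 1.14.2 version from your pom and leaving everything else unchanged):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc_2.12</artifactId>
    <version>1.14.2</version>
</dependency>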

Regards,
Roman

On Wed, Jan 12, 2022 at 12:23 PM Chesnay Schepler <ches...@apache.org> wrote:
>
> I would try double-checking whether the jdbc connector was truly bundled
> in your jar, specifically whether
> org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory is.
>
> I can't think of a reason why this shouldn't work for the JDBC connector.
>
> On 12/01/2022 06:34, Ronak Beejawat (rbeejawa) wrote:
> > Hi Chesnay,
> >
> > How do you ensure that the connector is actually available at runtime?
> >
> > We are providing the dependencies mentioned below in pom.xml with compile scope, so they are available on the classpath, and the jdbc connector was present in my Flink job's bundled jar. We do the same for other connectors, e.g. Kafka, and it worked for them.
> >
> > <dependency>
> >     <groupId>org.apache.flink</groupId>
> >     <artifactId>flink-connector-jdbc_2.11</artifactId>
> >     <version>1.14.2</version>
> > </dependency>
> > <dependency>
> >     <groupId>mysql</groupId>
> >     <artifactId>mysql-connector-java</artifactId>
> >     <version>5.1.41</version>
> > </dependency>
> >
> > Are you bundling it in a jar or putting it into Flink's lib directory?
> > Yes, we are building a jar and the connector is bundled in it, but we still saw this error. So we tried the workaround mentioned in an article, putting the connector jar inside the Flink lib directory, and then it worked: https://blog.csdn.net/weixin_44056920/article/details/118110949 . But this is extra work we have to do to make it function, including a restart of the cluster.
> >
> > But the question is: why did it work for Kafka and not for jdbc? I didn't put the Kafka jar explicitly in the Flink lib folder.
> >
> > Note: I am using Flink release 1.14 for all my job execution/implementation, which I believe is a stable version.
> >
> > Thanks
> > Ronak Beejawat
> > From: Chesnay Schepler <ches...@apache.org>
> > Date: Tuesday, 11 January 2022 at 7:45 PM
> > To: Ronak Beejawat (rbeejawa) <rbeej...@cisco.com.INVALID>, u...@flink.apache.org
> > Cc: Hang Ruan <ruanhang1...@gmail.com>, Shrinath Shenoy K (sshenoyk) <sshen...@cisco.com>, Karthikeyan Muthusamy (karmuthu) <karmu...@cisco.com>, Krishna Singitam (ksingita) <ksing...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jkura...@cisco.com>, Avi Sanwal (asanwal) <asan...@cisco.com>
> > Subject: Re: Could not find any factory for identifier 'jdbc'
> > How do you ensure that the connector is actually available at runtime?
> > Are you bundling it in a jar or putting it into Flink's lib directory?
> >
> > On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
> >> Correcting subject -> Could not find any factory for identifier 'jdbc'
> >>
> >> From: Ronak Beejawat (rbeejawa)
> >> Sent: Tuesday, January 11, 2022 6:43 PM
> >> To: 'dev@flink.apache.org' <dev@flink.apache.org>; 'commun...@flink.apache.org' <commun...@flink.apache.org>; 'u...@flink.apache.org' <u...@flink.apache.org>
> >> Cc: 'Hang Ruan' <ruanhang1...@gmail.com>; Shrinath Shenoy K (sshenoyk) <sshen...@cisco.com>; Karthikeyan Muthusamy (karmuthu) <karmu...@cisco.com>; Krishna Singitam (ksingita) <ksing...@cisco.com>; Arun Yadav (aruny) <ar...@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jkura...@cisco.com>; Avi Sanwal (asanwal) <asan...@cisco.com>
> >> Subject: what is efficient way to write Left join in flink
> >>
> >> Hi Team,
> >>
> >> I am getting the below exception while using the jdbc connector:
> >>
> >> Caused by: org.apache.flink.table.api.ValidationException: Could not find 
> >> any factory for identifier 'jdbc' that implements 
> >> 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
> >>
> >> Available factory identifiers are:
> >>
> >> blackhole
> >> datagen
> >> filesystem
> >> kafka
> >> print
> >> upsert-kafka
> >>
> >>
> >> I have already added the dependency for the jdbc connector in pom.xml, as mentioned below:
> >>
> >> <dependency>
> >>     <groupId>org.apache.flink</groupId>
> >>     <artifactId>flink-connector-jdbc_2.11</artifactId>
> >>     <version>1.14.2</version>
> >> </dependency>
> >> <dependency>
> >>     <groupId>mysql</groupId>
> >>     <artifactId>mysql-connector-java</artifactId>
> >>     <version>5.1.41</version>
> >> </dependency>
> >>
> >> I referred to the release documentation for the same:
> >> https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
> >>
> >>
> >>
> >> Please help me with this and suggest a solution.
> >>
> >>
> >> Thanks
> >> Ronak Beejawat
>
>
