Hi Soheil,

The current implementation of JDBCInputFormat cannot automatically infer the
column types, and as far as I know there is no other way to do this.


If you want to implement such an input format, you will need to do the
inference work yourself, because it relies on the interfaces provided by JDBC
to get the table schema or the ResultSet metadata, rather than on Flink
interfaces.
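For what it's worth, the core of that inference could be sketched roughly as follows: read each column's SQL type code from ResultSetMetaData.getColumnType() and map it to the matching Flink type. The class and method names below are hypothetical, and the switch returns the names of the BasicTypeInfo constants as strings so the sketch compiles with only the JDK; in a real input format you would return the TypeInformation objects themselves and wrap them in a RowTypeInfo:

```java
import java.sql.Types;

// Hypothetical sketch: map JDBC SQL type codes (as returned by
// ResultSetMetaData.getColumnType) to the names of the matching Flink
// BasicTypeInfo constants. A real implementation would return the
// TypeInformation objects and build a RowTypeInfo from them.
public class JdbcTypeMapper {

    public static String toFlinkTypeName(int sqlType) {
        switch (sqlType) {
            case Types.CHAR:
            case Types.VARCHAR:
            case Types.LONGVARCHAR:
                return "STRING_TYPE_INFO";
            case Types.INTEGER:
                return "INT_TYPE_INFO";
            case Types.BIGINT:
                return "LONG_TYPE_INFO";
            case Types.FLOAT:
            case Types.DOUBLE:
                return "DOUBLE_TYPE_INFO";
            case Types.BOOLEAN:
                return "BOOLEAN_TYPE_INFO";
            default:
                throw new IllegalArgumentException(
                        "Unmapped SQL type code: " + sqlType);
        }
    }

    public static void main(String[] args) {
        // For a "select name, age from persons" result set, the metadata
        // would typically report VARCHAR and INTEGER:
        System.out.println(toFlinkTypeName(Types.VARCHAR)); // STRING_TYPE_INFO
        System.out.println(toFlinkTypeName(Types.INTEGER)); // INT_TYPE_INFO
    }
}
```

In practice you would loop over getColumnCount() columns of the metadata, collect one TypeInformation per column, and pass the resulting array to the RowTypeInfo constructor before calling setRowTypeInfo.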


Best,
Haibo

At 2019-07-04 03:20:30, "Soheil Pourbafrani" <soheil.i...@gmail.com> wrote:

Hi,


I use the following sample code to load data from a database into Flink DataSet:


DataSet<Row> dbData = env.createInput(
    JDBCInputFormat.buildJDBCInputFormat()
        .setDrivername("org.apache.derby.jdbc.EmbeddedDriver")
        .setDBUrl("jdbc:derby:memory:persons")
        .setQuery("select name, age from persons")
        .setRowTypeInfo(new RowTypeInfo(
            BasicTypeInfo.STRING_TYPE_INFO,
            BasicTypeInfo.INT_TYPE_INFO))
        .finish());


The problem is that some tables have many columns, so we need a more
straightforward way to tell Flink the column types.
So my question is: is there any way for Flink to infer the column types, so
there is no need to declare each column type one by one?
Does Flink provide some way to implement such an input format?
