The JdbcOutputFormat was originally meant for batch jobs.
It should be possible to use it for streaming jobs as well; however, you
should be aware that it is not integrated with Flink's checkpointing
mechanism, so you might end up with duplicate data in case of failures.

I also don't know if or how well it works with H2.
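If you do give it a try, here is a minimal sketch of wiring the (batch)
JDBCOutputFormat into a streaming job with H2. The table name, columns, and
H2 URL are made up for the example, and in a real job the Row stream would
come from the Kafka source instead of the hard-coded elements:

import org.apache.flink.api.java.io.jdbc.JDBCOutputFormat;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.types.Row;

public class H2SinkSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Two hard-coded rows stand in for the Kafka source here.
        Row first = new Row(2);
        first.setField(0, 1);
        first.setField(1, "hello");
        Row second = new Row(2);
        second.setField(0, 2);
        second.setField(1, "world");
        DataStream<Row> rows = env.fromElements(first, second);

        JDBCOutputFormat jdbcOutput = JDBCOutputFormat.buildJDBCOutputFormat()
                .setDrivername("org.h2.Driver")
                // DB_CLOSE_DELAY=-1 keeps the in-memory DB alive between connections.
                .setDBUrl("jdbc:h2:mem:test;DB_CLOSE_DELAY=-1")
                // The target table must already exist; name and columns are placeholders.
                .setQuery("INSERT INTO events (id, payload) VALUES (?, ?)")
                // Flush every record. By default records are buffered and only written
                // once a batch fills up (or on close), so with few records nothing may
                // show up in the database while a streaming job keeps running.
                .setBatchInterval(1)
                .finish();

        // Plain sink, not integrated with checkpointing: duplicates are possible
        // after a failure and recovery.
        rows.writeUsingOutputFormat(jdbcOutput);

        env.execute("h2 sink sketch");
    }
}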

Best, Fabian

2017-02-16 11:06 GMT+01:00 Punit Tandel <punit.tan...@ericsson.com>:

> Yes, I have been following the tutorials, and reading from H2 and writing
> to H2 works fine. The problem is that data coming from Kafka and written
> to the H2 engine does not seem to arrive, and no error is thrown while
> writing into the in-memory H2 database, so I cannot tell what the error
> is or why the data is not inserted.
>
> I have been trying to find the cause and looking for logs while Flink
> processes the operations, but I could not find any error being thrown at
> the time of writing the data. Is there anywhere I can check for logs?
>
> Thanks
>
> On 02/16/2017 01:10 AM, Ted Yu wrote:
>
> See the tutorial at the beginning of:
>
> flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormat.java
>
> Looks like plugging in "org.h2.Driver" should do.
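>
> A configuration along these lines should do it (the classes are
> org.apache.flink.api.java.io.jdbc.JDBCInputFormat and, depending on your
> Flink version, org.apache.flink.api.java.typeutils.RowTypeInfo with
> org.apache.flink.api.common.typeinfo.BasicTypeInfo; the table and columns
> below are just an example):
>
>   JDBCInputFormat h2Input = JDBCInputFormat.buildJDBCInputFormat()
>           .setDrivername("org.h2.Driver")
>           // DB_CLOSE_DELAY=-1 keeps the in-memory database alive.
>           .setDBUrl("jdbc:h2:mem:test;DB_CLOSE_DELAY=-1")
>           .setQuery("SELECT id, payload FROM events")
>           // Recent versions also require the row type of the result:
>           .setRowTypeInfo(new RowTypeInfo(
>                   BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO))
>           .finish();
>
> The same driver name and URL should work for JDBCOutputFormat when writing.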
>
> On Wed, Feb 15, 2017 at 4:59 PM, Punit Tandel <punit.tan...@ericsson.com>
> wrote:
>
>> Hi All
>>
>> Does flink jdbc support writing the data into H2 Database?
>>
>> Thanks
>> Punit
>>
>>
>
>
