Thanks for the hint! I should have mentioned, though, that I have to support at 
least one level of nested arrays or maps. The sink connector doesn't do that, 
so I didn't look into the setup effort it would require.
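
In case it helps to be concrete, here's roughly the direction I'm planning to 
try instead: walk each generated record's schema, bind the scalar fields to a 
PreparedStatement on a parent table, and push one level of array nesting into 
a child table keyed by the parent row. This is just an untested sketch; the 
table and column names are invented, in reality they'd come from my schemata:

// Rough, untested sketch: flatten an Avro specific record into a parent
// table plus one child table per array field. Table/column names here are
// invented for illustration. Assumes fields are not nullable unions
// (["null", T] fields would need unwrapping first).
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import org.apache.avro.Schema;
import org.apache.avro.specific.SpecificRecord;

public class AvroJdbcWriter {

    private final Connection conn;

    public AvroJdbcWriter(Connection conn) {
        this.conn = conn;
    }

    // Writes scalar fields into a table named after the record, and each
    // array element into "<record>_<field>" keyed by the parent id.
    public void write(SpecificRecord record, long parentId) throws SQLException {
        Schema schema = record.getSchema();
        List<Schema.Field> scalars = new ArrayList<>();
        List<Schema.Field> arrays = new ArrayList<>();
        for (Schema.Field f : schema.getFields()) {
            if (f.schema().getType() == Schema.Type.ARRAY) {
                arrays.add(f);
            } else {
                scalars.add(f);
            }
        }

        // Build the parent-table INSERT from the scalar field names.
        StringBuilder cols = new StringBuilder("id");
        StringBuilder marks = new StringBuilder("?");
        for (Schema.Field f : scalars) {
            cols.append(", ").append(f.name());
            marks.append(", ?");
        }
        String sql = "INSERT INTO " + schema.getName()
                   + " (" + cols + ") VALUES (" + marks + ")";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, parentId);
            int i = 2;
            for (Schema.Field f : scalars) {
                // setObject defers type mapping to the driver; Avro's Utf8
                // strings need toString() before binding to VARCHAR.
                Object v = record.get(f.pos());
                ps.setObject(i++, v instanceof CharSequence ? v.toString() : v);
            }
            ps.executeUpdate();
        }

        // One level of nesting: each array goes into its own child table.
        for (Schema.Field f : arrays) {
            String childSql = "INSERT INTO " + schema.getName() + "_" + f.name()
                            + " (parent_id, value) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(childSql)) {
                for (Object elem : (List<?>) record.get(f.pos())) {
                    ps.setLong(1, parentId);
                    ps.setObject(2, elem instanceof CharSequence
                                    ? elem.toString() : elem);
                    ps.addBatch();
                }
                ps.executeBatch();
            }
        }
    }
}

Maps would get the same child-table treatment with an extra key column next 
to the value; since the nesting only ever goes one level deep, that seems 
manageable.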


> On 19 Dec 2016, at 20:35, John McClean <[email protected]> wrote:
> 
> I've never done this, but if I were to try I'd start by looking at Confluent's 
> JDBC sink connector.
> 
> http://docs.confluent.io/3.1.1/connect/connect-jdbc/docs/index.html
> 
> It's going to have all the moving parts you need. The question will be 
> whether it's too heavyweight and requires too much setup. Not having used 
> it, I don't know.
> 
> J
> 
> On Mon, Dec 19, 2016 at 10:49 AM, tl <[email protected]> wrote:
> Hi,
> 
> I wrote a small tool[0] that converts network monitoring data from a 
> homegrown format to Avro, Parquet and JSON, using Avro 1.8.1 with the 
> 'specific' API (with code generation).
> 
> Now I need to import the same data into an RDBMS (a Derby derivative) through 
> JDBC. The database schema closely mimics the Avro schemata.
> 
> I wonder if there’s an easy way to hook into the machinery that I already 
> built and make it talk to an RDBMS too. Does this sound reasonable? Has it 
> been done before? Any hints on how to proceed or where to look?
> 
> Thanks,
> Thomas
> 
> 
> 
> [0] https://github.com/tomlurge/converTor