Hello,

I'm experimenting with exposing a set of mutable objects via Calcite so I
can connect to them and explore/modify the data with tools that speak JDBC,
like DataGrip. Thanks for Calcite! It looks like exactly what's needed.

So far I managed to set up an Avatica server that connects to Calcite using
my custom schema factories and tables. And ... queries work! Hurrah. I can
explore the data in DataGrip.

Now I want to implement DML so the data can be edited. Here, the road seems
less travelled. If anyone knows the answers to these questions I'd be
grateful:

1. Are there any examples of anyone doing this? None of the included
adapters seem to support writing to the data source, and CALCITE-2748
<https://issues.apache.org/jira/browse/CALCITE-2748> sounds a bit
discouraging. The built-in support only really amounts to calling addAll()
on a custom collection for INSERT, or removeAll() for DELETE. And indeed
when I try UPDATE I hit the assertion.

2. Therefore I think I need to implement my own TableModify subclass that
implements EnumerableRel, which will then construct a Java AST that will
emit method calls back into my custom Table class. Is that right?

3. To get my custom TableModify used, I think I need to write a planner
rule that substitutes my custom rel for LogicalTableModify. The docs show
how to write a rule, but there's no info on how to register it for use. The
enumerable rules seem to be hard-coded and registered by a fixed code path.
Does anyone know how to register a custom planner rule, to get a custom
TableModify rel into use, starting from a JdbcMeta that connects to the
jdbc:calcite: driver using a connection string? I've spent quite a bit of
time rummaging around, but there's no obvious path from a connection object
to anything that sounds like a planner rule registry.
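
The only registration point I've spotted so far is Hook.PLANNER, which
appears to fire per statement on the preparing thread. Is something like
this the intended way (sketch; MyTableModifyRule is my hypothetical rule,
and the URL is a placeholder)?

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.function.Consumer;

import org.apache.calcite.plan.RelOptPlanner;
import org.apache.calcite.runtime.Hook;

// Install the hook on the current thread before executing the statement;
// it fires when Calcite sets up the planner for a statement prepared here.
Hook.Closeable hook = Hook.PLANNER.addThread(
    (Consumer<RelOptPlanner>) planner ->
        planner.addRule(MyTableModifyRule.INSTANCE));  // hypothetical rule
try (Connection conn =
         DriverManager.getConnection("jdbc:calcite:model=...")) {
  // run the UPDATE here while the hook is installed
} finally {
  hook.close();
}
```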

4. When generating the code, it seems the results of the WHERE clause
arrive as an Enumerable. Is there a way to pass custom data through the
computed results, so that by the time a set of rows has been selected I can
map them back to the underlying objects the data originally came from? The
SET expressions are going to be translated into setXxx method calls, so I
need the object pointer. The objects do have unique IDs, so modifying the
query to include that as an additional column could also work, but it needs
to be transparent to the user.
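
To make (4) concrete, here's a plain-Java sketch (no Calcite involved;
Person and applySet are invented names) of what I mean by smuggling the id
through as a hidden column and mapping selected rows back to the originals:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RowMapping {
  /** A mutable domain object with a unique id. */
  public static final class Person {
    public final long id;
    public String name;
    public Person(long id, String name) { this.id = id; this.name = name; }
  }

  /**
   * Applies "SET name = newName" to the backing object whose id matches.
   * The projected row is assumed to carry the id as a hidden first column,
   * so a selected row can be mapped back to the object it came from.
   */
  public static Person applySet(List<Person> store, Object[] selectedRow,
      String newName) {
    Map<Long, Person> byId = new HashMap<>();
    for (Person p : store) {
      byId.put(p.id, p);                            // index by unique id
    }
    Person target = byId.get((Long) selectedRow[0]); // hidden id column
    target.name = newName;                           // translated setXxx call
    return target;
  }

  public static void main(String[] args) {
    List<Person> store =
        List.of(new Person(1, "Ann"), new Person(2, "Bob"));
    // Pretend the WHERE clause selected the row for id = 2; row = {id, name}
    Person updated = applySet(store, new Object[] {2L, "Bob"}, "Robert");
    System.out.println(updated.name);  // prints "Robert"
  }
}
```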

5. The underlying data layer has a notion of its own transactions. I'm
overriding JdbcMeta to implement creating and committing the underlying
transactions. Is that the right place to do it? There are a lot of
different layers in Calcite/Avatica and I'm not 100% sure I'm doing things
the right way.
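
Concretely, for (5) I'm doing roughly this (sketch; TxJdbcMeta and
MyTxManager are my own names, the latter being my wrapper around the data
layer's transactions):

```java
import java.sql.SQLException;

import org.apache.calcite.avatica.Meta.ConnectionHandle;
import org.apache.calcite.avatica.jdbc.JdbcMeta;

/** My abstraction over the data layer's own transactions. */
interface MyTxManager {
  void commit(String connectionId);
  void rollback(String connectionId);
}

/** JdbcMeta that ties the data layer's transactions to Avatica's. */
public class TxJdbcMeta extends JdbcMeta {
  private final MyTxManager tx;

  public TxJdbcMeta(String url, MyTxManager tx) throws SQLException {
    super(url);
    this.tx = tx;
  }

  @Override public void commit(ConnectionHandle ch) {
    tx.commit(ch.id);   // commit the underlying data-layer transaction
    super.commit(ch);   // then the Avatica/JDBC-level commit
  }

  @Override public void rollback(ConnectionHandle ch) {
    tx.rollback(ch.id);
    super.rollback(ch);
  }
}
```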

Any tips much appreciated!

thanks,
-mike
