ExternalCatalog is an older interface and you probably shouldn't touch it; it
mostly exists to wrap the legacy Hive interface.

Almost all connectors are now built on the DSV2 interface, either via a
catalog or directly via DataSourceRegister. DataSourceRegister is mostly good
for backwards compatibility (to support spark.read.format(...).load()); for
new work it generally makes more sense to build a TableCatalog implementation
(supporting spark.table and spark.sql("SELECT * FROM catalog.db.table"); see
the rough sketch below). That said, you can implement both a
DataSourceRegister and a TableCatalog for the same datasource if you like.
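
Very roughly, a minimal read-only TableCatalog looks something like the
sketch below (Scala, Spark 3.x connector API). The package and class name
com.example.catalog.MyCatalog are made up for illustration; only the Spark
interfaces are real, and a real implementation would return Table objects
that also implement SupportsRead/SupportsWrite from loadTable():

    package com.example.catalog

    import java.util

    import org.apache.spark.sql.connector.catalog.{Identifier, Table, TableCatalog, TableChange}
    import org.apache.spark.sql.connector.expressions.Transform
    import org.apache.spark.sql.types.StructType
    import org.apache.spark.sql.util.CaseInsensitiveStringMap

    class MyCatalog extends TableCatalog {
      private var catalogName: String = _
      private var options: CaseInsensitiveStringMap = _

      // Called once by Spark with the options configured under
      // spark.sql.catalog.<name>.*
      override def initialize(name: String, opts: CaseInsensitiveStringMap): Unit = {
        catalogName = name
        options = opts
      }

      override def name(): String = catalogName

      // Resolves catalog.db.table references coming from spark.table(...) / SQL.
      override def loadTable(ident: Identifier): Table = {
        // Look the identifier up in your external system and return a Table
        // (typically one that also implements SupportsRead).
        throw new UnsupportedOperationException("lookup not implemented in this sketch")
      }

      override def listTables(namespace: Array[String]): Array[Identifier] = Array.empty

      // Read-only sketch: reject all DDL.
      override def createTable(
          ident: Identifier,
          schema: StructType,
          partitions: Array[Transform],
          properties: util.Map[String, String]): Table =
        throw new UnsupportedOperationException("read-only catalog")

      override def alterTable(ident: Identifier, changes: TableChange*): Table =
        throw new UnsupportedOperationException("read-only catalog")

      override def dropTable(ident: Identifier): Boolean = false

      override def renameTable(oldIdent: Identifier, newIdent: Identifier): Unit =
        throw new UnsupportedOperationException("read-only catalog")
    }

You register it with spark.sql.catalog.<name>=com.example.catalog.MyCatalog
(plus any spark.sql.catalog.<name>.* options, which Spark passes to
initialize()), after which spark.table("mycat.db.tbl") and
spark.sql("SELECT * FROM mycat.db.tbl") resolve through loadTable().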

On Mon, Mar 27, 2023 at 11:49 AM Alex Cruise <a...@cluonflux.com> wrote:

> On Fri, Mar 24, 2023 at 11:23 AM Alex Cruise <a...@cluonflux.com> wrote:
>
>> I found ExternalCatalog a few days ago and have been implementing one of
>> those, but it seems like DataSourceRegister / SupportsCatalogOptions is
>> another popular approach. I'm not sure offhand how they overlap/intersect
>> just yet.
>>
>
> I would love it if someone could comment on when implementing
> ExternalCatalog is a good idea, vs. other approaches. :)
>
> -0xe1a
>
>
