Hi devs,

I got another report regarding configuring the v2 session catalog: when Spark
fails to instantiate the configured catalog, it simply logs an error message
without the exception information and silently falls back to the default
session catalog.

https://github.com/apache/spark/blob/3819d39607392aa968595e3d97b84fedf83d08d9/sql/catalyst/src/main/scala/org/apache/spark/sql/connector/catalog/CatalogManager.scala#L75-L95

IMO, since the user intentionally configured the session catalog, Spark
shouldn't fall back silently; it should just throw the exception. Otherwise
(if we still want the fallback), we should at least include the exception
information in the error log message.
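To illustrate the two options, here is a minimal self-contained Scala sketch (hypothetical names, not Spark's actual CatalogManager code): `loadOrThrow` fails fast and surfaces the original exception, while `loadOrFallback` keeps the fallback behavior but at least carries the exception into the log message.

```scala
// Hypothetical sketch of the two options discussed above; this is NOT
// Spark's actual code, just plain reflection to make the point runnable.
object CatalogLoadSketch {

  // Option 1: fail fast. The user explicitly configured this catalog,
  // so propagate the instantiation failure instead of hiding it.
  def loadOrThrow(className: String): AnyRef =
    Class.forName(className)
      .getDeclaredConstructor()
      .newInstance()
      .asInstanceOf[AnyRef]

  // Option 2: keep the fallback, but include the exception in the log
  // message so the failure is at least diagnosable. In Spark this would
  // be logError(msg, e) rather than println.
  def loadOrFallback(className: String, default: AnyRef): AnyRef =
    try {
      loadOrThrow(className)
    } catch {
      case e: Exception =>
        println(s"Failed to instantiate catalog $className, " +
          s"falling back to default session catalog: $e")
        default
    }

  def main(args: Array[String]): Unit = {
    val fallback = new Object
    // A nonexistent class triggers the fallback path.
    assert(loadOrFallback("no.such.CatalogClass", fallback) eq fallback)
    // A real class instantiates fine.
    assert(loadOrThrow("java.lang.Object") != null)
    println("ok")
  }
}
```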

I'd like to hear your thoughts.

Thanks,
Jungtaek Lim (HeartSaVioR)
