Re: [I] SparkSessionCatalog with JDBC catalog: SHOW TABLES IN ... returns error but table exists in JDBC catalog [iceberg]

2024-05-30 Thread via GitHub
sivakanthavel-tigeranalytics commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2138905817 Hello @matepek @ajantha-bhat, need some suggestions! I am using org.apache.iceberg.spark.SparkSessionCatalog instead of SparkCatalog. I am able…
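For context, a minimal `SparkSessionCatalog` setup of the kind discussed in this thread might look like the following sketch (the JDBC URI and warehouse path are placeholders, not values taken from the thread):

```properties
# Replace Spark's built-in session catalog with Iceberg's SparkSessionCatalog,
# so non-Iceberg tables keep resolving through the ordinary session catalog
spark.sql.catalog.spark_catalog=org.apache.iceberg.spark.SparkSessionCatalog
spark.sql.catalog.spark_catalog.type=jdbc
spark.sql.catalog.spark_catalog.uri=jdbc:postgresql://db-host:5432/iceberg_catalog
spark.sql.catalog.spark_catalog.warehouse=s3://my-bucket/warehouse
spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
```

With this layout, Iceberg tables and plain Spark tables share the single `spark_catalog` name, which is exactly why the issue's `SHOW TABLES` delegation behavior matters.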

2024-04-05 Thread via GitHub
matepek commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2040974202 Thank you for the answers. > fundamentally that issue is the same Yes, I think I understand that. What surprises me is why, for that table-creation call stack, I…

2024-04-03 Thread via GitHub
ajantha-bhat commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2034277214 Namespace has to be created explicitly in Nessie as described in https://projectnessie.org/blog/namespace-enforcement/
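The enforcement described above means the namespace must exist before any table is created under it. A hedged Spark SQL sketch (the catalog name `nessie` and the table schema are illustrative, not from the thread):

```sql
-- Nessie does not implicitly create the namespace on table creation
CREATE NAMESPACE IF NOT EXISTS nessie.db;

-- With the namespace in place, table creation can succeed
CREATE TABLE nessie.db.events (id BIGINT, ts TIMESTAMP) USING iceberg;
```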

2024-04-03 Thread via GitHub
nastra commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2034239933 @matepek fundamentally that issue is the same as I described in https://github.com/apache/iceberg/issues/10003#issuecomment-2007780751. `SparkSessionCatalog` doesn't create a…

2024-03-30 Thread via GitHub
matepek commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2027977162 Not strictly related, but I'm kinda stuck with this: using SparkSessionCatalog with NessieCatalog, I cannot create an Iceberg table with `create or replace table …`
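The truncated statement presumably resembles something like the sketch below; the table name and columns are invented for illustration, and only the `CREATE OR REPLACE TABLE … USING iceberg` shape comes from the comment:

```sql
-- Hypothetical reconstruction of the failing statement; with Nessie,
-- this fails unless the db namespace already exists (see the note above)
CREATE OR REPLACE TABLE spark_catalog.db.my_table (
  id   BIGINT,
  data STRING
) USING iceberg;
```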

2024-03-30 Thread via GitHub
matepek commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2027948348 I'm working on something like [this](https://github.com/apache/iceberg/compare/main...matepek:iceberg:main). It would fix…

2024-03-20 Thread via GitHub
matepek commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2009415954 Okay, for DBT I sadly need the SparkSessionCatalog, as I suspected before. I tried almost everything; it's a pain otherwise. We had been using a REST catalog, so I'm surprised we…

2024-03-20 Thread via GitHub
nastra commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2008952584 > `spark.sql.catalog.spark_catalog.type` was configured to `jdbc` which was actually a mistake of mine. Yes, exactly, that's what I meant. That would use the JDBC catalog…

2024-03-19 Thread via GitHub
matepek commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2008666705 I see what you meant now. `spark.sql.catalog.spark_catalog.type` was configured to `jdbc`, which was actually a mistake of mine. But not defining the `spark_catalog`…

2024-03-19 Thread via GitHub
matepek commented on issue #10003: URL: https://github.com/apache/iceberg/issues/10003#issuecomment-2008652290 What do you mean by saying I'm using the JDBC catalog? I thought `spark.sql.catalogImplementation = hive` sets it to the Hive catalog. (I know I have a knowledge gap and I'm trying…
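For the knowledge gap raised here: `spark.sql.catalogImplementation` only selects the implementation of Spark's built-in session catalog (Hive metastore vs. in-memory), while Iceberg catalogs are wired up separately under `spark.sql.catalog.<name>.*` properties. A sketch of the distinction (the catalog name `my_catalog` and the metastore URI are placeholders):

```properties
# Spark's built-in session catalog, backed by a Hive metastore
spark.sql.catalogImplementation=hive

# A separate, named Iceberg catalog; its type is independent of the setting above
spark.sql.catalog.my_catalog=org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.my_catalog.type=hive
spark.sql.catalog.my_catalog.uri=thrift://metastore-host:9083
```

So setting `catalogImplementation=hive` does not prevent an Iceberg catalog configured with `type=jdbc` from using the JDBC-backed catalog, which is the confusion being untangled in this thread.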