[jira] [Commented] (SPARK-37648) Spark catalog and Delta tables

2022-11-18 Thread Michael F (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-37648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17635913#comment-17635913
 ] 

Michael F commented on SPARK-37648:
---

Any update here? This continues to be an issue in 3.3.1.

> Spark catalog and Delta tables
> --
>
> Key: SPARK-37648
> URL: https://issues.apache.org/jira/browse/SPARK-37648
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 3.1.2
> Environment: Spark version 3.1.2
> Scala version 2.12.10
> Hive version 2.3.7
> Delta version 1.0.0
>Reporter: Hanna Liashchuk
>Priority: Major
>
> I'm using Spark with Delta tables: the tables are created successfully, but 
> the catalog reports no columns for them.
> Steps to reproduce:
> 1. Start spark-shell 
> {code:java}
> spark-shell \
>   --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
>   --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog" \
>   --conf "spark.sql.legacy.parquet.int96RebaseModeInWrite=LEGACY"{code}
> 2. Create a Delta table
> {code:java}
> spark.range(10).write.format("delta").option("path", "tmp/delta").saveAsTable("delta"){code}
> 3. Verify that the table exists
> {code:java}
> spark.catalog.listTables.show{code}
> 4. Observe that no columns are listed
> {code:java}
> spark.catalog.listColumns("delta").show{code}
> This is critical for Delta integration with BI tools such as Power BI and 
> Tableau, which query the Spark catalog for metadata; they fail with errors 
> that no columns are found.
> Discussion can be found in Delta repository - 
> https://github.com/delta-io/delta/issues/695



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-37648) Spark catalog and Delta tables

2021-12-19 Thread Cheng Pan (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-37648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17462361#comment-17462361
 ] 

Cheng Pan commented on SPARK-37648:
---

This issue has been fixed in Apache Kyuubi (Incubating), 
[https://github.com/apache/incubator-kyuubi/pull/1476]

Kyuubi can be considered a more powerful Spark Thrift Server; it's worth a 
try.



