[ 
https://issues.apache.org/jira/browse/SPARK-35356?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

yikf updated SPARK-35356:
-------------------------
    Description: 
When the externalCatalog is an InMemoryCatalog, creating the same table from a 
different SparkSession fails with the error below:
{code:java}
sparkSession.sql("create table if not exists t1 (id Int) using orc")
sparkSession.sql("insert into t1 values(1)")

sparkSession2.sql("create table if not exists t1 (id Int) using orc") // will 
report error as follow
{code}
 
{code:java}
Can not create the managed table('`default`.`t1`'). The associated 
location('file:***/t1') already exists.
{code}
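The report does not show how sparkSession2 was obtained, so here is a minimal, self-contained repro sketch under stated assumptions: sparkSession and sparkSession2 are two independently built SparkSessions that share the same warehouse directory but each carry their own (empty at startup) InMemoryCatalog. Clearing the active/default session is just one way to get such a second session inside a single JVM; running the same statements in a second spark-shell against the same warehouse shows the same behavior. Names and configs are illustrative, not taken from the report.
{code:scala}
import org.apache.spark.sql.SparkSession

object Spark35356Repro {
  def main(args: Array[String]): Unit = {
    // First session: table metadata lives only in this session's InMemoryCatalog,
    // but the ORC files are written under the shared warehouse directory.
    val sparkSession = SparkSession.builder()
      .master("local[*]")
      .config("spark.sql.catalogImplementation", "in-memory")
      .getOrCreate()
    sparkSession.sql("create table if not exists t1 (id Int) using orc")
    sparkSession.sql("insert into t1 values(1)")

    // One way (assumption) to obtain a second, independent SparkSession whose
    // InMemoryCatalog starts empty while the warehouse directory is unchanged.
    SparkSession.clearActiveSession()
    SparkSession.clearDefaultSession()
    val sparkSession2 = SparkSession.builder()
      .master("local[*]")
      .config("spark.sql.catalogImplementation", "in-memory")
      .getOrCreate()

    // The new catalog has no entry for t1, so IF NOT EXISTS does not
    // short-circuit, and creating the managed table fails because its
    // location under the warehouse directory already exists.
    sparkSession2.sql("create table if not exists t1 (id Int) using orc")

    sparkSession2.stop()
  }
}
{code}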

  was:
When the externalCatalog is an InMemoryCatalog, creating the same table from a 
different SparkSession fails with the error below:
{code:java}
Can not create the managed table('`default`.`t1`'). The associated 
location('file:***/t1') already exists.
{code}


> Fix issue of the createTable when externalCatalog is InMemoryCatalog
> --------------------------------------------------------------------
>
>                 Key: SPARK-35356
>                 URL: https://issues.apache.org/jira/browse/SPARK-35356
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: yikf
>            Priority: Minor
>
> When the externalCatalog is an InMemoryCatalog, creating the same table from 
> a different SparkSession fails with the error below:
> {code:java}
> sparkSession.sql("create table if not exists t1 (id Int) using orc")
> sparkSession.sql("insert into t1 values(1)")
> sparkSession2.sql("create table if not exists t1 (id Int) using orc") // fails with the error below
> {code}
>  
> {code:java}
> Can not create the managed table('`default`.`t1`'). The associated 
> location('file:***/t1') already exists.
> {code}


