[ https://issues.apache.org/jira/browse/SPARK-39348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17686390#comment-17686390 ]

Wei Guo commented on SPARK-39348:
---------------------------------

After PR [https://github.com/apache/spark/pull/26559], this legacy option has been removed:
 * Since Spark 2.4, creating a managed table with a nonempty location is not 
allowed. An exception is thrown when attempting to create a managed table with 
a nonempty location. Setting 
{{spark.sql.legacy.allowCreatingManagedTableUsingNonemptyLocation}} to 
{{true}} restores the previous behavior. This option will be removed in Spark 3.0.
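
For Spark 2.4.x only, the legacy flag could be enabled on the session before the write; a minimal sketch (the flag is assumed to be settable at runtime via {{spark.conf.set}}, and it no longer exists in Spark 3.0+):
{code:scala}
// Spark 2.4.x only: allow creating a managed table over a nonempty location.
// This legacy escape hatch was removed in Spark 3.0 (see the PR above).
spark.conf.set(
  "spark.sql.legacy.allowCreatingManagedTableUsingNonemptyLocation", "true")

df.write.mode(org.apache.spark.sql.SaveMode.Overwrite)
  .saveAsTable("testdb.testtable")
{code}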

> Create table in overwrite mode fails when interrupted
> -----------------------------------------------------
>
>                 Key: SPARK-39348
>                 URL: https://issues.apache.org/jira/browse/SPARK-39348
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 3.1.1
>            Reporter: Max
>            Priority: Major
>
> When you attempt to rerun an Apache Spark write operation by cancelling the 
> currently running job, the following error occurs:
> {code:java}
> Error: org.apache.spark.sql.AnalysisException: Cannot create the managed 
> table('`testdb`.`testtable`').
> The associated location 
> ('dbfs:/user/hive/warehouse/testdb.db/metastore_cache_testtable') already 
> exists.;{code}
> This problem can occur if:
>  * The cluster is terminated while a write operation is in progress.
>  * A temporary network issue occurs.
>  * The job is interrupted.
> You can reproduce the problem by following these steps:
> 1. Create a DataFrame:
> {code:java}
> val df = spark.range(1000){code}
> 2. Write the DataFrame to a location in overwrite mode:
> {code:java}
> df.write.mode(SaveMode.Overwrite).saveAsTable("testdb.testtable"){code}
> 3. Cancel the command while it is executing.
> 4. Re-run the {{write}} command.
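> The failure can be cleared by removing the stale table metadata and the 
> leftover warehouse directory before re-running the write. A sketch, assuming 
> a Databricks environment ({{dbutils.fs.rm}} is Databricks-specific, since the 
> path in the error is on DBFS):
> {code:scala}
> // Drop the half-created table from the metastore, if it was registered
> spark.sql("DROP TABLE IF EXISTS testdb.testtable")
>
> // Remove the leftover location (Databricks-specific helper; on plain Spark,
> // delete the directory with the Hadoop FileSystem API instead)
> dbutils.fs.rm("dbfs:/user/hive/warehouse/testdb.db/testtable", recurse = true)
>
> // Re-run the original write
> df.write.mode(SaveMode.Overwrite).saveAsTable("testdb.testtable")
> {code}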



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
