[
https://issues.apache.org/jira/browse/SPARK-42650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17696070#comment-17696070
]
XiDuo You commented on SPARK-42650:
-----------------------------------
To be clear, this issue only affects Spark 3.2.3; Spark 3.2.1, 3.3.x, and master
are fine.
It can be reproduced with:
{code:java}
CREATE TABLE IF NOT EXISTS spark32_overwrite(amt1 int) STORED AS ORC;
CREATE TABLE IF NOT EXISTS spark32_overwrite2(amt1 long) STORED AS ORC;
INSERT OVERWRITE TABLE spark32_overwrite2 select 6000044164;
set spark.sql.ansi.enabled=true;
INSERT OVERWRITE TABLE spark32_overwrite select amt1 from (select cast(amt1 as
int) as amt1 from spark32_overwrite2 distribute by amt1);
{code}
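For context on why the repro fails: the literal 6000044164 does not fit in a 32-bit int, and with spark.sql.ansi.enabled=true the cast raises a runtime error instead of silently wrapping, so the INSERT OVERWRITE aborts mid-write. A minimal sketch of the two cast behaviors in plain Python (no Spark; the wrap/raise logic here is an illustration of the general int32 semantics, not Spark's actual implementation):

```python
# Bounds of a 32-bit signed integer (Spark SQL's INT type)
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

value = 6000044164  # the literal inserted in the repro

# Non-ANSI cast: out-of-range values wrap via two's-complement truncation
wrapped = ((value - INT32_MIN) % 2**32) + INT32_MIN
print(wrapped)  # 1705076868

# ANSI cast: out-of-range values raise an error instead of wrapping
def ansi_cast_int(v):
    if not (INT32_MIN <= v <= INT32_MAX):
        raise OverflowError(f"Casting {v} to INT causes overflow in ANSI mode")
    return v
```

Under ANSI mode the second behavior applies, which is why the query throws during the overwrite.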
> link issue SPARK-42550
> ----------------------
>
> Key: SPARK-42650
> URL: https://issues.apache.org/jira/browse/SPARK-42650
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.2.3
> Reporter: kevinshin
> Priority: Major
>
> When using
> [KyuubiSparkSQLExtension|https://kyuubi.readthedocs.io/en/v1.6.1-incubating/extensions/engines/spark/]
> and an `INSERT OVERWRITE` statement hits an exception, a non-partitioned table
> loses its home directory, and a partitioned table loses its partition
> directories.
>
> My spark-defaults.conf config:
> spark.sql.extensions org.apache.kyuubi.sql.KyuubiSparkSQLExtension
>
> Because I can't reopen SPARK-42550, for details and reproduction steps please
> see:
> https://issues.apache.org/jira/browse/SPARK-42550
>
--
This message was sent by Atlassian Jira
(v8.20.10#820010)