[
https://issues.apache.org/jira/browse/SPARK-44884?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Dipayan Dev updated SPARK-44884:
--------------------------------
Attachment: image-2023-08-20-18-46-53-342.png
> Spark doesn't create SUCCESS file when external path is passed
> --------------------------------------------------------------
>
> Key: SPARK-44884
> URL: https://issues.apache.org/jira/browse/SPARK-44884
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 3.3.0
> Reporter: Dipayan Dev
> Priority: Critical
> Attachments: image-2023-08-20-18-08-38-531.png,
> image-2023-08-20-18-46-53-342.png
>
>
> This issue does not occur in Spark 2.x (I am using 2.4.0), but it does in 3.3.0.
> Code to reproduce the issue:
>
> {code:java}
> scala> spark.conf.set("spark.sql.orc.char.enabled", true)
> scala> val DF = Seq(("test1", 123)).toDF("name", "num")
> scala> DF.write.option("path", "gs://test_dd123/").mode(SaveMode.Overwrite).partitionBy("num").format("orc").saveAsTable("test_schema.table_name")
> 23/08/20 12:31:43 WARN SessionState: METASTORE_FILTER_HOOK will be ignored,
> since hive.security.authorization.manager is set to instance of
> HiveAuthorizerFactory. {code}
> The above code succeeds and creates the external Hive table, but {*}no
> SUCCESS file is generated{*}. The same code, when run on Spark 2.4.0,
> generates a SUCCESS file.
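> To confirm the marker is missing, a minimal check from the same spark-shell session (a sketch, assuming the bucket path from the reproduction above and the Hadoop FileSystem API that ships with Spark):
>
> {code:scala}
> scala> import org.apache.hadoop.fs.Path
> scala> // Resolve the FileSystem for the external path used above
> scala> val fs = new Path("gs://test_dd123/").getFileSystem(spark.sparkContext.hadoopConfiguration)
> scala> // Expected true on 2.4.0, but returns false on 3.3.0 per this report
> scala> fs.exists(new Path("gs://test_dd123/_SUCCESS"))
> {code}
>
> Note that marker creation is governed by the Hadoop committer setting {{mapreduce.fileoutputcommitter.marksuccessfuljobs}} (default true), so verifying it has not been disabled in the 3.3.0 environment may help narrow down the regression.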
> Adding the content of the bucket after table creation:
>
> !image-2023-08-20-18-08-38-531.png|width=453,height=162!
>
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]