[ https://issues.apache.org/jira/browse/SPARK-45040?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yangbo Yu updated SPARK-45040:
------------------------------
    Attachment: Screenshot Capture - 2023-08-31 - 19-02-31.png

> Caught Hive MetaException attempting to get partition metadata by filter from 
> Hive
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-45040
>                 URL: https://issues.apache.org/jira/browse/SPARK-45040
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 2.4.0
>         Environment: AWS Glue
>            Reporter: Yangbo Yu
>            Priority: Critical
>         Attachments: Screenshot Capture - 2023-08-31 - 19-02-31.png
>
>
> We are integrating Spark 2.4 with our AWS Glue ETL jobs, and we recently 
> realized that many of our jobs are failing with the error below:
> {{Exception in User Class: java.lang.RuntimeException : Caught Hive 
> MetaException attempting to get partition metadata by filter from Hive. You 
> can set the Spark configuration setting 
> spark.sql.hive.manageFilesourcePartitions to false to work around this 
> problem, however this will result in degraded performance. Please report a 
> bug: https://issues.apache.org/jira/browse/SPARK}}
> The error first appeared on Aug 30th and recurs intermittently: it goes 
> away for several hours and then comes back. While the error is occurring, 
> most of our jobs fail and only a few succeed.
> We tried setting {{spark.sql.hive.manageFilesourcePartitions}} to false, 
> but that did not resolve it; other issues appeared instead.
> Could you look into the error and let me know if there is any workaround 
> to mitigate the issue?
> Let me know if you need anything from my end.
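
For reference, a minimal sketch of how the workaround named in the error
message would typically be applied in a Scala Spark 2.4 job. The app name is
illustrative, and, as the reporter notes above, this setting did not resolve
the problem in their environment:

{code:scala}
import org.apache.spark.sql.SparkSession

// Disable metastore partition management for file source tables, so Spark
// lists partition files itself instead of pushing the partition filter to
// the Hive metastore. The error message warns this degrades performance.
val spark = SparkSession.builder()
  .appName("glue-etl-job") // illustrative name, not from the report
  .config("spark.sql.hive.manageFilesourcePartitions", "false")
  .enableHiveSupport()
  .getOrCreate()
{code}

The same setting can also be passed at submit time with
{{--conf spark.sql.hive.manageFilesourcePartitions=false}}.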


