[
https://issues.apache.org/jira/browse/SPARK-41299?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17640479#comment-17640479
]
André F. commented on SPARK-41299:
----------------------------------
The OOM happens before I can see the query plan on the UI. Is there another way
to obtain it?
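One way to capture the plan without the UI is to call `explain` on the filtered DataFrame, which prints the parsed, analyzed, optimized, and physical plans without running an action (a sketch; `df` and the `date` column are taken from the report above):
{code:scala}
import org.apache.spark.sql.functions.last_day
// Prints all plan stages to stdout; no job is executed, so it works
// even when running the query itself would OOM the executors.
df.where($"date" === last_day($"date")).explain(true)
{code}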
> OOM when filter pushdown `last_day` function
> --------------------------------------------
>
> Key: SPARK-41299
> URL: https://issues.apache.org/jira/browse/SPARK-41299
> Project: Spark
> Issue Type: Bug
> Components: Optimizer
> Affects Versions: 3.3.1
> Environment: Spark 3.3.1
> JDK 8 (openjdk version "1.8.0_352")
> Reporter: André F.
> Priority: Major
>
> Using the following transformation on Spark 3.3.1:
> {code:java}
> df.where($"date" === last_day($"date")) {code}
> where `df` is a DataFrame created from a set of Parquet files. I'm trying to
> keep only the rows whose `date` falls on the last day of its month.
> Executors are dying with the following error:
> {code:java}
> java.lang.OutOfMemoryError: GC overhead limit exceeded
> at java.util.regex.Pattern.compile(Pattern.java:1722) ~[?:1.8.0_252]
> at java.util.regex.Pattern.<init>(Pattern.java:1352) ~[?:1.8.0_252]
> at java.util.regex.Pattern.compile(Pattern.java:1028) ~[?:1.8.0_252] {code}
> With the predicate pushdown rule *disabled*, the job works normally.
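> A sketch of two ways to disable the pushdown (assuming the Parquet data
> source; the optimizer rule name is as of Spark 3.3.x):
> {code:scala}
> // Disable filter pushdown for the Parquet source only:
> spark.conf.set("spark.sql.parquet.filterPushdown", "false")
> // Or exclude the predicate-pushdown optimizer rule entirely:
> spark.conf.set("spark.sql.optimizer.excludedRules",
>   "org.apache.spark.sql.catalyst.optimizer.PushDownPredicates")
> {code}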
> The same job works normally on Spark 3.3.0, and I couldn't reproduce the
> failure with other date functions.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)