[ https://issues.apache.org/jira/browse/SPARK-33032?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang resolved SPARK-33032.
---------------------------------
    Resolution: Not A Bug

It is not a bug: the failure no longer occurs after increasing driver memory to 20GB.

> Should throw SparkException if broadcast large table
> -----------------------------------------------------
>
>                 Key: SPARK-33032
>                 URL: https://issues.apache.org/jira/browse/SPARK-33032
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> Spark 3.1:
> {noformat}
> 20/09/29 17:04:40 WARN TaskMemoryManager: Failed to allocate a page (16777216 bytes), try again.
> 20/09/29 17:04:45 WARN TaskMemoryManager: Failed to allocate a page (16777216 bytes), try again.
> {noformat}
> Spark 3.0.1:
> {noformat}
> Caused by: org.apache.spark.SparkException: Cannot broadcast the table that is larger than 8GB: 10 GB
>     at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec.$anonfun$relationFuture$1(BroadcastExchangeExec.scala:145)
>     at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withThreadLocalCaptured$1(SQLExecution.scala:182)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
> {noformat}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
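For reference, the resolution above (raising driver memory so the ~10 GB broadcast relation fits on the driver) can be expressed as a `spark-submit` invocation. This is a hedged sketch, not taken from the ticket: `com.example.MyJob` and `my-job.jar` are hypothetical placeholders, and disabling `spark.sql.autoBroadcastJoinThreshold` is a common alternative (forcing a shuffle join instead of a broadcast), not something the reporter stated.

```shell
# Sketch of two possible workarounds (placeholders: com.example.MyJob, my-job.jar).
# --driver-memory 20g: the fix noted in the resolution -- give the driver
#   enough heap to build and hold the broadcast relation.
# spark.sql.autoBroadcastJoinThreshold=-1: alternative -- disable automatic
#   broadcast joins entirely so the large table is shuffle-joined instead.
spark-submit \
  --class com.example.MyJob \
  --driver-memory 20g \
  --conf spark.sql.autoBroadcastJoinThreshold=-1 \
  my-job.jar
```

Both settings are standard Spark configuration; which one is appropriate depends on whether the broadcast join is actually wanted for the query in question.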