[ https://issues.apache.org/jira/browse/SPARK-39897?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17832923#comment-17832923 ]

Vladimir Prus commented on SPARK-39897:
---------------------------------------

FWIW, we ran into this problem with Spark 3.5 in production code. I don't
know how to reproduce it reliably.

 

That said, is there any case where recursively calling allocatePage, with no
delay between attempts, could ever succeed?
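To illustrate the concern: each failed attempt that re-invokes allocatePage adds a stack frame, so enough consecutive failures overflow the stack regardless of whether the allocation could eventually succeed. The names below are hypothetical and not Spark's actual API; this is just a minimal sketch of replacing an unbounded recursive retry with a bounded loop, which keeps stack depth constant and fails deterministically after a retry limit:

```java
import java.util.concurrent.atomic.AtomicInteger;

class AllocRetrySketch {
    // Counts allocation attempts across the simulated allocator.
    static final AtomicInteger attempts = new AtomicInteger();

    // Simulated allocator: fails `failuresBeforeSuccess` times (returning -1),
    // then succeeds. Stands in for an allocation that can succeed only after
    // other consumers spill and free memory.
    static long tryAllocate(int failuresBeforeSuccess) {
        if (attempts.incrementAndGet() <= failuresBeforeSuccess) {
            return -1; // allocation failed; caller may retry
        }
        return 42; // pretend this is a valid page address
    }

    // Iterative retry with an upper bound: the stack depth stays constant,
    // whereas a recursive retry grows the stack by one frame per failure.
    static long allocateWithRetry(int failuresBeforeSuccess, int maxRetries) {
        for (int i = 0; i <= maxRetries; i++) {
            long page = tryAllocate(failuresBeforeSuccess);
            if (page >= 0) {
                return page;
            }
        }
        throw new IllegalStateException(
            "could not allocate a page after " + maxRetries + " retries");
    }
}
```

With the loop, a run of failures costs nothing in stack space, and the retry bound turns an eventual StackOverflowError into an explicit, diagnosable failure.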

> StackOverflowError in TaskMemoryManager
> ---------------------------------------
>
>                 Key: SPARK-39897
>                 URL: https://issues.apache.org/jira/browse/SPARK-39897
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.7
>            Reporter: Andrew Ray
>            Priority: Minor
>
> I have observed the following error that looks to stem from 
> TaskMemoryManager.allocatePage making a recursive call to itself when a page 
> can not be allocated. I'm observing this in Spark 2.4 but since the relevant 
> code is still the same in master this is likely still a potential point of 
> failure in current versions. Prioritizing this as minor as this looks to be a 
> very uncommon outcome as I can not find any other reports of a similar nature.
> {code:java}
> Py4JJavaError: An error occurred while calling o625.saveAsTable.
> : org.apache.spark.SparkException: Job aborted.
>     at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:198)
>     at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:170)
>     at org.apache.spark.sql.execution.datasources.DataSource.writeAndRead(DataSource.scala:503)
>     at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.saveDataIntoTable(createDataSourceTables.scala:217)
>     at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:177)
>     at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
>     at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
>     at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
>     at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:131)
>     at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:155)
>     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>     at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
>     at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
>     at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
>     at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
>     at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:676)
>     at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
>     at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
>     at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
>     at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
>     at org.apache.spark.sql.DataFrameWriter.createTable(DataFrameWriter.scala:474)
>     at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:453)
>     at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:409)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>     at py4j.Gateway.invoke(Gateway.java:282)
>     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>     at py4j.commands.CallCommand.execute(CallCommand.java:79)
>     at py4j.GatewayConnection.run(GatewayConnection.java:238)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.StackOverflowError
>     at java.util.concurrent.ConcurrentHashMap.putVal(ConcurrentHashMap.java:1012)
>     at java.util.concurrent.ConcurrentHashMap.putIfAbsent(ConcurrentHashMap.java:1535)
>     at java.lang.ClassLoader.getClassLoadingLock(ClassLoader.java:457)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:398)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>     at java.util.ResourceBundle$RBClassLoader.loadClass(ResourceBundle.java:512)
>     at java.util.ResourceBundle$Control.newBundle(ResourceBundle.java:2657)
>     at java.util.ResourceBundle.loadBundle(ResourceBundle.java:1518)
>     at java.util.ResourceBundle.findBundle(ResourceBundle.java:1482)
>     at java.util.ResourceBundle.findBundle(ResourceBundle.java:1436)
>     at java.util.ResourceBundle.findBundle(ResourceBundle.java:1436)
>     at java.util.ResourceBundle.getBundleImpl(ResourceBundle.java:1370)
>     at java.util.ResourceBundle.getBundle(ResourceBundle.java:899)
>     at sun.util.resources.LocaleData$1.run(LocaleData.java:167)
>     at sun.util.resources.LocaleData$1.run(LocaleData.java:163)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at sun.util.resources.LocaleData.getBundle(LocaleData.java:163)
>     at sun.util.resources.LocaleData.getDateFormatData(LocaleData.java:127)
>     at java.text.DateFormatSymbols.initializeData(DateFormatSymbols.java:710)
>     at java.text.DateFormatSymbols.<init>(DateFormatSymbols.java:145)
>     at sun.util.locale.provider.DateFormatSymbolsProviderImpl.getInstance(DateFormatSymbolsProviderImpl.java:85)
>     at java.text.DateFormatSymbols.getProviderInstance(DateFormatSymbols.java:364)
>     at java.text.DateFormatSymbols.getInstance(DateFormatSymbols.java:340)
>     at java.util.Calendar.getDisplayName(Calendar.java:2110)
>     at java.text.SimpleDateFormat.subFormat(SimpleDateFormat.java:1125)
>     at java.text.SimpleDateFormat.format(SimpleDateFormat.java:966)
>     at java.text.SimpleDateFormat.format(SimpleDateFormat.java:936)
>     at java.text.DateFormat.format(DateFormat.java:345)
>     at org.apache.log4j.helpers.PatternParser$DatePatternConverter.convert(PatternParser.java:443)
>     at org.apache.log4j.helpers.PatternConverter.format(PatternConverter.java:65)
>     at org.apache.log4j.PatternLayout.format(PatternLayout.java:506)
>     at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:310)
>     at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
>     at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
>     at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
>     at org.apache.log4j.Category.callAppenders(Category.java:206)
>     at org.apache.log4j.Category.forcedLog(Category.java:391)
>     at org.apache.log4j.Category.log(Category.java:856)
>     at org.slf4j.impl.Log4jLoggerAdapter.warn(Log4jLoggerAdapter.java:421)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:304)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:312)
>     at org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:...
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
