[ 
https://issues.apache.org/jira/browse/SPARK-50768?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot reassigned SPARK-50768:
--------------------------------------

    Assignee:     (was: Apache Spark)

> Potential file stream leaks caused by task thread interruption
> --------------------------------------------------------------
>
>                 Key: SPARK-50768
>                 URL: https://issues.apache.org/jira/browse/SPARK-50768
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.5.3
>            Reporter: wuyi
>            Priority: Major
>              Labels: pull-request-available
>
> SPARK-49980 tried to fix this kind of issue by applying 
> {code:java}
> def tryInitializeResource[R <: Closeable, T](createResource: => 
> R)(initialize: R => T): T = {
>   val resource = createResource
>   try {
>     initialize(resource)
>   } catch {
>     case e: Throwable =>
>       resource.close()
>       throw e
>   }
> } {code}
> when creating resources. However, this utility function has an issue: 
> `resource` may not be completely initialized if an interruption happens 
> during `initialize(resource)`. In that case, `resource.close()` cannot 
> fully clean up the intermediate resources created underneath, which leads 
> to a resource leak. For example,
>  
> {code:java}
> class MyResource {
>   private var underlyingResource: UnderlyingResource = null
>
>   def initialize(): Unit = {
>     val res = createUnderlyingResource()
>     res.open()
>     // If an interruption happens before the assignment to
>     // underlyingResource below, res is leaked: close() only
>     // sees null and cannot close it.
>     res.init()
>     underlyingResource = res
>   }
>
>   def close(): Unit = {
>     if (underlyingResource != null) {
>       underlyingResource.close()
>     }
>   }
> }{code}
>  
>  
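> One way to make such initialization interruption-safe (a sketch of the general 
> pattern, not the actual patch attached to this issue) is to guard each 
> intermediate resource with its own close-on-failure block, so a partially 
> built resource is closed even if the assignment to the field never happens:
> {code:java}
> def initialize(): Unit = {
>   val res = createUnderlyingResource()
>   try {
>     res.open()
>     // An interrupt (or any Throwable) raised here no longer leaks res:
>     // the catch block below closes it before propagating the error.
>     res.init()
>   } catch {
>     case e: Throwable =>
>       res.close()
>       throw e
>   }
>   underlyingResource = res
> }{code}
> This mirrors the `tryInitializeResource` utility itself, but applied one 
> level deeper, to the resource created inside `initialize`.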



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
