[ https://issues.apache.org/jira/browse/SPARK-3306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14117820#comment-14117820 ]

Yan commented on SPARK-3306:
----------------------------

I am afraid I was not clear: the resources are meant to be shared across 
different tasks, task sets and task waves, instead of letting each task make 
the connection by itself, which is very inefficient. For that purpose, I feel 
Spark conf is not enough; the executor needs to be enhanced with hooks to 
initialize and stop these long-running resources.
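
As a rough illustration of the sharing I have in mind, here is a minimal 
sketch of the workaround available today: a JVM-wide singleton that every 
task running on an executor reuses. The object name, the JDBC URL and the 
shutdown-hook cleanup are my own placeholders, not a proposed API; the hooks 
described above would make this initialization and shutdown explicit and 
managed by the executor.

{code:scala}
import java.sql.{Connection, DriverManager}

object SharedJdbcResource {
  // Initialized at most once per executor JVM; every task, task set and
  // task wave running on that executor reuses the same connection instead
  // of opening its own.
  lazy val connection: Connection = {
    val conn = DriverManager.getConnection("jdbc:h2:mem:example") // placeholder URL
    // Crude "stop" hook: close the connection when the executor JVM exits.
    sys.addShutdownHook { if (!conn.isClosed) conn.close() }
    conn
  }
}

// Used from inside a task, for example:
// rdd.mapPartitions { iter =>
//   val conn = SharedJdbcResource.connection  // shared, not re-created per task
//   ...
// }
{code}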

> Addition of external resource dependency in executors
> -----------------------------------------------------
>
>                 Key: SPARK-3306
>                 URL: https://issues.apache.org/jira/browse/SPARK-3306
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Yan
>
> Currently, Spark executors only support static, read-only external 
> resources such as side files and jar files. With emerging disparate data 
> sources, there is a need to support more versatile external resources, such 
> as connections to data sources, to facilitate efficient data access to those 
> sources. For one, the JDBCRDD, with some modifications, could benefit from 
> this feature by reusing JDBC connections previously established in the same 
> Spark context.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
