[ https://issues.apache.org/jira/browse/SPARK-26369?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16721010#comment-16721010 ]
Fu Chen commented on SPARK-26369:
---------------------------------

OK, thank you for the reply. I found a solution in [SPARK-20589|https://issues.apache.org/jira/browse/SPARK-20589].

> How to limit Spark concurrent tasks number in one job?
> ------------------------------------------------------
>
>                 Key: SPARK-26369
>                 URL: https://issues.apache.org/jira/browse/SPARK-26369
>             Project: Spark
>          Issue Type: Question
>          Components: Scheduler
>    Affects Versions: 2.1.0, 2.2.0, 2.3.2, 2.4.0
>            Reporter: Fu Chen
>            Priority: Major
>
> Hi all,
> Would it be possible to make the fair scheduler pools pluggable, so that we
> could implement our own SchedulingAlgorithm? In our case, we want to limit
> the maximum number of tasks in a job that loads data from a MySQL database:
> if we set a large executor.number * cores.number, it triggers an alarm.
> Or can we achieve this some other way?

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
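For readers landing on this thread: without a custom scheduling algorithm, the usual workaround is to bound the resources (and hence the concurrent task slots) available to the job, and to cap the parallelism of the JDBC read itself. The fragment below is a minimal sketch of standard Spark configuration properties; the specific values are illustrative assumptions, not recommendations.

```properties
# spark-defaults.conf (or --conf flags on spark-submit)

# Concurrent task slots = executors * cores per executor,
# so capping both bounds the job's maximum task concurrency.
spark.dynamicAllocation.enabled=true
spark.dynamicAllocation.maxExecutors=4
spark.executor.cores=2

# For the MySQL load specifically, the JDBC source's numPartitions
# option limits how many partitions (and thus concurrent connections)
# the read produces, independently of cluster size.
# e.g. spark.read.format("jdbc").option("numPartitions", "8")...
```

With these settings, at most 4 * 2 = 8 tasks run at once cluster-wide, regardless of how many partitions the stage has; the remaining tasks queue until a slot frees up.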