[jira] [Commented] (SPARK-36173) [CORE] Support getting CPU number in TaskContext

2021-08-04 Thread Apache Spark (Jira)


[ https://issues.apache.org/jira/browse/SPARK-36173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17393578#comment-17393578 ]

Apache Spark commented on SPARK-36173:
--

User 'sarutak' has created a pull request for this issue:
https://github.com/apache/spark/pull/33645

> [CORE] Support getting CPU number in TaskContext
> 
>
> Key: SPARK-36173
> URL: https://issues.apache.org/jira/browse/SPARK-36173
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 3.1.2
>Reporter: Xiaochang Wu
>Assignee: Xiaochang Wu
>Priority: Major
> Fix For: 3.3.0
>
>
> In stage-level resource scheduling, the allocated third-party resources can be
> obtained from TaskContext via the resources() interface; however, there is no API
> to get how many CPUs are allocated to the task. This adds a cpus() interface
> to TaskContext in addition to resources().
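For illustration only, a minimal Scala sketch of how a task might read the
allocated CPU count alongside resources(); it assumes a Spark version where
TaskContext.cpus() is available (3.3.0, per the Fix Version above) and an
existing SparkContext named sc, e.g. in spark-shell:

    import org.apache.spark.TaskContext

    // Inside a task: cpus() reports how many CPUs were allocated to this task,
    // and resources() exposes any third-party resources (e.g. GPUs) assigned via
    // stage-level scheduling. The "gpu" key here is only an example resource name.
    val taskInfo = sc.parallelize(1 to 4, 2).map { _ =>
      val ctx  = TaskContext.get()
      val cpus = ctx.cpus()                       // CPUs allocated to this task
      val gpus = ctx.resources().get("gpu")       // Option[ResourceInformation]
        .map(_.addresses.mkString(","))
        .getOrElse("none")
      s"partition=${ctx.partitionId()} cpus=$cpus gpus=$gpus"
    }.collect().distinct

    taskInfo.foreach(println)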



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-36173) [CORE] Support getting CPU number in TaskContext

2021-07-15 Thread Apache Spark (Jira)


[ https://issues.apache.org/jira/browse/SPARK-36173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17381741#comment-17381741 ]

Apache Spark commented on SPARK-36173:
--

User 'xwu99' has created a pull request for this issue:
https://github.com/apache/spark/pull/33385

> [CORE] Support getting CPU number in TaskContext
> 
>
> Key: SPARK-36173
> URL: https://issues.apache.org/jira/browse/SPARK-36173
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 3.1.2
>Reporter: Xiaochang Wu
>Priority: Major
>
> In stage-level resource scheduling, the allocated third-party resources can be
> obtained from TaskContext via the resources() interface; however, there is no API
> to get how many CPUs are allocated to the task. This adds a cpus() interface
> to TaskContext in addition to resources().


