HyukjinKwon commented on a change in pull request #28085: [SPARK-29641][PYTHON][CORE] Stage Level Sched: Add python api's and tests
URL: https://github.com/apache/spark/pull/28085#discussion_r407806089
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/resource/ExecutorResourceRequests.scala
 ##########
 @@ -38,6 +39,8 @@ private[spark] class ExecutorResourceRequests() extends Serializable {
 
   def requests: Map[String, ExecutorResourceRequest] = _executorResources.asScala.toMap
 
+  def requestsJMap: JMap[String, ExecutorResourceRequest] = requests.asJava
 
 Review comment:
   Hm .. I am still hesitant about having separate methods for Java and Scala, in particular because it is legitimate for the Scala side to use a Java API (but not the other way around).
   
   My impression is that Spark APIs in general expose the same or similar usage across languages. There are many examples that use Java classes directly, for instance in Structured Streaming: `StateOperatorProgress`, `StreamingQueryProgress`, etc.
   
   So we should probably keep this consistent with those instances rather than with `TaskContextImpl`. Since `resourceJMap` is marked `Evolving`, it does not seem too late to change it.
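   The interop point above can be sketched as follows. This is a minimal, hypothetical example (the `Requests` class and its contents are illustrative, not the actual `ExecutorResourceRequests` code): a single method returning a Java map is usable from Java directly and from Scala via the standard converters, so a parallel `requestsJMap`-style method is not strictly needed.

```scala
// Sketch of the consistency argument: expose one Java-friendly accessor
// and let Scala callers convert, instead of keeping parallel
// `requests` / `requestsJMap` methods. All names here are illustrative.
import java.util.{Map => JMap}
import scala.jdk.CollectionConverters._

class Requests {
  private val underlying =
    new java.util.concurrent.ConcurrentHashMap[String, Int]()
  underlying.put("cores", 4)

  // A single method returning a Java map is callable as-is from Java ...
  def requests: JMap[String, Int] = underlying
}

// ... and from Scala via the standard converters, with no extra method.
val scalaView: Map[String, Int] = new Requests().requests.asScala.toMap
```

   The trade-off is that Scala callers must call `.asScala` themselves, which is the usage pattern the Structured Streaming examples above already follow.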

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
