tgravescs commented on a change in pull request #28085: 
[SPARK-29641][PYTHON][CORE] Stage Level Sched: Add python api's and tests
URL: https://github.com/apache/spark/pull/28085#discussion_r404215247
 
 

 ##########
 File path: python/pyspark/rdd.py
 ##########
 @@ -2483,6 +2485,32 @@ def _is_barrier(self):
         """
         return self._jrdd.rdd().isBarrier()
 
+    def withResources(self, profile):
+        """
+        .. note:: Experimental
+
+        Specify a ResourceProfile to use when calculating this RDD. This is only supported on
+        certain cluster managers and currently requires dynamic allocation to be enabled.
+        It will result in new executors with the resources specified being acquired to
+        calculate the RDD.
+
+        .. versionadded:: 3.1.0
+        """
+        self.has_resourceProfile = True
+        self._jrdd.withResources(profile._jResourceProfile)
+        return self
+
+    def getResourceProfile(self):
 
 Review comment:
   I don't know what you mean; we are going with these APIs. This is only in the master branch, not in branch-3.0. https://issues.apache.org/jira/browse/SPARK-29150 is to make the RDD ones public. At this point the core feature is complete, so making the Python and Java ones public doesn't hurt anything, and they will only be in 3.1.0. Sorry for the confusion there; it's just a matter of timing on when things got merged, since it was across the 3.0 and 3.1 boundary.
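   For reference, here is a minimal usage sketch of these Python APIs. It assumes the `pyspark.resource` builder classes (`ExecutorResourceRequests`, `TaskResourceRequests`, `ResourceProfileBuilder`) as they ended up in the 3.1.0 release, which may differ slightly from this revision of the PR:

```python
# Sketch of stage-level scheduling from Python, assuming the pyspark.resource
# builder classes shipped in 3.1.0 (module layout may differ in this PR revision).
from pyspark import SparkContext
from pyspark.resource import (ExecutorResourceRequests, TaskResourceRequests,
                              ResourceProfileBuilder)

sc = SparkContext(appName="stage-level-scheduling-sketch")

# Describe what each executor and each task should get for this stage.
ereqs = ExecutorResourceRequests().cores(4).memory("6g").resource("gpu", 2)
treqs = TaskResourceRequests().cpus(1).resource("gpu", 1)

# Build an immutable ResourceProfile from the requests.
profile = ResourceProfileBuilder().require(ereqs).require(treqs).build

# Attach the profile; new executors matching it are acquired when the RDD is
# computed (requires dynamic allocation on a supported cluster manager).
rdd = sc.parallelize(range(100), 4).withResources(profile)
print(rdd.getResourceProfile())  # inspect the profile attached to this RDD
```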
