HeartSaVioR commented on a change in pull request #27158: [MINOR][BUILD][PYSPARK] Disable test_memory_limit test temporarily
URL: https://github.com/apache/spark/pull/27158#discussion_r365044172
 
 

 ##########
 File path: python/pyspark/tests/test_worker.py
 ##########
 @@ -175,13 +175,12 @@ def test_reuse_worker_of_parallelize_xrange(self):
         for pid in current_pids:
             self.assertTrue(pid in previous_pids)
 
-
 @unittest.skipIf(
     not has_resource_module,
     "Memory limit feature in Python worker is dependent on "
     "Python's 'resource' module; however, not found.")
 class WorkerMemoryTest(PySparkTestCase):
-
+    @unittest.skip("disabled temporarily since it's failing consistently")
     def test_memory_limit(self):
         self.sc._conf.set("spark.executor.pyspark.memory", "1m")
 
 Review comment:
   OK let me do it instead. Let's modify the PR title/description after confirming it works.
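
   For context, a rough sketch of what the skipped test exercises: the worker caps its own address space with Python's standard 'resource' module, which is why the whole class is guarded by skipIf when that module is unavailable. The snippet below is illustrative only (the helper name set_memory_limit is made up here), not the actual PySpark worker code:

   # Illustrative sketch only -- not the exact PySpark worker implementation.
   # It shows how a per-worker memory cap could be applied with Python's
   # 'resource' module, the dependency checked by the skipIf guard above.
   try:
       import resource
       has_resource_module = True
   except ImportError:
       # e.g. Windows, where the 'resource' module does not exist
       has_resource_module = False

   def set_memory_limit(limit_mb):
       """Best-effort address-space cap; a no-op where 'resource' is missing."""
       if not has_resource_module or limit_mb <= 0:
           return
       new_limit = limit_mb * 1024 * 1024
       soft, hard = resource.getrlimit(resource.RLIMIT_AS)
       # Only tighten the limit; never attempt to raise it above the current soft limit.
       if soft == resource.RLIM_INFINITY or new_limit < soft:
           resource.setrlimit(resource.RLIMIT_AS, (new_limit, new_limit))

   For example, set_memory_limit(1) would roughly mirror the "1m" value the test assigns to spark.executor.pyspark.memory.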

