Jungtaek Lim created SPARK-30480:
------------------------------------

             Summary: Pyspark test "test_memory_limit" fails consistently
                 Key: SPARK-30480
                 URL: https://issues.apache.org/jira/browse/SPARK-30480
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 3.0.0
            Reporter: Jungtaek Lim


I'm seeing consistent PySpark test failures on multiple PRs 
([#26955|https://github.com/apache/spark/pull/26955], 
[#26201|https://github.com/apache/spark/pull/26201], 
[#27064|https://github.com/apache/spark/pull/27064]), and they all fail at 
"test_memory_limit".

[https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/116422/testReport]

[https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/116438/testReport]

[https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/116429/testReport]

[https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/116366/testReport]
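For context, "test_memory_limit" exercises the PySpark worker memory cap that backs `spark.executor.pyspark.memory`, which the Python worker applies via the stdlib `resource` module. A minimal sketch of the mechanism involved (the helper name and the MiB-to-bytes conversion are illustrative, not Spark's exact code; the failure mode noted in the comment is an assumption, not a confirmed diagnosis):

```python
import resource

def pyspark_memory_to_rlimit(limit_mib: int) -> int:
    """Convert a memory cap in MiB (as spark.executor.pyspark.memory is
    expressed) to the byte value that would be handed to
    resource.setrlimit(resource.RLIMIT_AS, ...)."""
    return limit_mib * 1024 * 1024

# Reading the current address-space limit is safe on any POSIX platform;
# actually lowering RLIMIT_AS is platform-sensitive (e.g. unsupported on
# macOS), which is one plausible source of environment-dependent failures
# in a test like this.
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
print(pyspark_memory_to_rlimit(2))  # 2 MiB -> 2097152 bytes
```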

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
