MaxGekk commented on issue #27166: [SPARK-30482][SQL][CORE][TESTS] Add sub-class of `AppenderSkeleton` reusable in tests
URL: https://github.com/apache/spark/pull/27166#issuecomment-573388568
 
 
   Not sure how this failure is related to my changes:
   ```
    ======================================================================
    ERROR: test_memory_limit (pyspark.tests.test_worker.WorkerMemoryTest)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/tests/test_worker.py", line 193, in test_memory_limit
        actual = rdd.map(lambda _: getrlimit()).collect()
      File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/rdd.py", line 889, in collect
        sock_info = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
      File "/home/jenkins/workspace/SparkPullRequestBuilder/python/lib/py4j-0.10.8.1-src.zip/py4j/java_gateway.py", line 1286, in __call__
        answer, self.gateway_client, self.target_id, self.name)
      File "/home/jenkins/workspace/SparkPullRequestBuilder/python/lib/py4j-0.10.8.1-src.zip/py4j/protocol.py", line 328, in get_return_value
        format(target_id, ".", name), value)
    Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
    : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, amp-jenkins-worker-05.amp, executor driver): org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)
   ```
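
   For context, the failing `test_memory_limit` collects the resource limits observed inside each Python worker (via `resource.getrlimit`), presumably to compare them against the configured PySpark worker memory limit. A minimal standalone sketch of that idea, assuming local mode and an illustrative `spark.executor.pyspark.memory` value (this is not the actual test code from `test_worker.py`), could look like:
   ```python
   import resource

   from pyspark import SparkConf, SparkContext

   # Hypothetical reproduction sketch; the config value and partition count
   # are assumptions, not taken from the real WorkerMemoryTest.
   conf = SparkConf().set("spark.executor.pyspark.memory", "2g")
   sc = SparkContext(master="local[2]", appName="worker-memory-check", conf=conf)

   def worker_rlimit(_):
       # Report the (soft, hard) address-space limit seen inside the Python worker.
       return resource.getrlimit(resource.RLIMIT_AS)

   # Each task returns the limits it sees; collect them on the driver.
   limits = sc.parallelize(range(2), 2).map(worker_rlimit).collect()
   print(limits)
   sc.stop()
   ```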
