BryanCutler commented on a change in pull request #30985:
URL: https://github.com/apache/spark/pull/30985#discussion_r557743093



##########
File path: python/pyspark/worker.py
##########
@@ -500,7 +501,10 @@ def main(infile, outfile):
 
             except (resource.error, OSError, ValueError) as e:
                 # not all systems support resource limits, so warn instead of failing
-                print("WARN: Failed to set memory limit: {0}\n".format(e), file=sys.stderr)
+                warnings.warn(
+                    "Failed to set memory limit: {0}\n".format(e),
+                    ResourceWarning
+                )

Review comment:
       They are currently sent to stderr, but yeah, I don't believe they will end up in a log by default for a remote worker.
   
   There are also similar print statements above this one; what about those?
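   For reference, a minimal self-contained sketch of the pattern under discussion (not part of the PR diff): the 512 MiB limit, the `ImportError` clause, and the explicit `simplefilter` call are illustrative assumptions rather than what `worker.py` actually does. By default, Python's `warnings.showwarning` hook writes formatted warnings to `sys.stderr`, the same stream the replaced `print()` targeted, but CPython's default filters ignore `ResourceWarning`, so it may need an explicit filter to be visible.
   
   ```python
   import warnings
   
   try:
       # resource is unavailable on some platforms (e.g. Windows), hence ImportError.
       import resource
   
       # Hypothetical limit of 512 MiB, purely for illustration.
       limit = 512 * 1024 * 1024
       resource.setrlimit(resource.RLIMIT_AS, (limit, limit))
   except (ImportError, ValueError, OSError) as e:
       # Not all systems support resource limits, so warn instead of failing.
       # Illustrative assumption: force ResourceWarning to be shown, since the
       # default warning filters would otherwise silence it.
       warnings.simplefilter("always", ResourceWarning)
       warnings.warn("Failed to set memory limit: {0}".format(e), ResourceWarning)
   ```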



