zero323 commented on a change in pull request #30985:
URL: https://github.com/apache/spark/pull/30985#discussion_r551922694



##########
File path: python/pyspark/util.py
##########
@@ -323,6 +323,18 @@ def __del__(self):
                         thread_connection.close()
 
 
+class PySparkWarning(UserWarning):
+    """A base class for PySpark warnings."""
+
+
+class PySparkFutureWarning(PySparkWarning, FutureWarning):
+    """A base class for deprecated PySpark APIs."""
+
+
+class PySparkResourceWarning(PySparkWarning, ResourceWarning):
+    """A base class for PySpark warnings related to resource usage."""

Review comment:
       > I see. I think it's okay not to have the base classes for warnings for 
now - I got that it depends on how individuals think though.
   
   Sure, let me adjust the PR.
   
    > If we need some specific warning types, we could define them and directly 
inherit from built-in warning classes (and then expose them as a module or 
package, e.g. 
[pandas.errors](https://github.com/pandas-dev/pandas/blob/master/pandas/errors/__init__.py)).
 Or we could even think about introducing the base classes later, when we have 
enough instances to generalize.
   
    I looked at these as well, but I don't see many use cases for a broader 
warning set within PySpark. We deal almost exclusively with deprecation 
warnings, and the non-critical issues that might justify dedicated warning 
classes are normally handled on the JVM side.
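
    For illustration, here is a minimal sketch of the alternative mentioned in 
the quote above: a dedicated warning type that inherits directly from a 
built-in warning class, with no PySpark-specific base class in between. The 
class and function names are hypothetical, not part of the PR.

    ```python
    import warnings


    # Hypothetical dedicated warning type, inheriting directly from a
    # built-in warning class instead of a PySpark-specific base class.
    class ExampleDeprecationWarning(FutureWarning):
        """Warns about a deprecated API (illustrative only)."""


    def old_api():
        # Hypothetical deprecated function emitting the dedicated warning.
        warnings.warn(
            "old_api is deprecated; use new_api instead.",
            ExampleDeprecationWarning,
            stacklevel=2,
        )


    # Because the class subclasses FutureWarning, callers can still filter
    # on the built-in category without knowing the dedicated type.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        old_api()

    print(issubclass(caught[0].category, FutureWarning))  # True
    ```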




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.