itholic commented on code in PR #46215:
URL: https://github.com/apache/spark/pull/46215#discussion_r1580282285


##########
python/pyspark/errors/utils.py:
##########
@@ -197,6 +197,16 @@ def with_origin_to_class(cls: Type[T]) -> Type[T]:
     """
     if os.environ.get("PYSPARK_PIN_THREAD", "true").lower() == "true":
         for name, method in cls.__dict__.items():
-            if callable(method) and name != "__init__":
+            # Excluding Python magic methods that do not utilize JVM functions.
+            if callable(method) and name not in (
+                "__init__",
+                "__new__",
+                "__getattr__",
+                "__getitem__",

Review Comment:
   Seems like all the Column APIs directly call the Spark API themselves, other 
than `__getattr__`. But I think we can just remove it as well to keep 
things consistent.
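
   For context, a minimal sketch of the pattern under discussion: a class decorator that wraps every callable attribute except an exclusion list of magic methods. This is an illustrative stand-in, not the actual PySpark implementation; the `_with_origin` wrapper and the `Column` class below are hypothetical, standing in for PySpark's origin-capturing logic.

```python
import functools
from typing import Type, TypeVar

T = TypeVar("T")


def _with_origin(func):
    # Hypothetical wrapper standing in for PySpark's origin capture.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # The real code would record the user-code call site here
        # before delegating to the original method.
        return func(*args, **kwargs)

    wrapper._wrapped = True  # marker so wrapping can be verified
    return wrapper


def with_origin_to_class(cls: Type[T]) -> Type[T]:
    # Magic methods that do not utilize JVM functions are left unwrapped,
    # mirroring the exclusion list in the diff above.
    excluded = ("__init__", "__new__", "__getattr__", "__getitem__")
    for name, method in list(cls.__dict__.items()):
        if callable(method) and name not in excluded:
            setattr(cls, name, _with_origin(method))
    return cls


@with_origin_to_class
class Column:  # hypothetical example class
    def __init__(self, value):
        self.value = value

    def __getitem__(self, key):  # excluded: stays unwrapped
        return self.value[key]

    def alias(self, name):  # not excluded: gets wrapped
        return (name, self.value)
```

   With this sketch, `Column.alias` carries the wrapper while `Column.__getitem__` does not, which is the behavior the exclusion list is meant to produce.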



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

