Github user a-johnston commented on the issue:
https://github.com/apache/spark/pull/21180
+1 for fixing this. A workaround (in this case for the example given on
Jira) has been something like the following:
```python
from collections import namedtuple

# Assumed definition of Point, as in the Jira example.
Point = namedtuple("Point", ["x", "y"])


class SparkTuple:
    def __reduce__(self):
        # Reconstruct via the concrete subclass, not the base namedtuple,
        # so methods defined on the subclass survive serialization.
        return self.__class__, tuple(self)


class PointSubclass(SparkTuple, Point):
    def sum(self):
        return self.x + self.y
```
This effectively marks classes by hand to opt out of the hack, but it would
be great to avoid that if possible, since the existing behavior is really
unexpected.
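To illustrate, here is a minimal round trip with plain `pickle` outside PySpark (the classes are repeated so the snippet is self-contained, and `Point` is an assumed namedtuple matching the Jira example). It shows what the `__reduce__` override returns: the concrete subclass plus its field values, so the subclass and its methods survive deserialization:

```python
import pickle
from collections import namedtuple

# Assumed definition of Point, as in the Jira example.
Point = namedtuple("Point", ["x", "y"])


class SparkTuple:
    def __reduce__(self):
        # (callable, args): pickle will call PointSubclass(1, 2) on load,
        # preserving the subclass rather than collapsing to the namedtuple.
        return self.__class__, tuple(self)


class PointSubclass(SparkTuple, Point):
    def sum(self):
        return self.x + self.y


p = pickle.loads(pickle.dumps(PointSubclass(1, 2)))
print(type(p).__name__, p.sum())  # → PointSubclass 3
```

Note this only demonstrates the mechanism; the bug itself needs PySpark's namedtuple hijack on the serializing side to reproduce.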