itholic commented on code in PR #39882:
URL: https://github.com/apache/spark/pull/39882#discussion_r1099675341
##########
python/pyspark/sql/tests/connect/test_parity_column.py:
##########
@@ -32,7 +32,7 @@
class ColumnParityTests(ColumnTestsMixin, ReusedConnectTestCase):
- # TODO(SPARK-42017): Different error type AnalysisException vs SparkConnectAnalysisException
+ # TODO(SPARK-42017): df["bad_key"] does not raise AnalysisException
Review Comment:
+1
##########
python/pyspark/sql/streaming/query.py:
##########
@@ -387,7 +390,7 @@ def exception(self) -> Optional[StreamingQueryException]:
je = self._jsq.exception().get()
msg = je.toString().split(": ", 1)[1]  # Drop the Java StreamingQueryException type info
stackTrace = "\n\t at ".join(map(lambda x: x.toString(), je.getStackTrace()))
- return StreamingQueryException(msg, stackTrace, je.getCause())
+ return CapturedStreamingQueryException(msg, stackTrace, je.getCause())
Review Comment:
Just to make sure I understand: this will still raise `StreamingQueryException` in
user space, right?
I ask because we don't use such an alias for connect exceptions; for example:
```python
from pyspark.errors.exceptions.connect import (
AnalysisException as ConnectAnalysisException
)
```
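To illustrate the expected behavior (a minimal sketch with stand-in classes, not the real `pyspark.errors` implementations): assuming `CapturedStreamingQueryException` subclasses `StreamingQueryException`, user code that catches the public type keeps working no matter which concrete subclass is raised:

```python
# Stand-in classes; the real ones live in pyspark.errors and
# pyspark.errors.exceptions.captured.
class StreamingQueryException(Exception):
    """Stand-in for the public pyspark.errors.StreamingQueryException."""

class CapturedStreamingQueryException(StreamingQueryException):
    """Stand-in for the JVM-backed subclass returned by query.exception()."""

exc = CapturedStreamingQueryException("query failed")

# The subclass relationship means the public type still matches.
assert isinstance(exc, StreamingQueryException)

try:
    raise exc
except StreamingQueryException as e:  # user code catches the public type
    caught = type(e).__name__

assert caught == "CapturedStreamingQueryException"
```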
##########
python/pyspark/sql/tests/streaming/test_streaming.py:
##########
@@ -254,7 +254,7 @@ def test_stream_exception(self):
self._assert_exception_tree_contains_msg(e, "ZeroDivisionError")
finally:
sq.stop()
- self.assertTrue(type(sq.exception()) is StreamingQueryException)
+ self.assertIsInstance(sq.exception(), StreamingQueryException)
Review Comment:
+1
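As a side note on the assertion change above: `assertIsInstance` keeps passing if `sq.exception()` returns a subclass of `StreamingQueryException`, whereas the strict `type(...) is ...` identity check would fail. A minimal sketch with stand-in class names:

```python
class Base(Exception):
    """Stand-in for StreamingQueryException."""

class Sub(Base):
    """Stand-in for a captured/wrapped subclass."""

e = Sub()

# The exact-type identity check rejects subclasses...
assert type(e) is not Base

# ...while isinstance (what assertIsInstance uses) accepts them.
assert isinstance(e, Base)
```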
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]