This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.0 by this push:
new d571cbf [SPARK-31849][PYTHON][SQL][FOLLOW-UP] Deduplicate and reuse Utils.exceptionString in Python exception handling
d571cbf is described below
commit d571cbfa46048518bdda5ca0c858c769149dac7d
Author: HyukjinKwon <[email protected]>
AuthorDate: Mon Jun 8 15:18:42 2020 +0900
[SPARK-31849][PYTHON][SQL][FOLLOW-UP] Deduplicate and reuse Utils.exceptionString in Python exception handling
### What changes were proposed in this pull request?
This PR proposes to reuse the existing utility
`org.apache.spark.util.Utils.exceptionString` in place of the duplicated code at:
```python
jwriter = jvm.java.io.StringWriter()
e.printStackTrace(jvm.java.io.PrintWriter(jwriter))
stacktrace = jwriter.toString()
```
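For reference, the change collapses those three Py4J round trips into a single JVM utility call, mirroring the diff below:
```python
# Single round trip: let the JVM render the full stack trace string.
stacktrace = SparkContext._jvm.org.apache.spark.util.Utils.exceptionString(e)
```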
### Why are the changes needed?
To deduplicate code. It also reduces communication between the JVM and Py4J.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Manually tested.
Closes #28749 from HyukjinKwon/SPARK-31849-followup.
Authored-by: HyukjinKwon <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
---
python/pyspark/sql/utils.py | 5 +----
1 file changed, 1 insertion(+), 4 deletions(-)
diff --git a/python/pyspark/sql/utils.py b/python/pyspark/sql/utils.py
index 3fd7047..1dbea12 100644
--- a/python/pyspark/sql/utils.py
+++ b/python/pyspark/sql/utils.py
@@ -97,11 +97,8 @@ class UnknownException(CapturedException):
 def convert_exception(e):
     s = e.toString()
     c = e.getCause()
-    jvm = SparkContext._jvm
-    jwriter = jvm.java.io.StringWriter()
-    e.printStackTrace(jvm.java.io.PrintWriter(jwriter))
-    stacktrace = jwriter.toString()
+    stacktrace = SparkContext._jvm.org.apache.spark.util.Utils.exceptionString(e)
     if s.startswith('org.apache.spark.sql.AnalysisException: '):
         return AnalysisException(s.split(': ', 1)[1], stacktrace, c)
     if s.startswith('org.apache.spark.sql.catalyst.analysis'):
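As a rough illustration of the code path this touches (a minimal sketch, assuming a local SparkSession; `convert_exception` is internal to PySpark and runs when a JVM exception crosses into Python):
```python
from pyspark.sql import SparkSession
from pyspark.sql.utils import AnalysisException

spark = SparkSession.builder.master("local[1]").getOrCreate()
try:
    # Querying a table that does not exist raises an AnalysisException,
    # which convert_exception builds from the underlying JVM exception.
    spark.sql("SELECT * FROM nonexistent_table")
except AnalysisException as exc:
    # After this patch, the captured stack trace string comes from
    # Utils.exceptionString on the JVM side.
    print(exc.stackTrace)
```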
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]