This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new aee9f60b3966 [SPARK-49600][PYTHON] Remove `Python 3.6 and older`-related logic from `try_simplify_traceback`
aee9f60b3966 is described below
commit aee9f60b39669c7f32152a7f754e611de8af2592
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Wed Sep 11 10:33:45 2024 -0700
[SPARK-49600][PYTHON] Remove `Python 3.6 and older`-related logic from `try_simplify_traceback`
### What changes were proposed in this pull request?
Apache Spark 4.0.0 supports only Python 3.9+.
- #46228
### Why are the changes needed?
To simplify and clarify the logic. I manually confirmed that this was the last remaining `sys.version_info` check against `(3, 7)`:
```
$ git grep 'sys.version_info' | grep '(3, 7)'
python/pyspark/util.py: if sys.version_info[:2] < (3, 7):
python/pyspark/util.py: if "pypy" not in platform.python_implementation().lower() and sys.version_info[:2] >= (3, 7):
```
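The check being removed is a standard runtime version guard. A minimal standalone sketch of the pattern (illustrative only, not PySpark's actual code; the helper name is made up):

```python
import sys


def supports_traceback_creation() -> bool:
    # Python 3.7+ can create TracebackType objects directly
    # (see https://bugs.python.org/issue30579); older versions cannot,
    # which is why the removed guard returned None below (3, 7).
    return sys.version_info[:2] >= (3, 7)
```

Since Spark 4.0.0 requires Python 3.9+, this predicate is always true at runtime, so the guard is dead code and can be dropped.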
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass the CIs.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #48078 from dongjoon-hyun/SPARK-49600.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
python/pyspark/util.py | 6 +-----
1 file changed, 1 insertion(+), 5 deletions(-)
diff --git a/python/pyspark/util.py b/python/pyspark/util.py
index 205e3d957a41..cca44435efe6 100644
--- a/python/pyspark/util.py
+++ b/python/pyspark/util.py
@@ -262,10 +262,6 @@ def try_simplify_traceback(tb: TracebackType) -> Optional[TracebackType]:
     if "pypy" in platform.python_implementation().lower():
         # Traceback modification is not supported with PyPy in PySpark.
         return None
-    if sys.version_info[:2] < (3, 7):
-        # Traceback creation is not supported Python < 3.7.
-        # See https://bugs.python.org/issue30579.
-        return None
     import pyspark
@@ -791,7 +787,7 @@ def is_remote_only() -> bool:
 if __name__ == "__main__":
-    if "pypy" not in platform.python_implementation().lower() and sys.version_info[:2] >= (3, 7):
+    if "pypy" not in platform.python_implementation().lower() and sys.version_info[:2] >= (3, 9):
         import doctest
         import pyspark.util
         from pyspark.core.context import SparkContext
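The updated `__main__` guard gates the doctest run on both the interpreter implementation and its version. A hedged sketch of that gating pattern as a reusable predicate (hypothetical helper, not part of pyspark):

```python
import platform
import sys


def should_run_doctests(min_version: tuple = (3, 9)) -> bool:
    # Mirror the guard updated in this commit: skip doctests on PyPy
    # (traceback handling differs there) and on interpreters older
    # than the minimum supported version.
    if "pypy" in platform.python_implementation().lower():
        return False
    return sys.version_info[:2] >= min_version
```

On a supported CPython interpreter this returns True, so the doctests run; on PyPy or an old interpreter the block is skipped silently.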
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]