LuciferYang commented on code in PR #37956:
URL: https://github.com/apache/spark/pull/37956#discussion_r976573144
##########
core/src/main/scala/org/apache/spark/TestUtils.scala:
##########
@@ -285,16 +285,25 @@ private[spark] object TestUtils {
// minimum python supported version changes.
val minimumPythonSupportedVersion: String = "3.7.0"
+ def assumePythonVersionAvailable: Unit =
+ assume(isPythonVersionAvailable,
Review Comment:
```
Spark runs on Java 8/11/17, Scala 2.12/2.13, Python 3.7+ and R 3.5+. Python
3.7 support is deprecated as of Spark 3.4.0.
```
Do we still promise that Python 3.0 ~ Python 3.6 can pass all tests? If not,
would it be better to skip the test and notify explicitly?
I have the impression that some cases fail when using Python 3.6; let me
check it again.
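For context, the version gate being discussed boils down to comparing the locally installed Python version against `minimumPythonSupportedVersion` and skipping (via `assume`) rather than failing when it is too old. A minimal sketch of such a comparison is below; `PythonVersionCheck`, `parse`, and `isAtLeast` are hypothetical names for illustration, not the helpers in `TestUtils.scala`:

```scala
import scala.util.Try

object PythonVersionCheck {
  // Matches the minimum documented in the diff above.
  val minimumPythonSupportedVersion: String = "3.7.0"

  // Parse a version string such as "Python 3.10.6" into numeric components.
  private def parse(v: String): Seq[Int] =
    v.trim.stripPrefix("Python ").split('.').toSeq
      .flatMap(s => Try(s.toInt).toOption)

  // Lexicographic comparison of version components, padding the shorter
  // side with zeros so "3.7" and "3.7.0" compare as equal.
  def isAtLeast(actual: String, minimum: String): Boolean = {
    val (a, m) = (parse(actual), parse(minimum))
    a.zipAll(m, 0, 0)
      .find { case (x, y) => x != y }
      .forall { case (x, y) => x > y }
  }
}
```

A test suite could then gate itself with something like `assume(PythonVersionCheck.isAtLeast(detectedVersion, PythonVersionCheck.minimumPythonSupportedVersion))`, which reports the test as skipped instead of failed on older interpreters.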
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]