This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new c65b645  [SPARK-32714][FOLLOW-UP][PYTHON] Address pyspark.install typing errors
c65b645 is described below

commit c65b64552f947a7eaf4f379edbdce05daa923363
Author: zero323 <mszymkiew...@gmail.com>
AuthorDate: Sun Sep 27 16:21:23 2020 +0900

    [SPARK-32714][FOLLOW-UP][PYTHON] Address pyspark.install typing errors
    
    ### What changes were proposed in this pull request?
    
    This PR adds two `type: ignore` comments, one in `pyspark.install` and one in the related tests.
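
    As a rough illustration (a sketch only; the exact lines touched are in the diff below), a bare `type: ignore` silences every MyPy error reported on its line, while the bracketed form such as `type: ignore[import]` is scoped to a single error code:

    ```
    # Sketch of the two ignore styles used in this change.

    # Bare ignore: suppresses any MyPy error on this line, e.g. a
    # missing variable annotation for an empty list.
    UNSUPPORTED_COMBINATIONS = [  # type: ignore
    ]

    try:
        # Scoped ignore: only the [import] error (missing stubs for a
        # third-party module) is suppressed; other errors still surface.
        import xmlrunner  # type: ignore[import]
        testRunner = xmlrunner.XMLTestRunner(output='target/test-reports', verbosity=2)
    except ImportError:
        testRunner = None
    ```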
    
    ### Why are the changes needed?
    
    To satisfy MyPy type checks. It seems we originally missed some changes that happened around the merge of
    
https://github.com/apache/spark/commit/31a16fbb405a19dc3eb732347e0e1f873b16971d
    
    ```
    python/pyspark/install.py:30: error: Need type annotation for 'UNSUPPORTED_COMBINATIONS' (hint: "UNSUPPORTED_COMBINATIONS: List[<type>] = ...")  [var-annotated]
    python/pyspark/tests/test_install_spark.py:105: error: Cannot find implementation or library stub for module named 'xmlrunner'  [import]
    python/pyspark/tests/test_install_spark.py:105: note: See https://mypy.readthedocs.io/en/latest/running_mypy.html#missing-imports
    ```
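
    For the first error, MyPy's hint already points at the alternative to a `type: ignore`: giving the empty list an explicit annotation. A hypothetical sketch (the element type below is an assumption, since the list is empty in the current sources):

    ```
    from typing import List, Tuple

    # Hypothetical annotation satisfying the var-annotated check;
    # Tuple[str, str] is assumed here and is not what the PR uses.
    UNSUPPORTED_COMBINATIONS: List[Tuple[str, str]] = []
    ```

    This PR uses a bare `type: ignore` instead, as shown in the diff below.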
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    - Existing tests.
    - MyPy tests
        ```
        mypy --show-error-codes --no-incremental --config python/mypy.ini python/pyspark
        ```
    
    Closes #29878 from zero323/SPARK-32714-FOLLOW-UP.
    
    Authored-by: zero323 <mszymkiew...@gmail.com>
    Signed-off-by: HyukjinKwon <gurwls...@apache.org>
---
 python/pyspark/install.py                  | 2 +-
 python/pyspark/tests/test_install_spark.py | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/python/pyspark/install.py b/python/pyspark/install.py
index 84dd2c9..2de7b21 100644
--- a/python/pyspark/install.py
+++ b/python/pyspark/install.py
@@ -27,7 +27,7 @@ DEFAULT_HADOOP = "hadoop3.2"
 DEFAULT_HIVE = "hive2.3"
 SUPPORTED_HADOOP_VERSIONS = ["hadoop2.7", "hadoop3.2", "without-hadoop"]
 SUPPORTED_HIVE_VERSIONS = ["hive2.3"]
-UNSUPPORTED_COMBINATIONS = [
+UNSUPPORTED_COMBINATIONS = [  # type: ignore
 ]
 
 
diff --git a/python/pyspark/tests/test_install_spark.py b/python/pyspark/tests/test_install_spark.py
index 6f9949a..f761e00 100644
--- a/python/pyspark/tests/test_install_spark.py
+++ b/python/pyspark/tests/test_install_spark.py
@@ -102,7 +102,7 @@ if __name__ == "__main__":
     from pyspark.tests.test_install_spark import *  # noqa: F401
 
     try:
-        import xmlrunner
+        import xmlrunner  # type: ignore[import]
         testRunner = xmlrunner.XMLTestRunner(output='target/test-reports', verbosity=2)
     except ImportError:
         testRunner = None

