This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new 04f240b3780 [SPARK-42181][PYTHON][TESTS] Skip `torch` tests when `torch` is not installed
04f240b3780 is described below
commit 04f240b37806c853553d8b7478f96199b2be7060
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Wed Jan 25 19:48:46 2023 +0900
[SPARK-42181][PYTHON][TESTS] Skip `torch` tests when `torch` is not installed
### What changes were proposed in this pull request?
This PR aims to skip `torch` tests when `torch` is not installed.
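The mechanism is the standard guarded-import pattern: attempt the import once at module load, record the result in a flag, and gate each test class with `unittest.skipIf`. A minimal, self-contained sketch of that pattern (the class and test names here are illustrative, not part of the actual change):
```
import unittest

# Probe the optional dependency once at import time; fall back gracefully
# when it is absent instead of raising ImportError during collection.
have_torch = True
try:
    import torch  # noqa: F401
except ImportError:
    have_torch = False


@unittest.skipIf(not have_torch, "torch is required")
class ExampleTorchTests(unittest.TestCase):
    def test_needs_torch(self) -> None:
        # Only executes when the import above succeeded.
        self.assertTrue(have_torch)
```
Running this module with `python -m unittest` on a machine without `torch` reports the test as skipped rather than errored, which is what allows CI to pass without the dependency.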
### Why are the changes needed?
This enables CI environments to run the test suite without requiring `torch` as a test dependency.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass the CIs and run the tests manually on a system without `torch`.
```
$ python/run-tests --testnames pyspark.ml.torch.tests.test_distributor
Running PySpark tests. Output is in /Users/dongjoon/APACHE/spark-merge/python/unit-tests.log
Will test against the following Python executables: ['python3.9']
Will test the following Python tests: ['pyspark.ml.torch.tests.test_distributor']
python3.9 python_implementation is CPython
python3.9 version is: Python 3.9.16
Starting test(python3.9): pyspark.ml.torch.tests.test_distributor (temp output: /Users/dongjoon/APACHE/spark-merge/python/target/9347caa4-6c4d-44cb-aeec-da87c2aa2749/python3.9__pyspark.ml.torch.tests.test_distributor__jobzp1bh.log)
Finished test(python3.9): pyspark.ml.torch.tests.test_distributor (0s) ... 17 tests were skipped
Tests passed in 0 seconds
Skipped tests in pyspark.ml.torch.tests.test_distributor with python3.9:
test_create_torchrun_command (pyspark.ml.torch.tests.test_distributor.TorchDistributorBaselineUnitTests) ... SKIP (0.000s)
test_encryption_fails (pyspark.ml.torch.tests.test_distributor.TorchDistributorBaselineUnitTests) ... SKIP (0.000s)
test_encryption_passes (pyspark.ml.torch.tests.test_distributor.TorchDistributorBaselineUnitTests) ... SKIP (0.000s)
test_get_num_tasks_fails (pyspark.ml.torch.tests.test_distributor.TorchDistributorBaselineUnitTests) ... SKIP (0.000s)
test_validate_correct_inputs (pyspark.ml.torch.tests.test_distributor.TorchDistributorBaselineUnitTests) ... SKIP (0.000s)
test_validate_incorrect_inputs (pyspark.ml.torch.tests.test_distributor.TorchDistributorBaselineUnitTests) ... SKIP (0.000s)
test_dist_training_succeeds (pyspark.ml.torch.tests.test_distributor.TorchDistributorDistributedUnitTests) ... SKIP (0.000s)
test_distributed_file_with_pytorch (pyspark.ml.torch.tests.test_distributor.TorchDistributorDistributedUnitTests) ... SKIP (0.000s)
test_end_to_end_run_distributedly (pyspark.ml.torch.tests.test_distributor.TorchDistributorDistributedUnitTests) ... SKIP (0.000s)
test_get_num_tasks_distributed (pyspark.ml.torch.tests.test_distributor.TorchDistributorDistributedUnitTests) ... SKIP (0.000s)
test_end_to_end_run_locally (pyspark.ml.torch.tests.test_distributor.TorchDistributorLocalUnitTests) ... SKIP (0.000s)
test_get_gpus_owned_local (pyspark.ml.torch.tests.test_distributor.TorchDistributorLocalUnitTests) ... SKIP (0.000s)
test_get_num_tasks_locally (pyspark.ml.torch.tests.test_distributor.TorchDistributorLocalUnitTests) ... SKIP (0.000s)
test_local_file_with_pytorch (pyspark.ml.torch.tests.test_distributor.TorchDistributorLocalUnitTests) ... SKIP (0.000s)
test_local_training_succeeds (pyspark.ml.torch.tests.test_distributor.TorchDistributorLocalUnitTests) ... SKIP (0.000s)
test_check_parent_alive (pyspark.ml.torch.tests.test_distributor.TorchWrapperUnitTests) ... SKIP (0.000s)
test_clean_and_terminate (pyspark.ml.torch.tests.test_distributor.TorchWrapperUnitTests) ... SKIP (0.000s)
```
Closes #39737 from dongjoon-hyun/SPARK-42181.
Lead-authored-by: Dongjoon Hyun <[email protected]>
Co-authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
(cherry picked from commit 9835476a33ee0578be0291c649550c5fc9071ef3)
Signed-off-by: Hyukjin Kwon <[email protected]>
---
python/pyspark/ml/torch/tests/test_distributor.py | 10 ++++++++++
1 file changed, 10 insertions(+)
diff --git a/python/pyspark/ml/torch/tests/test_distributor.py b/python/pyspark/ml/torch/tests/test_distributor.py
index 40ad120b90c..0f4a4a23dc0 100644
--- a/python/pyspark/ml/torch/tests/test_distributor.py
+++ b/python/pyspark/ml/torch/tests/test_distributor.py
@@ -29,6 +29,12 @@ from typing import Callable, Dict, Any
 import unittest
 from unittest.mock import patch
 
+have_torch = True
+try:
+    import torch  # noqa: F401
+except ImportError:
+    have_torch = False
+
 from pyspark import SparkConf, SparkContext
 from pyspark.ml.torch.distributor import TorchDistributor, get_gpus_owned
 from pyspark.ml.torch.torch_run_process_wrapper import clean_and_terminate, check_parent_alive
@@ -116,6 +122,7 @@ def create_training_function(mnist_dir_path: str) -> Callable:
     return train_fn
 
 
+@unittest.skipIf(not have_torch, "torch is required")
 class TorchDistributorBaselineUnitTests(unittest.TestCase):
     def setUp(self) -> None:
         conf = SparkConf()
@@ -271,6 +278,7 @@ class TorchDistributorBaselineUnitTests(unittest.TestCase):
         self.delete_env_vars(input_env_vars)
 
 
+@unittest.skipIf(not have_torch, "torch is required")
 class TorchDistributorLocalUnitTests(unittest.TestCase):
     def setUp(self) -> None:
         class_name = self.__class__.__name__
@@ -377,6 +385,7 @@ class TorchDistributorLocalUnitTests(unittest.TestCase):
         self.assertEqual(output, "success")
 
 
+@unittest.skipIf(not have_torch, "torch is required")
 class TorchDistributorDistributedUnitTests(unittest.TestCase):
     def setUp(self) -> None:
         class_name = self.__class__.__name__
@@ -456,6 +465,7 @@ class TorchDistributorDistributedUnitTests(unittest.TestCase):
         self.assertEqual(output, "success")
 
 
+@unittest.skipIf(not have_torch, "torch is required")
 class TorchWrapperUnitTests(unittest.TestCase):
     def test_clean_and_terminate(self) -> None:
         def kill_task(task: "subprocess.Popen") -> None:
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]