This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 19193854c759 [SPARK-46315][PYTHON][TESTS] Test invalid key for spark.conf.get (pyspark.sql.conf)
19193854c759 is described below
commit 19193854c759c4f7c90aad191906dc799c7a7341
Author: Hyukjin Kwon <[email protected]>
AuthorDate: Fri Dec 8 15:10:17 2023 +0900
[SPARK-46315][PYTHON][TESTS] Test invalid key for spark.conf.get (pyspark.sql.conf)
### What changes were proposed in this pull request?
This PR adds tests for negative cases for `spark.conf.get` (`pyspark.sql.conf`).
### Why are the changes needed?
To improve the test coverage.
https://app.codecov.io/gh/apache/spark/blob/master/python%2Fpyspark%2Fsql%2Fconf.py
### Does this PR introduce _any_ user-facing change?
No, test-only
### How was this patch tested?
Manually ran the new unittest.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #44245 from HyukjinKwon/SPARK-46315.
Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
---
python/pyspark/sql/tests/test_conf.py | 14 +++++++++++++-
1 file changed, 13 insertions(+), 1 deletion(-)
diff --git a/python/pyspark/sql/tests/test_conf.py b/python/pyspark/sql/tests/test_conf.py
index 15722c2c57a4..9b939205b1d1 100644
--- a/python/pyspark/sql/tests/test_conf.py
+++ b/python/pyspark/sql/tests/test_conf.py
@@ -16,7 +16,7 @@
#
from decimal import Decimal
-from pyspark.errors import IllegalArgumentException
+from pyspark.errors import IllegalArgumentException, PySparkTypeError
from pyspark.testing.sqlutils import ReusedSQLTestCase
@@ -63,6 +63,18 @@ class ConfTestsMixin:
         with self.assertRaises(Exception):
             spark.conf.set("foo", Decimal(1))
 
+        with self.assertRaises(PySparkTypeError) as pe:
+            spark.conf.get(123)
+
+        self.check_error(
+            exception=pe.exception,
+            error_class="NOT_STR",
+            message_parameters={
+                "arg_name": "key",
+                "arg_type": "int",
+            },
+        )
+
         spark.conf.unset("foo")
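For readers without a Spark session handy, the behavior the new test pins down can be sketched with a minimal stand-in. This is NOT the real PySpark implementation; `RuntimeConf` and the `PySparkTypeError` subclass below are hypothetical stand-ins that only mirror the key validation being asserted: `conf.get` must receive a `str` key, and a non-string key is rejected with error class `NOT_STR`.

```python
class PySparkTypeError(TypeError):
    """Hypothetical stand-in for pyspark.errors.PySparkTypeError."""

    def __init__(self, error_class, message_parameters):
        self.error_class = error_class
        self.message_parameters = message_parameters
        super().__init__(f"[{error_class}] {message_parameters}")


class RuntimeConf:
    """Hypothetical stand-in for spark.conf with the same key type check."""

    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, key):
        # The check the new negative test exercises: reject non-str keys.
        if not isinstance(key, str):
            raise PySparkTypeError(
                error_class="NOT_STR",
                message_parameters={
                    "arg_name": "key",
                    "arg_type": type(key).__name__,
                },
            )
        return self._store.get(key)


conf = RuntimeConf()
conf.set("foo", "bar")
print(conf.get("foo"))  # a str key works as before

try:
    conf.get(123)  # a non-str key raises, as the added test asserts
except PySparkTypeError as e:
    print(e.error_class, e.message_parameters)
```

The test in the patch uses `self.check_error` (a PySpark test helper) to compare the caught exception's error class and message parameters against exactly these values.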
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]