GitHub user HyukjinKwon opened a pull request:
https://github.com/apache/spark/pull/20830
[SPARK-23691][PYTHON] Use sql_conf util in PySpark tests where possible
## What changes were proposed in this pull request?
https://github.com/apache/spark/commit/d6632d185e147fcbe6724545488ad80dce20277e
added a useful util
```python
@contextmanager
def sql_conf(self, pairs):
...
```
to allow configuration set/unset within a block:
```python
with self.sql_conf({"spark.blah.blah.blah": "blah"}):
    # test code
```
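For reference, here is a minimal sketch of how such a context manager can be implemented; it approximates the helper added in the commit above but is not the exact implementation, and it assumes a `self.spark` session as in the PySpark SQL test base classes:

```python
from contextlib import contextmanager

@contextmanager
def sql_conf(self, pairs):
    """Set the given SQL configurations, then restore the original values on exit."""
    assert isinstance(pairs, dict), "pairs should be a dictionary."
    keys = list(pairs.keys())
    # Remember the current values (None if a key is not currently set).
    old_values = [self.spark.conf.get(key, None) for key in keys]
    for key, new_value in pairs.items():
        self.spark.conf.set(key, new_value)
    try:
        yield
    finally:
        # Restore or unset each configuration, even if the test body raised.
        for key, old_value in zip(keys, old_values):
            if old_value is None:
                self.spark.conf.unset(key)
            else:
                self.spark.conf.set(key, old_value)
```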
This PR proposes to use this util where possible in PySpark tests.
Note that there already appear to be a few places where tests change configurations
without restoring the original values in the unittest classes. An illustrative before/after
rewrite is sketched below.
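For illustration only, the kind of rewrite this PR proposes might look like the following; the test name and configuration key are hypothetical, and `self.spark` / `self.sql_conf` are assumed to come from the shared test base class:

```python
# Before: set and restore the configuration manually around the test body.
def test_example_before(self):
    old_value = self.spark.conf.get("spark.sql.example.conf", None)
    self.spark.conf.set("spark.sql.example.conf", "true")
    try:
        pass  # test body
    finally:
        if old_value is None:
            self.spark.conf.unset("spark.sql.example.conf")
        else:
            self.spark.conf.set("spark.sql.example.conf", old_value)

# After: the context manager handles set/restore, even if the test body fails.
def test_example_after(self):
    with self.sql_conf({"spark.sql.example.conf": "true"}):
        pass  # test body
```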
## How was this patch tested?
Manually tested via:
```
./run-tests --modules=pyspark-sql --python-executables=python2
./run-tests --modules=pyspark-sql --python-executables=python3
```
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/HyukjinKwon/spark cleanup-sql-conf
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/20830.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #20830
----
commit 89cf69be7ae00f571c51de402067928b663c5a45
Author: hyukjinkwon <gurwls223@...>
Date: 2018-03-15T04:16:18Z
Use sql_conf util in PySpark tests where possible
----