cloud-fan commented on a change in pull request #25243:
[SPARK-28498][SQL][TEST] always create a fresh copy of the SparkSession before each test
URL: https://github.com/apache/spark/pull/25243#discussion_r307386875
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala
##########
@@ -477,22 +476,18 @@ class DataFrameSuite extends QueryTest with SharedSQLContext {
}
test("withColumns: case sensitive") {
- withSQLConf(SQLConf.CASE_SENSITIVE.key -> "true") {
- val df = testData.toDF().withColumns(Seq("newCol1", "newCOL1"),
+ spark.sessionState.conf.setConf(SQLConf.CASE_SENSITIVE, true)
Review comment:
> I just wonder if we can have fewer changes when we replace
> SharedSparkSession with SharedSparkSessionWithFreshCopy.

Yes, we can. But I want to demonstrate how `SharedSparkSessionWithFreshCopy`
simplifies the tests, and that's why I made so many changes to `DataFrameSuite`.
Later, when I add new tests to an existing test suite, I will use
`SharedSparkSessionWithFreshCopy` without touching the existing tests, but I
will stop using `withSQLConf` and `withTempView` in the newly added test cases.
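
For context on why the fresh copy helps, here is a minimal sketch of the idea: a
test trait that hands every test case its own `SparkSession` (built with
`newSession()`, so the `SparkContext` is shared but the `SessionState` is not),
which lets a test set SQL confs or create temp views directly and have the
changes die with that test's session. The trait name, suite name, and test body
below are illustrative assumptions, not the PR's actual
`SharedSparkSessionWithFreshCopy` implementation, and the sketch uses the public
`withColumn`/`spark.conf` APIs rather than the `withColumns` helper and
`sessionState.conf` accessor that `DataFrameSuite` can reach from inside the
`org.apache.spark.sql` package.

```scala
// Minimal, self-contained sketch; not the actual trait added in this PR.
// ScalaTest 3.1+ (AnyFunSuite) is assumed.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.lit
import org.scalatest.BeforeAndAfterEach
import org.scalatest.funsuite.AnyFunSuite

trait FreshSessionPerTestSketch extends BeforeAndAfterEach { self: AnyFunSuite =>

  // One session (and one SparkContext) shared by the whole suite.
  private lazy val baseSession: SparkSession = SparkSession.builder()
    .master("local[2]")
    .appName("fresh-session-sketch")
    .getOrCreate()

  // The session each test sees; replaced before every test case.
  protected var spark: SparkSession = _

  override protected def beforeEach(): Unit = {
    super.beforeEach()
    // newSession() keeps the SparkContext but starts with an isolated
    // SessionState (SQL confs, temp views, registered functions), so a test
    // can mutate them freely and never needs to clean up afterwards.
    spark = baseSession.newSession()
  }
}

class WithColumnCaseSensitiveSketchSuite
  extends AnyFunSuite with FreshSessionPerTestSketch {

  test("withColumn: case sensitive (sketch)") {
    // No withSQLConf wrapper: the conf change is confined to this test's
    // session copy and disappears before the next test runs.
    spark.conf.set("spark.sql.caseSensitive", "true")

    // With case sensitivity on, "newCol1" and "newCOL1" are distinct columns.
    val df = spark.range(1).toDF("id")
      .withColumn("newCol1", lit(1))
      .withColumn("newCOL1", lit(2))
    assert(df.columns.toSeq === Seq("id", "newCol1", "newCOL1"))
  }
}
```

The trade-off is the one described above: with a per-test session there is
nothing to restore, so wrappers like `withSQLConf` and `withTempView` become
unnecessary in newly added tests, at the cost of creating a fresh session copy
before each test case.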