beliefer opened a new pull request #28060: [SPARK-31291][SQL][TEST] Avoid loading test data if test cases do not use it
URL: https://github.com/apache/spark/pull/28060
 
 
   ### What changes were proposed in this pull request?
   `SQLQueryTestSuite` takes about 35 minutes to run.
   The 10 test suites in the `SQL` module that took the longest time are listed below.
   
   Class | Time spent | Failure | Skip | Pass | Total test cases
   -- | -- | -- | -- | -- | --
   SQLQueryTestSuite | 35 minutes | 0 | 1 | 230 | 231
   TPCDSQuerySuite | 3 minutes 8 seconds | 0 | 0 | 156 | 156
   SQLQuerySuite | 2 minutes 52 seconds | 0 | 0 | 185 | 185
   DynamicPartitionPruningSuiteAEOff | 1 minute 52 seconds | 0 | 0 | 22 | 22
   DataFrameFunctionsSuite | 1 minute 37 seconds | 0 | 0 | 102 | 102
   DynamicPartitionPruningSuiteAEOn | 1 minute 24 seconds | 0 | 0 | 22 | 22
   DataFrameSuite | 1 minute 14 seconds | 0 | 2 | 157 | 159
   SubquerySuite | 1 minute 12 seconds | 0 | 1 | 70 | 71
   SingleLevelAggregateHashMapSuite | 1 minute 1 second | 0 | 0 | 50 | 50
   DataFrameAggregateSuite | 59 seconds | 0 | 0 | 50 | 50
   
   I checked the code of `SQLQueryTestSuite` and found that it loads the test data repeatedly, even for test cases that do not use it.
   This PR avoids loading the test data when a test case does not need it, which improves the performance of `SQLQueryTestSuite`.
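   
   The exact change lives in `SQLQueryTestSuite`, but the idea can be sketched as loading the shared test data lazily, so test cases that never reference the test tables skip the expensive load. A minimal Scala sketch follows; the trait, method, and table names are hypothetical illustrations, not the code in this PR.
   
   ```scala
   import org.apache.spark.sql.SparkSession
   
   // Hypothetical helper: materialize the shared test tables at most once,
   // and only when a test case actually asks for them.
   trait LazyTestData {
     protected def spark: SparkSession
   
     // Guard so the data is loaded at most once per suite run.
     private var testDataLoaded = false
   
     // Test cases that query the test tables call this first; test cases
     // that do not use the test data never pay the loading cost.
     protected def ensureTestData(): Unit = synchronized {
       if (!testDataLoaded) {
         import spark.implicits._
         Seq((1, "a"), (2, "b"), (3, "c"))
           .toDF("key", "value")
           .createOrReplaceTempView("testdata") // hypothetical table name
         testDataLoaded = true
       }
     }
   }
   ```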
   
   
   ### Why are the changes needed?
   Improve the performance of `SQLQueryTestSuite`.
   
   
   ### Does this PR introduce any user-facing change?
   'No'.
   
   
   ### How was this patch tested?
   Jenkins tests.
   
