This is an automated email from the ASF dual-hosted git repository.

yangjie01 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new e326b9b6565 [SPARK-41532][DOCS][FOLLOWUP] Regenerate `sql-error-conditions-connect-error-class.md`
e326b9b6565 is described below

commit e326b9b6565b1e2b697d41714060e2ff9e424514
Author: yangjie01 <yangji...@baidu.com>
AuthorDate: Wed Aug 2 17:36:46 2023 +0800

    [SPARK-41532][DOCS][FOLLOWUP] Regenerate `sql-error-conditions-connect-error-class.md`
    
    ### What changes were proposed in this pull request?
    This pr aims to regenerate `sql-error-conditions-connect-error-class.md` to 
make `SparkThrowableSuite` test pass after 
https://github.com/apache/spark/pull/42256 merged.
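
    For reference, the document is typically regenerated by re-running the suite with the golden-file flag set. This is a sketch of that workflow; the environment variable and test name are assumed from the suite's conventions:

    ```
    # Regenerate the golden doc file, then commit the updated markdown (assumed workflow)
    SPARK_GENERATE_GOLDEN_FILES=1 build/sbt "core/testOnly *SparkThrowableSuite -- -t \"Error classes match with document\""
    ```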
    
    ### Why are the changes needed?
    Make GitHub Actions pass.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    - Pass GitHub Actions
    - Manual check:
    
    ```
    build/sbt "core/testOnly *SparkThrowableSuite"
    ```
    
    **Before**
    
    ```
    [info] SparkThrowableSuite:
    [info] - No duplicate error classes (27 milliseconds)
    [info] - Error classes are correctly formatted (63 milliseconds)
    [info] - SQLSTATE invariants (43 milliseconds)
    [info] - Message invariants (13 milliseconds)
    [info] - Message format invariants (19 milliseconds)
    [info] - Error classes match with document *** FAILED *** (41 milliseconds)
    [info]   "...nect plugin: `<msg>`[
    [info]
    [info]   ## SESSION_NOT_SAME
    [info]
    [info]   Both Datasets must belong to the same SparkSession.]" did not equal "...nect plugin: `<msg>`[]" The error class document is not up to date. Please regenerate it. (SparkThrowableSuite.scala:300)
    [info]   Analysis:
    [info]   "...nect plugin: `<msg>`[
    [info]
    [info] ## SESSION_NOT_SAME
    [info]
    [info] Both Datasets must belong to the same SparkSession.]" -> "...nect plugin: `<msg>`[]"
    [info]   org.scalatest.exceptions.TestFailedException:
    [info]   at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
    [info]   at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
    [info]   at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
    [info]   at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
    [info]   at org.apache.spark.SparkThrowableSuite.$anonfun$new$30(SparkThrowableSuite.scala:300)
    [info]   at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:400)
    [info]   at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:728)
    [info]   at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:728)
    [info]   at org.apache.spark.SparkThrowableSuite.$anonfun$new$21(SparkThrowableSuite.scala:259)
    [info]   at org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127)
    [info]   at org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:282)
    [info]   at org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231)
    [info]   at org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230)
    [info]   at org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69)
    [info]   at org.apache.spark.SparkFunSuite.$anonfun$test$2(SparkFunSuite.scala:155)
    [info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
    [info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
    [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
    [info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:227)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
    [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
    [info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:69)
    [info]   at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
    [info]   at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
    [info]   at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:69)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
    [info]   at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
    [info]   at scala.collection.immutable.List.foreach(List.scala:431)
    [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    [info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
    [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
    [info]   at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1564)
    [info]   at org.scalatest.Suite.run(Suite.scala:1114)
    [info]   at org.scalatest.Suite.run$(Suite.scala:1096)
    [info]   at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1564)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
    [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
    [info]   at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
    [info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:69)
    [info]   at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
    [info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
    [info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
    [info]   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:69)
    [info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
    [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
    [info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
    [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    [info]   at java.lang.Thread.run(Thread.java:750)
    [info] - Round trip (24 milliseconds)
    [info] - Error class names should contain only capital letters, numbers and underscores (6 milliseconds)
    [info] - Check if error class is missing (16 milliseconds)
    [info] - Check if message parameters match message format (4 milliseconds)
    [info] - Error message is formatted (0 milliseconds)
    [info] - Error message does not do substitution on values (0 milliseconds)
    [info] - Try catching legacy SparkError (1 millisecond)
    [info] - Try catching SparkError with error class (0 milliseconds)
    [info] - Try catching internal SparkError (1 millisecond)
    [info] - Get message in the specified format (25 milliseconds)
    [info] - overwrite error classes (72 milliseconds)
    [info] - prohibit dots in error class names (22 milliseconds)
    [info] Run completed in 1 second, 601 milliseconds.
    [info] Total number of tests run: 18
    [info] Suites: completed 1, aborted 0
    [info] Tests: succeeded 17, failed 1, canceled 0, ignored 0, pending 0
    [info] *** 1 TEST FAILED ***
    ```
    
    **After**
    
    ```
    [info] SparkThrowableSuite:
    [info] - No duplicate error classes (27 milliseconds)
    [info] - Error classes are correctly formatted (57 milliseconds)
    [info] - SQLSTATE invariants (42 milliseconds)
    [info] - Message invariants (12 milliseconds)
    [info] - Message format invariants (20 milliseconds)
    [info] - Error classes match with document (48 milliseconds)
    [info] - Round trip (27 milliseconds)
    [info] - Error class names should contain only capital letters, numbers and underscores (8 milliseconds)
    [info] - Check if error class is missing (21 milliseconds)
    [info] - Check if message parameters match message format (4 milliseconds)
    [info] - Error message is formatted (1 millisecond)
    [info] - Error message does not do substitution on values (0 milliseconds)
    [info] - Try catching legacy SparkError (1 millisecond)
    [info] - Try catching SparkError with error class (0 milliseconds)
    [info] - Try catching internal SparkError (0 milliseconds)
    [info] - Get message in the specified format (5 milliseconds)
    [info] - overwrite error classes (89 milliseconds)
    [info] - prohibit dots in error class names (24 milliseconds)
    [info] Run completed in 1 second, 584 milliseconds.
    [info] Total number of tests run: 18
    [info] Suites: completed 1, aborted 0
    [info] Tests: succeeded 18, failed 0, canceled 0, ignored 0, pending 0
    [info] All tests passed.
    ```
    
    Closes #42293 from LuciferYang/SPARK-41532-FOLLOWUP.
    
    Authored-by: yangjie01 <yangji...@baidu.com>
    Signed-off-by: yangjie01 <yangji...@baidu.com>
---
 docs/sql-error-conditions-connect-error-class.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/sql-error-conditions-connect-error-class.md b/docs/sql-error-conditions-connect-error-class.md
index 1371e36144e..b110fb98343 100644
--- a/docs/sql-error-conditions-connect-error-class.md
+++ b/docs/sql-error-conditions-connect-error-class.md
@@ -41,4 +41,8 @@ Cannot instantiate Spark Connect plugin because `<cls>` is missing a default con
 
 Error instantiating Spark Connect plugin: `<msg>`
 
+## SESSION_NOT_SAME
+
+Both Datasets must belong to the same SparkSession.
+
 

