dongjoon-hyun commented on pull request #28371:
URL: https://github.com/apache/spark/pull/28371#issuecomment-620293684
All Scala/Java/Python tests passed, but the R testing timed out. I ran the R unit tests manually:
```
══ testthat results ═══
[ OK: 13 | SKIPPED: 0 | WARNINGS: 0 | FAILED: 0 ]
✔ | OK F W S | Context
✔ | 11 | binary functions [1.5 s]
✔ | 4 | functions on binary files [1.3 s]
✔ | 2 | broadcast variables [0.3 s]
✔ | 5 | functions in client.R
✔ | 46 | test functions in sparkR.R [4.4 s]
✔ | 2 | include R packages [0.2 s]
✔ | 2 | JVM API [0.1 s]
✔ | 75 | MLlib classification algorithms, except for tree-based algorithms [55.2 s]
✔ | 70 | MLlib clustering algorithms [23.8 s]
✔ | 6 | MLlib frequent pattern mining [1.8 s]
✔ | 8 | MLlib recommendation algorithms [4.6 s]
✔ | 136 | MLlib regression algorithms, except for tree-based algorithms [50.6 s]
✔ | 8 | MLlib statistics algorithms [0.4 s]
✔ | 94 | MLlib tree-based algorithms [40.7 s]
✔ | 29 | parallelize() and collect() [0.4 s]
✔ | 428 | basic RDD functions [15.0 s]
✔ | 39 | SerDe functionality [2.2 s]
✔ | 20 | partitionBy, groupByKey, reduceByKey etc. [2.1 s]
✔ | 4 | functions in sparkR.R
✔ | 16 | SparkSQL Arrow optimization [10.5 s]
✔ | 6 | test show SparkDataFrame when eager execution is enabled. [0.7 s]
✔ | 1177 | SparkSQL functions [106.5 s]
✔ | 42 | Structured Streaming [53.8 s]
✔ | 16 | tests RDD function take() [0.5 s]
✔ | 14 | the textFile() function [1.3 s]
✔ | 46 | functions in utils.R [0.3 s]
✔ | 0 1 | Windows-specific tests
test_Windows.R:22: skip: sparkJars tag in SparkContext
Reason: This test is only for Windows, skipped
══ Results ═══
Duration: 378.6 s
OK: 2306
Failed: 0
Warnings: 0
Skipped: 1
```
Thank you so much for this fix, @cloud-fan and @peter-toth.
Merged to master/3.0.