[ https://issues.apache.org/jira/browse/SPARK-31918?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17142657#comment-17142657 ]

Hyukjin Kwon commented on SPARK-31918:
--------------------------------------

With SparkR built under R 4.0.1 and run on R 3.6.3 as is, the tests pass with 
one failure, which I don't think is a big deal:

{code}
Warning message:
package ‘SparkR’ was built under R version 4.0.1
Spark package found in SPARK_HOME: /.../spark
══ testthat results  ═══════════════════════════════════════════════════════════
[ OK: 13 | SKIPPED: 0 | WARNINGS: 0 | FAILED: 0 ]
✔ |  OK F W S | Context
✔ |  11       | binary functions [3.7 s]
✔ |   4       | functions on binary files [3.7 s]
✔ |   2       | broadcast variables [0.8 s]
✔ |   5       | functions in client.R
✔ |  46       | test functions in sparkR.R [10.1 s]
✔ |   2       | include R packages [0.5 s]
✔ |   2       | JVM API [0.3 s]
✔ |  70       | MLlib classification algorithms, except for tree-based 
algorithms [93.1 s]
✔ |  70       | MLlib clustering algorithms [38.8 s]
✔ |   6       | MLlib frequent pattern mining [3.0 s]
✔ |   8       | MLlib recommendation algorithms [9.9 s]
✔ | 128       | MLlib regression algorithms, except for tree-based algorithms 
[63.9 s]
✔ |   8       | MLlib statistics algorithms [0.5 s]
✔ |  94       | MLlib tree-based algorithms [81.2 s]
✔ |  29       | parallelize() and collect() [0.5 s]
✔ | 428       | basic RDD functions [21.1 s]
✔ |  39       | SerDe functionality [2.1 s]
✔ |  20       | partitionBy, groupByKey, reduceByKey etc. [3.3 s]
✔ |   4       | functions in sparkR.R
✔ |  16       | SparkSQL Arrow optimization [20.3 s]
✔ |   6       | test show SparkDataFrame when eager execution is enabled. [1.3 
s]
✖ | 1172 1     | SparkSQL functions [156.4 s]
────────────────────────────────────────────────────────────────────────────────
test_sparkSQL.R:2719: error: mutate(), transform(), rename() and names()
could not find function "deparse1"
Backtrace:
 1. base::attach(airquality) tests/fulltests/test_sparkSQL.R:2719:2
 2. base::attach(airquality)
────────────────────────────────────────────────────────────────────────────────
✔ |  42       | Structured Streaming [520.2 s]
✔ |  16       | tests RDD function take() [0.9 s]
✔ |  14       | the textFile() function [2.6 s]
✔ |  46       | functions in utils.R [0.5 s]
✔ |   0     1 | Windows-specific tests
────────────────────────────────────────────────────────────────────────────────
test_Windows.R:22: skip: sparkJars tag in SparkContext
Reason: This test is only for Windows, skipped
────────────────────────────────────────────────────────────────────────────────

══ Results ═════════════════════════════════════════════════════════════════════
Duration: 1039.0 s
{code}

The test failure seems to be due to the missing {{deparse1}} function, which 
was added in R 4.0.0. If this becomes an issue, I think we can just guide 
people to use https://github.com/r-lib/backports.
The test case itself doesn't look like a big deal.
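For reference, a minimal sketch of the kind of shim {{backports}} provides for this case (the function body mirrors the base R 4.0.0 definition of {{deparse1}}; this is just an illustration, not a proposed patch):

{code}
# Sketch: backfill deparse1 on R < 4.0.0, mirroring the base R 4.0.0 definition.
# deparse1() is a one-liner that collapses the multi-line output of deparse().
if (getRversion() < "4.0.0" && !exists("deparse1", envir = baseenv())) {
  deparse1 <- function(expr, collapse = " ", width.cutoff = 500L, ...) {
    paste(deparse(expr, width.cutoff, ...), collapse = collapse)
  }
}
{code}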

I will take a closer look to make it work with R 4.0.0.

> SparkR CRAN check gives a warning with R 4.0.0 on OSX
> -----------------------------------------------------
>
>                 Key: SPARK-31918
>                 URL: https://issues.apache.org/jira/browse/SPARK-31918
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.4.6, 3.0.0
>            Reporter: Shivaram Venkataraman
>            Priority: Blocker
>
> When the SparkR package is run through a CRAN check (i.e. with something like 
> R CMD check --as-cran ~/Downloads/SparkR_2.4.6.tar.gz), we rebuild the SparkR 
> vignette as a part of the checks.
> However this seems to be failing with R 4.0.0 on OSX -- both on my local 
> machine and on CRAN 
> https://cran.r-project.org/web/checks/check_results_SparkR.html
> cc [~felixcheung]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
