[ https://issues.apache.org/jira/browse/SPARK-31918?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17142566#comment-17142566 ]

Hyukjin Kwon commented on SPARK-31918:
--------------------------------------

Nice, [~shivaram].

I just did a quick test, and the first option does not work.

1. Build Spark 3.0.0 with R 4.0.1, and install it from source with R 3.4.0 on
another machine:

{code}
install.packages("SparkR_3.0.0.tar.gz", repos = NULL, type = "source")
{code}

{code}
df <- createDataFrame(lapply(seq(100), function (e) list(value=e)))
count(dapply(df, function(x) as.data.frame(x[x$value < 50,]), schema(df)))
{code}

It shows the same error as reported at
https://cran.r-project.org/web/checks/check_results_SparkR.html
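
For reference, the same incompatibility can be reproduced without Spark at all.
A minimal sketch in plain R (the file name is just for illustration): R 3.6+
writes RDS files in serialization format version 3 by default, and R older than
3.5.0 cannot read that format.

{code}
# On the build machine (R 4.0.1): the default serialization format is version 3.
saveRDS(list(value = 1), "example.rds")

# On the other machine (R 3.4.0): reading it fails with the same kind of error.
readRDS("example.rds")
# Error in readRDS("example.rds") :
#   cannot read workspace version 3 written by R 4.0.1; need R 3.5.0 or newer

# Writing with version = 2 would keep the file readable by R < 3.5.
saveRDS(list(value = 1), "example.rds", version = 2)
{code}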


2. Build Spark 3.0.0 with R 4.0.1, and load the library directly with R 3.4.0
on another machine:

{code}
library(SparkR, lib.loc = c(file.path("~/spark-3.0.0-bin-hadoop2.7", "R", "lib")))
{code}

{code}
# This error message is translated back to English; my R on macOS is in Korean.
Error while listing packages: Error in readRDS(pfile): cannot read workspace
version 3 written by R 4.0.1; need R 3.5 or newer.
{code}
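
If I read the error right, the file that fails here is the package metadata
(Meta/package.rds) that R writes at build time, not SparkR's own code. A quick
probe to confirm (the path is from my setup above; Meta/package.rds is the
standard metadata location for an installed package):

{code}
# On the R 3.4.0 machine: reading the package metadata directly should
# reproduce the same error without going through library().
readRDS(file.path("~/spark-3.0.0-bin-hadoop2.7", "R", "lib",
                  "SparkR", "Meta", "package.rds"))
{code}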


3. Download the Spark 3.0.0 release, and load the library directly with R 3.4.0
on another machine:

{code}
library(SparkR, lib.loc = c(file.path("~/spark-3.0.0-bin-hadoop2.7", "R", "lib")))
{code}

{code}
# This error message is translated back to English; my R on macOS is in Korean.
Error while listing packages: Error in readRDS(pfile): cannot read workspace
version 3 written by R 3.6.3; need R 3.5 or newer.
{code}
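
One way to double-check which R version built the bundled package is the Built
field that R records in the installed DESCRIPTION (path again from my setup):

{code}
# Prints e.g. "R 3.6.3; ; ...; unix", i.e. the R version that built the package.
read.dcf(file.path("~/spark-3.0.0-bin-hadoop2.7", "R", "lib",
                   "SparkR", "DESCRIPTION"), fields = "Built")
{code}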


> SparkR CRAN check gives a warning with R 4.0.0 on OSX
> -----------------------------------------------------
>
>                 Key: SPARK-31918
>                 URL: https://issues.apache.org/jira/browse/SPARK-31918
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.4.6, 3.0.0
>            Reporter: Shivaram Venkataraman
>            Priority: Blocker
>
> When the SparkR package is run through a CRAN check (i.e. with something like 
> R CMD check --as-cran ~/Downloads/SparkR_2.4.6.tar.gz), we rebuild the SparkR 
> vignette as a part of the checks.
> However this seems to be failing with R 4.0.0 on OSX -- both on my local 
> machine and on CRAN 
> https://cran.r-project.org/web/checks/check_results_SparkR.html
> cc [~felixcheung]


