[
https://issues.apache.org/jira/browse/SPARK-30733?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Hyukjin Kwon reassigned SPARK-30733:
------------------------------------
Assignee: Hyukjin Kwon
> Fix SparkR tests per testthat and R version upgrade
> ---------------------------------------------------
>
> Key: SPARK-30733
> URL: https://issues.apache.org/jira/browse/SPARK-30733
> Project: Spark
> Issue Type: Test
> Components: SparkR, SQL
> Affects Versions: 2.4.5, 3.0.0, 3.1.0
> Reporter: Hyukjin Kwon
> Assignee: Hyukjin Kwon
> Priority: Critical
> Fix For: 2.4.6, 3.0.0, 3.1.0
>
>
> 5 SparkR tests appear to be failing after upgrading to testthat 2.0.0 and R 3.5.x:
> {code}
> test_context.R:49: failure: Check masked functions
> length(maskedCompletely) not equal to length(namesOfMaskedCompletely).
> 1/1 mismatches
> [1] 6 - 4 == 2
> test_context.R:53: failure: Check masked functions
> sort(maskedCompletely, na.last = TRUE) not equal to
> sort(namesOfMaskedCompletely, na.last = TRUE).
> 5/6 mismatches
> x[2]: "endsWith"
> y[2]: "filter"
> x[3]: "filter"
> y[3]: "not"
> x[4]: "not"
> y[4]: "sample"
> x[5]: "sample"
> y[5]: NA
> x[6]: "startsWith"
> y[6]: NA
> {code}
> {code}
> test_includePackage.R:31: error: include inside function
> package or namespace load failed for 'plyr':
> package 'plyr' was installed by an R version with different internals;
> it needs to be reinstalled for use with this R version
> This seems to be a package installation issue; plyr has to be
> re-installed under the new R version.
> {code}
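A minimal sketch of the reinstall, to be run on the machine that executes the SparkR tests (the CRAN mirror URL is an assumption; any mirror works):

```shell
# Sketch of the fix for the plyr load failure quoted above: the package was
# built by a different R version, so it must be reinstalled under the new one.
# The mirror URL below is an assumption; substitute any CRAN mirror.
reinstall_cmd="Rscript -e 'install.packages(\"plyr\", repos = \"https://cloud.r-project.org\")'"

# Echo rather than execute here, since the reinstall must happen on the
# test worker itself (and may need network access and write permissions).
echo "Run on the test machine: $reinstall_cmd"
```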
> {code}
> test_sparkSQL.R:499: warning: SPARK-17811: can create DataFrame containing NA
> as date and time
> Your system is mis-configured: '/etc/localtime' is not a symlink
> test_sparkSQL.R:504: warning: SPARK-17811: can create DataFrame containing NA
> as date and time
> Your system is mis-configured: '/etc/localtime' is not a symlink
> {code}
> {code}
> test_sparkSQL.R:499: warning: SPARK-17811: can create DataFrame containing NA
> as date and time
> It is strongly recommended to set environment variable TZ to
> 'America/Los_Angeles' (or equivalent)
> test_sparkSQL.R:504: warning: SPARK-17811: can create DataFrame containing NA
> as date and time
> It is strongly recommended to set environment variable TZ to
> 'America/Los_Angeles' (or equivalent)
> {code}
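Both warnings point at the host's timezone configuration rather than at SparkR itself. A hedged sketch of the remediation (the zone name comes from the warning above; the zoneinfo path assumes a typical Linux layout):

```shell
# Export TZ so R stops recommending it; America/Los_Angeles is the zone the
# warning itself suggests.
export TZ=America/Los_Angeles
echo "TZ set to $TZ"

# The other warning means /etc/localtime is a regular file. Replacing it with
# a symlink into the zoneinfo database silences it (path assumes a typical
# Linux layout; needs root, so it is left commented here):
#   ln -sf /usr/share/zoneinfo/America/Los_Angeles /etc/localtime
```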
> {code}
> test_sparkSQL.R:1814: error: string operators
> unable to find an inherited method for function 'startsWith' for
> signature '"character"'
> 1: expect_true(startsWith("Hello World", "Hello")) at
> /home/jenkins/workspace/SparkPullRequestBuilder@2/R/pkg/tests/fulltests/test_sparkSQL.R:1814
> 2: quasi_label(enquo(object), label)
> 3: eval_bare(get_expr(quo), get_env(quo))
> 4: startsWith("Hello World", "Hello")
> 5: (function (classes, fdef, mtable)
> {
> methods <- .findInheritedMethods(classes, fdef, mtable)
> if (length(methods) == 1L)
> return(methods[[1L]])
> else if (length(methods) == 0L) {
> cnames <- paste0("\"", vapply(classes, as.character, ""), "\"",
> collapse = ", ")
> stop(gettextf("unable to find an inherited method for function %s
> for signature %s",
> sQuote(fdef@generic), sQuote(cnames)), domain = NA)
> }
> else stop("Internal error in finding inherited methods; didn't return
> a unique method",
> domain = NA)
> })(list("character"), new("nonstandardGenericFunction", .Data = function
> (x, prefix)
> {
> standardGeneric("startsWith")
> }, generic = structure("startsWith", package = "SparkR"), package =
> "SparkR", group = list(),
> valueClass = character(0), signature = c("x", "prefix"), default =
> NULL, skeleton = (function (x,
> prefix)
> stop("invalid call in method dispatch to 'startsWith' (no default
> method)", domain = NA))(x,
> prefix)), <environment>)
> 6: stop(gettextf("unable to find an inherited method for function %s for
> signature %s",
> sQuote(fdef@generic), sQuote(cnames)), domain = NA)
> {code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)