shaneknapp commented on issue #27460: [SPARK-30733][R][HOTFIX] Fix SparkR tests per testthat and R version upgrade, and disable CRAN
URL: https://github.com/apache/spark/pull/27460#issuecomment-582562368

> @shaneknapp to summarize what happened so far:
>
> * It seems the R version itself was bumped correctly to 3.5.2; however, it looks like we need to reinstall the packages that were previously installed, including `r-base`. This caused problems with error messages such as the ones below (please refer to the PR description).

yeah, upgrading `testthat` led to my worst nightmare: the game of R dependency whack-a-mole. i'm playing it right now and will keep updating packages as i find them.

> Currently, I fixed it by making the tests permissive and just skipping the CRAN check, but we should at least re-enable CRAN. Can we reinstall the packages that were previously installed? It should be possible to test by manually calling `./R/check-cran.sh` or by reverting the changes this PR made to `R/run-tests.sh`.

sounds good. i'm testing manually on a centos worker and will update this PR when i think i've got everything.

> * It looks like the Arrow R library could not be found after the upgrade:
>
> ```
> test_sparkSQL_arrow.R:25: skip: createDataFrame/collect Arrow optimization
> arrow cannot be loaded
> ```

interesting. i'll install that too.

> * I think this is minor, but there seem to be some environment issues, judging from:
>
> ```
> test_sparkSQL.R:499: warning: SPARK-17811: can create DataFrame containing NA as date and time
> Your system is mis-configured: '/etc/localtime' is not a symlink
> ```

that is definitely odd. will investigate. have i mentioned how much i hate touching R? :)
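
for reference, roughly what i'm poking at on the worker looks like the sketch below -- the package list beyond `testthat`/`arrow`, the timezone, and the spark checkout path are placeholders/assumptions on my end, not the final provisioning change:

```bash
#!/usr/bin/env bash
# rough worker-side cleanup sketch, not the final ansible/provisioning change
set -euxo pipefail

# reinstall the R packages the testthat upgrade dragged along; more will likely
# get added here as the dependency whack-a-mole continues
Rscript -e 'install.packages(c("testthat", "arrow"), repos = "https://cloud.r-project.org")'

# /etc/localtime should be a symlink into /usr/share/zoneinfo, not a copied file
# (timezone choice here is an assumption)
sudo ln -sf /usr/share/zoneinfo/America/Los_Angeles /etc/localtime

# re-run the CRAN check manually from a spark checkout to confirm it passes
# (checkout path is a placeholder)
cd "${SPARK_HOME:-/home/jenkins/spark}"
./R/check-cran.sh
```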
