HyukjinKwon commented on a change in pull request #30066:
URL: https://github.com/apache/spark/pull/30066#discussion_r511717010
##########
File path: .github/workflows/build_and_test.yml
##########
@@ -138,20 +136,6 @@ jobs:
run: |
python3.8 -m pip install numpy 'pyarrow<3.0.0' pandas scipy xmlrunner
python3.8 -m pip list
- # SparkR
- - name: Install R 4.0
- uses: r-lib/actions/setup-r@v1
- if: contains(matrix.modules, 'sparkr')
- with:
- r-version: 4.0
- - name: Install R packages
- if: contains(matrix.modules, 'sparkr')
- run: |
- # qpdf is required to reduce the size of PDFs to make CRAN check pass. See SPARK-32497.
- sudo apt-get install -y libcurl4-openssl-dev qpdf
- sudo Rscript -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 'devtools', 'e1071', 'survival', 'arrow', 'roxygen2'), repos='https://cloud.r-project.org/')"
- # Show installed packages in R.
- sudo Rscript -e 'pkg_list <- as.data.frame(installed.packages()[, c(1,3:4)]); pkg_list[is.na(pkg_list$Priority), 1:2, drop = FALSE]'
Review comment:
FWIW, once we have the official image, we might have to list which package versions are installed in it for PySpark and SparkR. Running this line together with `pip list` should show which packages are installed for both.
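
The Python side of that version listing could also be done without `pip` at all. As a rough sketch (the helper name `installed_packages` is hypothetical, not anything in this PR), the standard library's `importlib.metadata` can report roughly what `pip list` shows:

```python
# Hedged sketch: dump installed Python distributions and their versions,
# similar in spirit to `pip list`, using only the standard library
# (importlib.metadata is available on Python 3.8+, matching the python3.8
# used in this workflow).
from importlib import metadata


def installed_packages():
    """Return a name -> version dict of installed distributions, sorted by name."""
    return dict(sorted(
        (dist.metadata["Name"], dist.version)
        for dist in metadata.distributions()
    ))


if __name__ == "__main__":
    for name, version in installed_packages().items():
        print(f"{name}=={version}")
```

This mirrors what the removed R one-liner does for SparkR (dumping `installed.packages()` name/version columns), so both sides of the image could be documented the same way.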
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]