GitHub user shivaram opened a pull request:
https://github.com/apache/spark/pull/17966
[SPARK-20666] Skip tests that use Hadoop utils on CRAN Windows
## What changes were proposed in this pull request?
This change skips tests that use the Hadoop libraries when the CRAN check runs on Windows. This handles cases where the Hadoop winutils binaries are missing on the target system. The skipped tests consist of (a sketch of the skip logic follows this list):
1. Tests that save and load a model in MLlib
2. Tests that save and load CSV, JSON and Parquet files in SQL
3. Hive tests
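As a rough sketch of the idea only (the helper name, the use of testthat's NOT_CRAN convention, and the HADOOP_HOME check are assumptions here, not necessarily the exact code in this patch), the affected tests could call a small testthat helper at the top of their body:

```r
library(testthat)

# Hypothetical helper: skip when the CRAN check is running on Windows and
# no Hadoop winutils is available (detected here via HADOOP_HOME being unset).
skip_on_cran_windows_without_hadoop <- function() {
  on_cran <- !identical(Sys.getenv("NOT_CRAN"), "true")   # testthat's CRAN convention
  on_windows <- identical(.Platform$OS.type, "windows")
  no_winutils <- identical(Sys.getenv("HADOOP_HOME"), "") # winutils is located via HADOOP_HOME
  if (on_cran && on_windows && no_winutils) {
    skip("Hadoop winutils not available on CRAN Windows")
  }
}

# Example of guarding an affected test:
test_that("model save/load works", {
  skip_on_cran_windows_without_hadoop()
  expect_true(TRUE)  # placeholder for the real save/load assertions
})
```

Calling the helper first lets the rest of the test body stay unchanged, while the CRAN Windows run simply reports the test as skipped.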
## How was this patch tested?
Tested by running on a local Windows VM with HADOOP_HOME unset. Also
tested with https://win-builder.r-project.org
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/shivaram/spark-1 sparkr-windows-cran
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/17966.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #17966
----
commit 40273ab7defbbee149a5f26aa523c6b9010a7c4f
Author: Shivaram Venkataraman <[email protected]>
Date: 2017-05-12T17:41:14Z
Skip tests that use Hadoop utils on CRAN Windows
This change skips tests that use the Hadoop libraries when the CRAN check
runs on Windows. This handles cases where the Hadoop winutils binaries
are missing on the target system. The skipped tests consist of
1. Tests that save and load a model in MLlib
2. Tests that save and load CSV, JSON and Parquet files in SQL
3. Hive tests
----