This is an automated email from the ASF dual-hosted git repository.
jiayu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/sedona.git
The following commit(s) were added to refs/heads/master by this push:
new d42fc51e [DOCS] Compilation process updates (#852)
d42fc51e is described below
commit d42fc51e172d14df6d6205c032ac2af433cbdd23
Author: Nilesh Gajwani <[email protected]>
AuthorDate: Wed Jun 7 10:26:14 2023 -0700
[DOCS] Compilation process updates (#852)
---
docs/setup/compile.md | 11 ++++++-----
mkdocs.yml | 2 +-
2 files changed, 7 insertions(+), 6 deletions(-)
diff --git a/docs/setup/compile.md b/docs/setup/compile.md
index b3ae72b7..bb9fc57d 100644
--- a/docs/setup/compile.md
+++ b/docs/setup/compile.md
@@ -6,7 +6,7 @@
## Compile Scala / Java source code
Sedona Scala/Java code is a project with multiple modules. Each module is a
Scala/Java mixed project which is managed by Apache Maven 3.
-* Make sure your Linux/Mac machine has Java 1.8, Apache Maven 3.3.1+, and Python3. The compilation of Sedona is not tested on Windows machine.
+* Make sure your Linux/Mac machine has Java 1.8, Apache Maven 3.3.1+, and Python 3.7+. The compilation of Sedona is not tested on Windows machines.
To compile all modules, please make sure you are in the root folder of all modules. Then enter the following command in the terminal:
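The command itself falls outside this hunk's context. For reference, a minimal sketch of the standard Maven invocation (assuming the usual skip-tests build described in the Sedona docs):
```
# Run from the repository root; builds all modules and skips tests
mvn clean install -DskipTests
```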
@@ -66,9 +66,7 @@ User can specify `-Dspark` and `-Dscala` command line options to compile with di
Sedona uses GitHub Actions to automatically generate jars per commit. You can go [here](https://github.com/apache/sedona/actions/workflows/java.yml) and download the jars by clicking the commit's ==Artifacts== tag.
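As a side note on the `-Dspark` and `-Dscala` options named in the hunk header above, a hedged example of how they are typically passed (the version values here are illustrative; check your Sedona release for the supported combinations):
```
# Illustrative only: compile against a specific Spark/Scala combination
mvn clean install -DskipTests -Dspark=3.0 -Dscala=2.12
```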
## Run Python tests
-
1. Set up the environment variables SPARK_HOME and PYTHONPATH
-
For example,
```
export SPARK_HOME=$PWD/spark-3.0.1-bin-hadoop2.7
@@ -76,7 +74,7 @@ export PYTHONPATH=$SPARK_HOME/python
```
2. Compile the Sedona Scala and Java code with `-Dgeotools` and then copy the ==sedona-spark-shaded-{{ sedona.current_version }}.jar== to the ==SPARK_HOME/jars/== folder.
```
-cp spark-shaded/target/sedona-spark-shaded-xxx.jar SPARK_HOME/jars/
+cp spark-shaded/target/sedona-spark-shaded-xxx.jar $SPARK_HOME/jars/
```
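For the `-Dgeotools` compile mentioned in step 2, a sketch of the invocation (assuming the flag combines with the skip-tests build as in the earlier examples):
```
# Build with the optional GeoTools dependencies included in the shaded jar
mvn clean install -DskipTests -Dgeotools
```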
3. Install the following libraries
```
@@ -86,6 +84,7 @@ sudo pip3 install -U wheel
sudo pip3 install -U virtualenvwrapper
sudo pip3 install -U pipenv
```
+On macOS, Homebrew can be used to install the GEOS library: `brew install geos`
4. Set up pipenv with the desired Python version: 3.7, 3.8, or 3.9
```
cd python
@@ -94,9 +93,11 @@ pipenv --python 3.7
5. Install PySpark and the other dependencies
```
cd python
-pipenv install pyspark==3.0.1
+pipenv install pyspark
pipenv install --dev
```
+`pipenv install pyspark` installs the latest version of PySpark.
+To stay consistent with the installed Spark version, use `pipenv install pyspark==<spark_version>`.
6. Run the Python tests
```
cd python
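# The hunk is cut off here and the test command itself is not shown.
# To the best of my knowledge, the Sedona docs use the following:
pipenv run pytest tests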
diff --git a/mkdocs.yml b/mkdocs.yml
index 996dc6ae..a9ec3645 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -164,7 +164,7 @@ markdown_extensions:
- pymdownx.tilde
plugins:
- search:
- prebuild_index: true
+ #prebuild_index: true
- macros
- git-revision-date-localized:
type: datetime