HyukjinKwon commented on a change in pull request #30059:
URL: https://github.com/apache/spark/pull/30059#discussion_r505936661
##########
File path: .github/workflows/build_and_test.yml
##########
@@ -201,6 +173,92 @@ jobs:
name: unit-tests-log-${{ matrix.modules }}-${{ matrix.comment }}-${{ matrix.java }}-${{ matrix.hadoop }}-${{ matrix.hive }}
path: "**/target/unit-tests.log"
+ pyspark:
+ name: "Build modules: ${{ matrix.modules }}"
+ runs-on: ubuntu-20.04
+ container:
+ image: dongjoon/apache-spark-github-action-image:20201015
+ strategy:
+ fail-fast: false
+ matrix:
+ modules:
+ - >-
+ pyspark-sql, pyspark-mllib, pyspark-resource
+ - >-
+ pyspark-core, pyspark-streaming, pyspark-ml
+ env:
+ MODULES_TO_TEST: ${{ matrix.modules }}
+ HADOOP_PROFILE: hadoop3.2
+ HIVE_PROFILE: hive2.3
+ # GitHub Actions' default miniconda to use in pip packaging test.
+ CONDA_PREFIX: /usr/share/miniconda
+ GITHUB_PREV_SHA: ${{ github.event.before }}
+ GITHUB_INPUT_BRANCH: ${{ github.event.inputs.target }}
+ steps:
+ - name: Checkout Spark repository
+ uses: actions/checkout@v2
+ # In order to fetch changed files
+ with:
+ fetch-depth: 0
+ - name: Merge dispatched input branch
+ if: ${{ github.event.inputs.target != '' }}
+        run: git merge --progress --ff-only origin/${{ github.event.inputs.target }}
+ # Cache local repositories. Note that GitHub Actions cache has a 2G limit.
+ - name: Cache Scala, SBT, Maven and Zinc
+ uses: actions/cache@v2
+ with:
+ path: |
+ build/apache-maven-*
+ build/zinc-*
+ build/scala-*
+ build/*.jar
+          key: build-${{ hashFiles('**/pom.xml', 'project/build.properties', 'build/mvn', 'build/sbt', 'build/sbt-launch-lib.bash', 'build/spark-build-info') }}
+ restore-keys: |
+ build-
+ - name: Cache Maven local repository
+ uses: actions/cache@v2
+ with:
+ path: ~/.m2/repository
+ key: pyspark-maven-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ pyspark-maven-
+ - name: Cache Ivy local repository
+ uses: actions/cache@v2
+ with:
+ path: ~/.ivy2/cache
+ key: pyspark-ivy-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
+ restore-keys: |
+ pyspark-ivy-
+ - name: Install Python 3.6
+ uses: actions/setup-python@v2
+ if: contains(matrix.modules, 'pyspark')
+ with:
+ python-version: 3.6
+ architecture: x64
+ - name: Install Python packages (Python 3.6)
Review comment:
I mean, I get that it takes less time, but I'm wondering if we can just
pre-install it to make it look consistent.
BTW, it's interesting that installing `numpy pyarrow pandas scipy xmlrunner`
only takes ~20 seconds ..
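
For illustration, a pip-based install step along the lines discussed could look like the sketch below. The package list comes from the comment above; the step name, the `contains(matrix.modules, 'pyspark')` guard, and the use of `python3.6 -m pip` mirror the surrounding workflow, but the exact invocation is an assumption, not the PR's final version:

```yaml
    # Hypothetical sketch: install the PySpark test dependencies via pip
    # (versions left unpinned here; the real workflow may pin them).
    - name: Install Python packages (Python 3.6)
      if: contains(matrix.modules, 'pyspark')
      run: |
        python3.6 -m pip install numpy pyarrow pandas scipy xmlrunner
        python3.6 -m pip list
```

Pre-baking the same packages into the container image instead (as the comment suggests) would trade this ~20-second step for a larger image, at the benefit of consistent, reproducible job setup.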
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]