This is an automated email from the ASF dual-hosted git repository.

mgrund pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark-connect-go.git


The following commit(s) were added to refs/heads/master by this push:
     new 3f98d9c  [DOC] Improve contributor guide
3f98d9c is described below

commit 3f98d9c640451a1eb308f6c2b101975229ce95cb
Author: Alex Ott <[email protected]>
AuthorDate: Mon Dec 30 09:54:24 2024 +0100

    [DOC] Improve contributor guide
    
    ### What changes were proposed in this pull request?
    
    Changes:
    
    - Add links to the tools needed for development, so people know how to install them
      (a sketch of possible install commands follows this list)
    - Clarify that the `SPARK_HOME` environment variable must be set to run the integration tests
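
    As an illustration (hypothetical commands, assuming a working Go toolchain is
    already installed; the guide itself only links the tools), `gofumpt` and
    `golangci-lint` can typically be installed with:

    ```bash
    # Install the formatter and linter that `make check` expects in PATH
    go install mvdan.cc/gofumpt@latest
    # golangci-lint also ships prebuilt binaries; see its docs for the recommended install
    go install github.com/golangci/golangci-lint/cmd/golangci-lint@latest
    ```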
    
    ### Why are the changes needed?

    To make it easier for new contributors to set up their development environment
    and run the full test suite.

    ### Does this PR introduce _any_ user-facing change?

    No, this is a documentation-only change.

    ### How was this patch tested?

    N/A, documentation-only change.
    Closes #91 from alexott/contributor-guide-mention-spark-home.
    
    Authored-by: Alex Ott <[email protected]>
    Signed-off-by: Martin Grund <[email protected]>
---
 CONTRIBUTING.md | 13 ++++++++-----
 1 file changed, 8 insertions(+), 5 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index e228fbf..3448f7c 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -25,11 +25,11 @@ a consistent style and adherence to license rules. You can run these checks local
 make check
 ```
 
-This requires the following tools to be present in your PATH:
+This requires the following tools to be present in your `PATH`:
 
 1. Java for checking license headers
-2. `gofumpt` for formatting Go code
-3. `golangci-lint` for linting Go code
+2. [gofumpt](https://github.com/mvdan/gofumpt) for formatting Go code
+3. [golangci-lint](https://golangci-lint.run/) for linting Go code
 
 ### Running Tests
 
@@ -39,13 +39,16 @@ To run the tests locally, you can run:
 make test
 ```
 
-This will run the unit tests. If you want to run the integration tests, you can run:
+This will run the unit tests. If you want to run the integration tests, you can run (you
+need to set the environment variable `SPARK_HOME` to point to an existing directory with an
+unpacked Apache Spark 3.5+ distribution):
 
 ```bash
 make integration
 ```
 
-Lastly, if you want to run all tests and generate the coverage analysis, you can run:
+Lastly, if you want to run all tests (unit and integration) and generate the coverage
+analysis, you can run:
 
 ```bash
 make fulltest
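
As a minimal sketch of the new `SPARK_HOME` requirement described in the diff above (the
path and version below are examples only; any directory containing an unpacked Apache
Spark 3.5+ distribution works):

```bash
# Point SPARK_HOME at an unpacked Spark distribution, then run the integration tests
export SPARK_HOME=/opt/spark-3.5.3-bin-hadoop3
make integration

# Or run both unit and integration tests and generate the coverage analysis
make fulltest
```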


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
