This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.1
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.1 by this push:
     new 62f6761  [SPARK-36040][DOCS][K8S] Add reference to kubernetes-client's version
62f6761 is described below

commit 62f6761883f855ec97fbc0c69a7da3b0db7f4170
Author: yoda-mon <yo...@oss.nttdata.com>
AuthorDate: Sun Jul 18 14:26:15 2021 -0700

    [SPARK-36040][DOCS][K8S] Add reference to kubernetes-client's version
    
    ### What changes were proposed in this pull request?
    
    Add reference to kubernetes-client's version
    
    ### Why are the changes needed?
    
    Running Spark on Kubernetes is potentially subject to an upper limit on the supported Kubernetes version.
    It is better to make users aware of this, because Kubernetes releases move so quickly that users tend to run Spark jobs on unsupported versions.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No
    
    ### How was this patch tested?
    
    SKIP_API=1 bundle exec jekyll build
    
    Closes #33255 from yoda-mon/add-reference-kubernetes-client.
    
    Authored-by: yoda-mon <yo...@oss.nttdata.com>
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
    (cherry picked from commit eea69c122f20577956c4a87a6d8eb59943c1c6f0)
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
---
 docs/running-on-kubernetes.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/running-on-kubernetes.md b/docs/running-on-kubernetes.md
index b9a018a..125c952 100644
--- a/docs/running-on-kubernetes.md
+++ b/docs/running-on-kubernetes.md
@@ -51,6 +51,7 @@ you may set up a test cluster on your local machine using
   * Be aware that the default minikube configuration is not enough for running Spark applications.
   We recommend 3 CPUs and 4g of memory to be able to start a simple Spark application with a single
   executor.
+  * Check [kubernetes-client library](https://github.com/fabric8io/kubernetes-client)'s version of your Spark environment, and its compatibility with your Kubernetes cluster's version.
 * You must have appropriate permissions to list, create, edit and delete
 [pods](https://kubernetes.io/docs/user-guide/pods/) in your cluster. You can verify that you can list these resources
 by running `kubectl auth can-i <list|create|edit|delete> pods`.
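The added documentation line asks users to check the kubernetes-client version bundled with their Spark distribution. As an illustrative sketch only (the jar-naming pattern and the version shown are assumptions, not taken from this commit), one way to extract that version from a jar filename found in `$SPARK_HOME/jars` is:

```python
import re
from typing import Optional

# The fabric8 kubernetes-client typically ships in $SPARK_HOME/jars as a file
# named like "kubernetes-client-<version>.jar" (naming pattern assumed here,
# not confirmed by this commit).
def client_version(jar_name: str) -> Optional[str]:
    m = re.fullmatch(r"kubernetes-client-(\d+(?:\.\d+)*)\.jar", jar_name)
    return m.group(1) if m else None

# Illustrative filename only:
print(client_version("kubernetes-client-4.12.0.jar"))  # -> 4.12.0
```

The extracted version can then be compared against the compatibility matrix in the kubernetes-client project's README to confirm it supports the target cluster's Kubernetes version.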

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
