Hi Nagatomi,
Use the Apache Spark images: run your master node first, then start your
workers. In the Dockerfiles (or the command of each service) you can point
the workers at the master using the Docker container names from your
service composition, as in the rough sketch below. If you want to run two
masters, one active and one standby, follow the instructions in the Apache
docs for that configuration; the recipe is the same except for how you
start the masters and what behaviour you expect from your cluster.
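Something along these lines, for example (a minimal sketch, assuming the
apache/spark image and a user-defined Docker network; the names, ports and
version are placeholders, and in a compose file the service name plays the
same role as the container name here):

  # Shared network so containers can resolve each other by name
  docker network create spark-net

  # Master: workers will reach it as "spark-master" on port 7077
  docker run -d --name spark-master --network spark-net \
    -e SPARK_NO_DAEMONIZE=1 \
    -p 8080:8080 -p 7077:7077 \
    apache/spark:3.4.1 \
    /opt/spark/sbin/start-master.sh

  # Worker(s): point them at the master via its container name
  docker run -d --name spark-worker-1 --network spark-net \
    -e SPARK_NO_DAEMONIZE=1 \
    apache/spark:3.4.1 \
    /opt/spark/sbin/start-worker.sh spark://spark-master:7077

Repeat the last command with a different --name for each additional
worker; the master's web UI on port 8080 should then list them.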
I hope it helps.
Have a nice day :)
Cley

Nagatomi Yasukazu <yassan0...@gmail.com> wrote on Saturday, 2 September 2023
at 15:37:

> Hello Apache Spark community,
>
> I'm currently trying to run Spark Connect Server on Kubernetes in Cluster
> Mode and facing some challenges. Any guidance or hints would be greatly
> appreciated.
>
> ## Environment:
> Apache Spark version: 3.4.1
> Kubernetes version:  1.23
> Command executed:
>  /opt/spark/sbin/start-connect-server.sh \
>    --packages org.apache.spark:spark-connect_2.13:3.4.1,org.apache.iceberg:iceberg-spark-runtime-3.4_2.13:1.3.1...
> Note that I'm running it with the environment variable
> SPARK_NO_DAEMONIZE=1.
>
> ## Issue:
> When I connect from an external Python client and run scripts, it operates
> in Local Mode instead of the expected Cluster Mode.
>
> ## Expected Behavior:
> When connecting from a Python client to the Spark Connect Server, I expect
> it to run in Cluster Mode.
>
> If anyone has any insights, advice, or has faced a similar issue, I'd be
> grateful for your feedback.
> Thank you in advance.
>
>
>
