Hello! I'm having some problems configuring the Python SDK harness when moving 
away from the "LOOPBACK" configuration with the Spark Runner. When using 
environment type = "EXTERNAL", the worker receives empty endpoint configuration:
`Starting worker with command ['/opt/apache/beam/boot', '--id=1-1', 
'--logging_endpoint=', '--artifact_endpoint=', '--provision_endpoint=', 
'--control_endpoint=']`


Does anyone know whether this is a bug or a missing configuration, and whether 
there is a solution or workaround? A full description is given in this [1] 
question on SO.
Thank you!
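For context, the pipeline options I'm using look roughly like the sketch below. The endpoint addresses are placeholders for my setup, not values anyone else should copy verbatim:

```python
# Sketch of the portable-runner flags in question. The job_endpoint and
# environment_config addresses below are placeholders for my Kubernetes
# setup, not recommended values.
pipeline_args = [
    "--runner=PortableRunner",
    "--job_endpoint=localhost:8099",        # embedded Spark job server (placeholder)
    "--environment_type=EXTERNAL",
    "--environment_config=localhost:50000", # external SDK harness service (placeholder)
]

# These flags would then be passed to the pipeline, e.g.:
#   from apache_beam.options.pipeline_options import PipelineOptions
#   with beam.Pipeline(options=PipelineOptions(pipeline_args)) as p: ...
print(pipeline_args)
```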

[1] 
https://stackoverflow.com/questions/66498209/how-to-configure-beam-python-sdk-with-spark-in-a-kubernetes-environment