mosche opened a new issue, #23440:
URL: https://github.com/apache/beam/issues/23440

   ### What happened?
   
   The root cause is that the entire stack expects to connect to components on
`localhost`. Docker on Mac, however, does not support `host` networking, which
would be required to make this work.
   
   Consider this job:
   ```
   python /tmp/app/__main__.py \
         --runner=PortableRunner \
         --job_endpoint=beam-job-server:8099 \
         --artifact_endpoint=beam-job-server:8098 \
         --environment_type=EXTERNAL \
         --environment_config=beam-python-workers:50000
   ```
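   
   For context, all of this runs on a user-defined Docker bridge network, so
containers address each other by service name rather than `localhost`. A
minimal sketch (container names are taken from the flags above; the Spark
master image choice is an assumption):
   ```
   # User-defined bridge network; containers on it resolve each other by name.
   docker network create beam
   
   # Each service joins the network under the name the flags above refer to,
   # e.g. the Spark master (any standalone Spark master image works here):
   docker run -d --network beam --name spark -p 7077:7077 bitnami/spark:3.3
   ```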
   
   The Spark job-server was started with `--spark-master-url=spark://spark:7077 
--job-host=beam-job-server`.
   
   I was able to fix connectivity between the job-server and the Spark cluster
by additionally setting `-Dspark.driver.host=beam-job-server` when starting the
job-server.
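   
   Concretely, that makes the job-server start look roughly like this (a sketch
assuming it is launched directly from the shadow jar; the jar name/version is
an assumption):
   ```
   java -Dspark.driver.host=beam-job-server \
       -jar beam-runners-spark-3-job-server.jar \
       --spark-master-url=spark://spark:7077 \
       --job-host=beam-job-server
   ```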
   
   Unfortunately, connectivity is still broken for the Python SDK worker pool,
which was started using `--worker_pool --provision_endpoint=beam-job-server:8099
--artifact_endpoint=beam-job-server:8098`.
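   
   For reference, the full worker pool container start (the
`apache/beam_python3.9_sdk` image tag is an assumption; any Python SDK image
matching the pipeline's Python version applies):
   ```
   # Boot entrypoint in worker pool mode, listening on port 50000 as
   # referenced by --environment_config above.
   docker run -d --network beam --name beam-python-workers -p 50000:50000 \
       apache/beam_python3.9_sdk \
       --worker_pool \
       --provision_endpoint=beam-job-server:8099 \
       --artifact_endpoint=beam-job-server:8098
   ```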
   
   When the pool finally receives a work request, it starts a worker as follows:
   ```
   2022-09-29T10:20:56.811855690Z Starting worker with command 
['/opt/apache/beam/boot', '--id=1-1', '--logging_endpoint=localhost:42343', 
'--artifact_endpoint=localhost:34219', '--provision_endpoint=localhost:40675', 
'--control_endpoint=localhost:46485']
   ```
   
   With all endpoints pointing at `localhost`, the worker obviously cannot reach
the runner to communicate results back.
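   
   To illustrate the failure mode, probing one of those endpoints from inside
the worker pool container (port taken from the log above) typically fails,
because nothing in that container listens there; the endpoints are actually
bound in the container running the Spark executor. A rough sketch:
   ```
   # Inside the worker container `localhost` is the container itself,
   # so the control endpoint from the log is unreachable:
   docker exec beam-python-workers python -c \
       "import socket; socket.create_connection(('localhost', 46485), timeout=2)"
   # -> ConnectionRefusedError
   ```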
   
   
   ### Issue Priority
   
   Priority: 2
   
   ### Issue Component
   
   Component: runner-spark

