IsisPolei opened a new issue, #6720:
URL: https://github.com/apache/hudi/issues/6720
Hudi: 0.10.1
Spark: 3.1.3 (Scala 2.12)
Background:
I use SparkRDDWriteClient to write to Hudi; both the app and the Spark standalone cluster run in Docker. When the app container and the Spark cluster containers run on the same local machine, my app works well. But when I deploy the Spark cluster on a different machine, I get a series of connection problems.
machine A (192.168.64.107): Spark driver (SparkRDDWriteClient app)
machine B (192.168.64.121): Spark standalone cluster (master and worker running in two containers)
Because of Spark's network connection mechanism, I have set the connection parameters below:
spark.master.url: spark://192.168.64.121:7077
spark.driver.bindAddress: 0.0.0.0
spark.driver.host: 192.168.64.107
spark.driver.port: 10000
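In a bridged-Docker setup, every driver-side port that executors call back to must be fixed and published, not just spark.driver.port. A spark-defaults.conf-style sketch of the settings above, plus the driver's block manager port pinned as well (the value 10001 is my own assumption, chosen to match the second port already published on the app container):

```properties
spark.master                    spark://192.168.64.121:7077
spark.driver.bindAddress        0.0.0.0
spark.driver.host               192.168.64.107
spark.driver.port               10000
# The block manager also listens on the driver; pin it so it can be
# published from the container (10001 here is a hypothetical choice).
spark.driver.blockManager.port  10001
```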
The HoodieSparkContext initializes correctly and I can see the Spark job running in the Spark web UI. But when the code reaches sparkRDDWriteClient.upsert(), this exception occurs:
Caused by: org.apache.hudi.exception.HoodieRemoteException: Connect to 192.168.64.107:34446 [/192.168.64.107] failed: Connection refused (Connection refused)
	at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.refresh(RemoteHoodieTableFileSystemView.java:420)
	at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.sync(RemoteHoodieTableFileSystemView.java:484)
	at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.sync(PriorityBasedFileSystemView.java:257)
	at org.apache.hudi.client.SparkRDDWriteClient.getTableAndInitCtx(SparkRDDWriteClient.java:493)
	at org.apache.hudi.client.SparkRDDWriteClient.getTableAndInitCtx(SparkRDDWriteClient.java:448)
	at org.apache.hudi.client.SparkRDDWriteClient.upsert(SparkRDDWriteClient.java:157)
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to 192.168.64.107:34446 [/192.168.64.107] failed: Connection refused (Connection refused)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:156)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:376)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:393)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
	at org.apache.http.client.fluent.Request.internalExecute(Request.java:173)
	at org.apache.http.client.fluent.Request.execute(Request.java:177)
	at org.apache.hudi.common.table.view
It seems like these two containers can't connect to each other through hoodie.filesystem.view.remote.port. So I exposed this port on my app container, but it doesn't work. Please tell me what I did wrong.
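The HoodieRemoteException points at Hudi's embedded timeline server, which runs on the driver and by default binds a random ephemeral port (34446 in the trace above), so publishing a fixed container port cannot catch it. A sketch of two configurations worth trying in the HoodieWriteConfig properties (property names as of Hudi 0.10.x; the port value 26754 is my own choice, not from the report):

```properties
# Pin the embedded timeline server to a fixed, publishable port
# instead of a random ephemeral one (26754 is a hypothetical value).
hoodie.embed.timeline.server.port=26754
```

Alternatively, as a workaround to confirm the diagnosis, the embedded timeline server can be disabled entirely so executors read the filesystem view directly:

```properties
hoodie.embed.timeline.server=false
```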
These are my docker-compose.yml files:
app:
  app:
    image: xxx
    container_name: app
    ports:
      - "5008:5005"
      - "10000:10000"
      - "10001:10001"
spark:
  version: '3'
  services:
    master:
      image: bitnami/spark:3.1
      container_name: master
      hostname: master
      environment:
        MASTER: spark://master:7077
      restart: always
      ports:
        - "7077:7077"
        - "9080:8080"
    worker:
      image: bitnami/spark:3.1
      container_name: worker
      restart: always
      environment:
        SPARK_WORKER_CORES: 5
        SPARK_WORKER_MEMORY: 2g
        SPARK_WORKER_PORT: 8881
      depends_on:
        - master
      links:
        - master
      ports:
        - "8081:8081"
      expose:
        - "8881"
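Assuming the embedded timeline server is pinned to a fixed port (26754 below is a hypothetical choice), the app container's compose entry would also need that port published so executors on machine B can reach it; a sketch:

```yaml
app:
  image: xxx
  container_name: app
  ports:
    - "5008:5005"
    - "10000:10000"     # spark.driver.port
    - "10001:10001"
    - "26754:26754"     # hypothetical fixed hoodie.embed.timeline.server.port
```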
I hope I have described the situation clearly; please help.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]