Aalron commented on issue #4985:
URL: https://github.com/apache/hudi/issues/4985#issuecomment-1063993967


   @xushiyan @codope 
   I found that the Kafka service in `<HUDI_REPO>/docker/compose/docker-compose_hadoop284_hive233_spark244.yml` needs an additional environment setting:
   
   ```
      kafka:
       image: 'wurstmeister/kafka:2.12-2.0.1'
       platform: linux/arm64
       hostname: kafkabroker
       container_name: kafkabroker
       ports:
         - '9092:9092'
       environment:
         - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
         - ALLOW_PLAINTEXT_LISTENER=yes
         - KAFKA_ADVERTISED_HOST_NAME=kafkabroker
   ```
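   Before chasing the container errors below, it can help to confirm what CPU architecture the host actually reports, since that is what Docker uses when no `platform:` override is given. A minimal check (plain shell, nothing Hudi-specific assumed):

   ```
   # Print the host CPU architecture; on Apple Silicon this is arm64/aarch64,
   # which explains why amd64-only images in the compose file fail to start.
   arch="$(uname -m)"
   echo "host architecture: ${arch}"
   ```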
   After that, I found four exceptions:
   
   > **First**: from the historyserver (apachehudi/hudi-hadoop_2.8.4-history:latest) image:
   ```
   22/03/10 10:01:38 FATAL applicationhistoryservice.ApplicationHistoryServer: 
Error starting ApplicationHistoryServer
   java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no 
leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in java.library.path, 
no leveldbjni in java.library.path, 
/tmp/libleveldbjni-64-1-2530759744317816554.8: 
/tmp/libleveldbjni-64-1-2530759744317816554.8: cannot open shared object file: 
No such file or directory (Possible cause: can't load AMD 64-bit .so on a 
AARCH64-bit platform)]
   at org.fusesource.hawtjni.runtime.Library.doLoad(Library.java:182)
   at org.fusesource.hawtjni.runtime.Library.load(Library.java:140)
   at org.fusesource.leveldbjni.JniDBFactory.<clinit>(JniDBFactory.java:48)
   at 
org.apache.hadoop.yarn.server.timeline.LeveldbTimelineStore.serviceInit(LeveldbTimelineStore.java:227)
   at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
   at 
org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
   at 
org.apache.hadoop.yarn.server.applicationhistoryservice.ApplicationHistoryServer.serviceInit(ApplicationHistoryServer.java:115)
   at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
   at 
org.apache.hadoop.yarn.server.applicationhistoryservice.ApplicationHistoryServer.launchAppHistoryServer(ApplicationHistoryServer.java:180)
   at 
org.apache.hadoop.yarn.server.applicationhistoryservice.ApplicationHistoryServer.main(ApplicationHistoryServer.java:190)
   ```
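   The leveldbjni failure is an architecture mismatch: the image bundles only an x86-64 `.so`, which cannot be loaded on aarch64. A possible workaround on Apple Silicon (not verified here) is to force the container to run under amd64 emulation via the compose `platform` key; the service name and image below are assumptions based on the compose file:

   ```
      historyserver:
        image: apachehudi/hudi-hadoop_2.8.4-history:latest
        platform: linux/amd64   # assumption: run the x86-64 image under qemu emulation
   ```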
   
   > **Second**: presto-coordinator-1 (apachehudi/hudi-hadoop_2.8.4-prestobase_0.217:latest) image
   ```
   Presto requires amd64 or ppc64le on Linux (found aarch64)
   ```
   
   > **Third**: presto-worker-1 (apachehudi/hudi-hadoop_2.8.4-prestobase_0.217:latest) image
   ```
   Presto requires amd64 or ppc64le on Linux (found aarch64)
   ```
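   Presto 0.217 ships native launcher checks that accept only amd64 or ppc64le, so these containers cannot run natively on aarch64 at all. A hedged workaround is again to pin the Presto services to amd64 emulation (service names are assumptions; emulated Presto will be slow but may start):

   ```
      presto-coordinator-1:
        platform: linux/amd64   # assumption: emulate x86-64 on Apple Silicon
      presto-worker-1:
        platform: linux/amd64
   ```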
   > **Fourth**: spark-worker-1 (apachehudi/hudi-hadoop_2.8.4-hive_2.3.3-sparkworker_2.4.4:latest) image
   
   ```
   22/03/10 10:12:06 WARN worker.Worker: Failed to connect to master 
sparkmaster:7077
   org.apache.spark.SparkException: Exception thrown in awaitResult: 
   at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
   at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
   at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
   at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
   at 
org.apache.spark.deploy.worker.Worker$$anonfun$org$apache$spark$deploy$worker$Worker$$tryRegisterAllMasters$1$$anon$1.run(Worker.scala:253)
   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
   at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   at java.lang.Thread.run(Thread.java:748)
   Caused by: java.io.IOException: Failed to connect to 
sparkmaster/172.18.0.10:7077
   at 
org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245)
   at 
org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187)
   at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:198)
   at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
   at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
   ... 4 more
   Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: 
Connection refused: sparkmaster/172.18.0.10:7077
   at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
   at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
   at 
io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:323)
   at 
io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
   at 
io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
   at 
io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
   at 
io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
   at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
   at 
io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
   at 
io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
   ... 1 more
   Caused by: java.net.ConnectException: Connection refused
   ... 11 more
   ```
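   The worker's "Connection refused" is usually a downstream symptom: nothing is listening on the master port yet, often because the sparkmaster container itself failed to start. A quick reachability probe from the host (a sketch assuming bash and that the compose file maps port 7077 to localhost; adjust host/port as needed):

   ```
   # Probe the Spark master port; "closed" typically means the sparkmaster
   # container exited or has not finished starting yet.
   if timeout 2 bash -c 'cat < /dev/null > /dev/tcp/localhost/7077' 2>/dev/null; then
     echo "master port open"
   else
     echo "master port closed"
   fi
   ```

   If the port is closed, `docker logs sparkmaster` would be the next place to look.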

