Hello list,
One of the Storm workers dies with the following error message:
2017-06-14 11:17:32.503 o.a.s.util [ERROR] Async loop died!
java.lang.OutOfMemoryError: Java heap space
    at org.apache.kafka.common.utils.Utils.toArray(Utils.java:272) ~[stormjar.jar:?]
    at org.apache.kafka.common.utils.Utils.toArray(Utils.java:265) ~[stormjar.jar:?]
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:626) ~[stormjar.jar:?]
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseFetchedData(Fetcher.java:548) ~[stormjar.jar:?]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:354) ~[stormjar.jar:?]
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1000) ~[stormjar.jar:?]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:938) ~[stormjar.jar:?]
    at org.apache.storm.kafka.spout.KafkaSpout.pollKafkaBroker(KafkaSpout.java:286) ~[stormjar.jar:?]
    at org.apache.storm.kafka.spout.KafkaSpout.nextTuple(KafkaSpout.java:224) ~[stormjar.jar:?]
    at org.apache.storm.daemon.executor$fn__6505$fn__6520$fn__6551.invoke(executor.clj:651) ~[storm-core-1.0.1.2.5.3.0-37.jar:1.0.1.2.5.3.0-37]
    at org.apache.storm.util$async_loop$fn__554.invoke(util.clj:484) [storm-core-1.0.1.2.5.3.0-37.jar:1.0.1.2.5.3.0-37]
    at clojure.lang.AFn.run(AFn.java:22) [clojure-1.7.0.jar:?]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_77]
2017-06-14 11:17:40.058 o.a.s.d.executor [ERROR]
java.lang.OutOfMemoryError: Java heap space
    (same stack trace as above)
2017-06-14 11:18:08.898 o.a.s.util [ERROR] Halting process: ("Worker died")
As far as I can see, the worker is automatically restarted, but then dies again with the same error.
Might this be due to the worker only having 64 MB of heap available? The supervisor log shows:
2017-06-14 11:06:35.034 o.a.s.d.supervisor [INFO] Launching worker with
command: '/usr/jdk64/jdk1.8.0_77/bin/java' '-cp'
'/usr/hdp/2.5.3.0-37/storm/lib/asm-5.0.3.jar:/usr/hdp/2.5.3.0-37/storm/lib/clojure-1.7.0.jar:/usr/hdp/2.5.3.0-37/storm/lib/disruptor-3.3.2.jar:/usr/hdp/2.5.3.0-37/storm/lib/kryo-3.0.3.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-api-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-core-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-over-slf4j-1.6.6.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-slf4j-impl-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/minlog-1.3.0.jar:/usr/hdp/2.5.3.0-37/storm/lib/objenesis-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/reflectasm-1.10.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/ring-cors-0.1.5.jar:/usr/hdp/2.5.3.0-37/storm/lib/servlet-api-2.5.jar:/usr/hdp/2.5.3.0-37/storm/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.5.3.0-37/storm/lib/storm-core-1.0.1.2.5.3.0-37.jar:/usr/hdp/2.5.3.0-37/storm/lib/storm-rename-hack-1.0.1.2.5.3.0-37.jar:/usr/hdp/2.5.3.0-37/storm/lib/zookeeper.jar:/usr/hdp/2.5.3.0-37/storm/lib/ambari-metrics-storm-sink.jar:/usr/hdp/current/storm-supervisor/conf:/data1/hadoop/storm/supervisor
/stormdist/indexing-5-1497457977/stormjar.jar:/etc/hbase/conf:/etc/hadoop/conf'
'-Xmx64m' '-Dlogfile.name=worker.log'
'-Dstorm.home=/usr/hdp/2.5.3.0-37/storm'
'-Dworkers.artifacts=/var/log/storm/workers-artifacts'
'-Dstorm.id=indexing-5-1497457977'
'-Dworker.id=98d9fa65-b025-4924-9d4a-8f77324f9cf2' '-Dworker.port=6705'
'-Dstorm.log.dir=/var/log/storm'
'-Dlog4j.configurationFile=/usr/hdp/2.5.3.0-37/storm/log4j2/worker.xml'
'-DLog4jContextSelector=org.apache.logging.log4j.core.selector.BasicContextSelector'
'org.apache.storm.LogWriter' '/usr/jdk64/jdk1.8.0_77/bin/java' '-server'
'-Xmx1024m'
'-javaagent:/usr/hdp/current/storm-client/contrib/storm-jmxetric/lib/jmxetric-1.0.4.jar=host=localhost,port=8650,wireformat31x=true,mode=multicast,config=/usr/hdp/current/storm-client/contrib/storm-jmxetric/conf/jmxetric-conf.xml,process=Worker_6705_JVM'
'-Djava.library.path=/data1/hadoop/storm/supervisor/stormdist/indexing-5-1497457977/resources/Linux-amd64:/data1/hadoop/storm/supervisor/stormdist/indexing-5-1497457977/resources:/usr/local/lib:/opt/local/lib:/usr/lib:/usr/hdp/current/storm-client/lib'
'-Dlogfile.name=worker.log' '-Dstorm.home=/usr/hdp/2.5.3.0-37/storm'
'-Dworkers.artifacts=/var/log/storm/workers-artifacts'
'-Dstorm.conf.file=' '-Dstorm.options=' '-Dstorm.log.dir=/var/log/storm'
'-Djava.io.tmpdir=/data1/hadoop/storm/workers/98d9fa65-b025-4924-9d4a-8f77324f9cf2/tmp'
'-Dlogging.sensitivity=S3'
'-Dlog4j.configurationFile=/usr/hdp/2.5.3.0-37/storm/log4j2/worker.xml'
'-DLog4jContextSelector=org.apache.logging.log4j.core.selector.BasicContextSelector'
'-Dstorm.id=indexing-5-1497457977'
'-Dworker.id=98d9fa65-b025-4924-9d4a-8f77324f9cf2' '-Dworker.port=6705'
'-cp'
'/usr/hdp/2.5.3.0-37/storm/lib/asm-5.0.3.jar:/usr/hdp/2.5.3.0-37/storm/lib/clojure-1.7.0.jar:/usr/hdp/2.5.3.0-37/storm/lib/disruptor-3.3.2.jar:/usr/hdp/2.5.3.0-37/storm/lib/kryo-3.0.3.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-api-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-core-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-over-slf4j-1.6.6.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-slf4j-impl-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/minlog-1.3.0.jar:/usr/hdp/2.5.3.0-37/storm/lib/objenesis-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/reflectasm-1.10.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/ring-cors-0.1.5.jar:/usr/hdp/2.5.3.0-37/storm/lib/servlet-api-2.5.jar:/usr/hdp/2.5.3.0-37/storm/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.5.3.0-37/storm/lib/storm-core-1.0.1.2.5.3.0-37.jar:/usr/hdp/2.5.3.0-37/storm/lib/storm-rename-hack-1.0.1.2.5.3.0-37.jar:/usr/hdp/2.5.3.0-37/storm/lib/zookeeper.jar:/usr/hdp/2.5.3.0-37/storm/lib/ambari-metrics-storm-sink.jar:/usr/hdp/current/storm-supervisor/conf:/data1/hadoop/storm/supervisor
/stormdist/indexing-5-1497457977/stormjar.jar:/etc/hbase/conf:/etc/hadoop/conf'
'org.apache.storm.daemon.worker' 'indexing-5-1497457977'
'9c90251f-54cc-42d7-8c4e-6f3bb2528752' '6705'
'98d9fa65-b025-4924-9d4a-8f77324f9cf2'
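Looking at the command more closely, I notice the first '-Xmx64m' seems to apply to the org.apache.storm.LogWriter wrapper, while the actual worker JVM further down the command gets '-Xmx1024m'. In case the worker heap really is the limit, my understanding is that it can be raised via the worker JVM options, e.g. (a sketch, assuming the default Storm 1.0.x config keys; the 2048m value is just a guess):

```yaml
# storm.yaml on the supervisors: default heap for all worker JVMs
worker.childopts: "-Xmx2048m"

# or set per topology at submit time, overriding the cluster default
topology.worker.childopts: "-Xmx2048m"
```

But I'm not sure whether that is the right knob here, or whether the OOM would just happen later with a bigger heap.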
Any ideas on what might be happening?
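For what it's worth, since the OOM happens inside Fetcher while materializing fetched records, I'm also wondering whether capping how much data each poll can pull would help. This is a sketch of the consumer properties I'd try passing to the KafkaSpout (property names from the Kafka 0.10.x consumer config; the values are guesses, not recommendations):

```properties
# consumer properties for the KafkaSpout's underlying KafkaConsumer
max.partition.fetch.bytes=1048576
max.poll.records=500
```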