rahulkg31 opened a new issue, #13931:
URL: https://github.com/apache/druid/issues/13931

   Hi,
   
   We have set up a Druid cluster with 1 Master node, 1 Query node, and 2 Data 
nodes, using HDFS for deep storage. The setup works fine under normal 
conditions, but when ingesting data from Kafka we hit the following error - 
   `Terminating due to java.lang.OutOfMemoryError: Requested array size exceeds 
VM limit`
   
   I have tried all the memory-related configurations in jvm.config and 
runtime.properties for Druid, but still no luck. Could this issue be related to 
the HDFS file system being used as deep storage for shared data? Also, can 
anybody point me to where to look further to resolve this?
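   For context, a minimal Java sketch of what this specific error means 
(assumptions: a HotSpot-style JVM; the class name `ArrayLimitDemo` is 
hypothetical). Unlike an ordinary `Java heap space` failure, this message is 
raised when a single array allocation requests a length above the VM's 
per-array limit (slightly below `Integer.MAX_VALUE`), so raising heap settings 
in jvm.config alone may not help:

```java
public class ArrayLimitDemo {
    public static void main(String[] args) {
        try {
            // Requesting Integer.MAX_VALUE elements exceeds the per-array
            // length limit on typical HotSpot JVMs, independent of -Xmx.
            int[] huge = new int[Integer.MAX_VALUE];
            System.out.println(huge.length);
        } catch (OutOfMemoryError e) {
            // Typically reports: Requested array size exceeds VM limit
            System.out.println("OutOfMemoryError: " + e.getMessage());
        }
    }
}
```

   If this is the failure mode, the fix is usually finding which component 
tried to build the oversized array (e.g. an overly large buffer or batch 
setting) rather than tuning heap sizes.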
   
   Thanks,
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

