Hello, I have a Structured Streaming job that consumes messages from Kafka and does some stateful associations using flatMapGroupsWithState. Every time I submit the job, it runs fine for around two hours and then stops abruptly without any error messages. All I can see in the debug logs is the message quoted below, which seems to be the starting point of the shutdown.
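For context, the stateful association step has roughly this shape. This is a simplified, Spark-free sketch of the kind of update function passed to flatMapGroupsWithState; all names (Event, SessionState, StatefulAssoc) are illustrative, not my actual code:

```scala
// Illustrative model of the per-key state update logic used with
// flatMapGroupsWithState (no Spark dependency; names are made up).
case class Event(id: String, value: Long)
case class SessionState(count: Long, sum: Long)

object StatefulAssoc {
  // Folds a batch of events for one key into the accumulated state,
  // emitting one enriched record (key, event value, running sum) per event.
  def update(key: String,
             events: Iterator[Event],
             state: Option[SessionState]): (SessionState, List[(String, Long, Long)]) = {
    var s = state.getOrElse(SessionState(0L, 0L))
    val out = events.map { e =>
      s = SessionState(s.count + 1, s.sum + e.value)
      (key, e.value, s.sum) // event value paired with the running sum so far
    }.toList // force the lazy iterator so `s` is fully updated
    (s, out)
  }
}
```

The real function additionally sets timeouts and returns the updated state through the GroupState handle, but the association logic is of this form.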
"INFO SparkContext: Invoking stop() from shutdown hook"

Apart from this message, I do not see any exceptions or errors that would explain why the SparkContext was stopped. I have also been monitoring the app's JMX metrics with a Graphite/Prometheus exporter, and at no point does the memory used exceed the available memory. Can anyone suggest what the issue could be here?

Also, here is the log4j configuration I am using. Does it capture all types of errors, or am I missing something?

log4j.rootCategory=DEBUG, file

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss.SSS} %p %c{1}: %m%n

log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/path/to/app.log
log4j.appender.serverAccess.DatePattern='.'yyyy-MM-dd
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss.SSS} %p %c{1}: %m%n

log4j.logger.org.apache.spark=DEBUG
log4j.logger.org.apache.spark.repl.Main=DEBUG
log4j.logger.org.spark_project.jetty=DEBUG
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=DEBUG
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=DEBUG
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=DEBUG
log4j.logger.org.apache.parquet=DEBUG
log4j.logger.parquet=DEBUG

- Prudhvi