Maybe something is wrong in my app too?
Thanks for your help,
NM
--
View this message in context: [Spark Streaming] Disk not being cleaned up during runtime after RDD being processed
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Disk-not-being-cleaned-up-during-runtime-after-RDD-being-processed-tp22271.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
Thanks for your help,
NM
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Disk-not-being-cleaned-up-during-runtime-after-RDD-being-processed-tp22240.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
Hi,
I’ve been trying to use Spark Streaming for my real-time analysis
application, using the Kafka stream API, on a YARN cluster of 6
executors, each with 4 dedicated cores and 8192 MB of dedicated RAM.
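For reference, a cluster with those resources would typically be launched with a spark-submit invocation along these lines (a sketch only; the application jar and main-class names are hypothetical placeholders, not taken from the original message):

```shell
# Hedged sketch of a spark-submit command matching the setup described:
# YARN deployment, 6 executors, 4 cores and 8192 MB of RAM per executor.
# "com.example.StreamingApp" and "streaming-app.jar" are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 6 \
  --executor-cores 4 \
  --executor-memory 8192m \
  --class com.example.StreamingApp \
  streaming-app.jar
```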
The thing is, my application should run 24/7, but disk usage keeps growing.
This