We're running Spark 2.4 and recently pushed to production a product that uses
Spark Structured Streaming. It works well most of the time, but occasionally,
under high load, we see only around 10 'Active Tasks' in the Spark UI even
though we've allocated 128 cores. We'd like to debug this further: why aren't
all the cores being used, and how do we investigate? Please help. Thanks.
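In case it's relevant, these are the parallelism-related settings we've been looking at. The values below are illustrative, not our actual config; our understanding is that each stage can run at most as many tasks as it has partitions, so cores beyond the partition count sit idle:

```properties
# Partition count for shuffles in SQL/DataFrame/streaming aggregations;
# stages after a shuffle get at most this many tasks.
spark.sql.shuffle.partitions=200

# Default partition count for RDD operations without an explicit value.
spark.default.parallelism=128

# Cores per executor; total cores = instances * cores per executor.
spark.executor.cores=4
spark.executor.instances=32
```

For the source stage, we understand the task count is bounded by the number of input partitions (e.g., Kafka topic partitions), not by these settings.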
