We're using Spark 2.4. We recently pushed a product to production that uses Spark Structured Streaming. It works well most of the time, but occasionally, when the load is high, we've noticed there are only about 10 'Active Tasks' even though we've provided 128 cores. We'd like to debug this further. Why are all the cores not getting used? How do we debug this? Please help. Thanks.
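For context, our job reads from a streaming source roughly like the sketch below (the topic name, broker address, and partition counts are placeholders, and it assumes an existing SparkSession named `spark`). We're wondering whether per-stage parallelism is being capped by the number of input partitions or by `spark.sql.shuffle.partitions`, since in Spark the task count of a stage is bounded by that stage's partition count, not by the total executor cores:

```scala
// Sketch of our setup; names are placeholders, `spark` is an
// existing SparkSession. A source with ~10 partitions produces
// only ~10 tasks per stage, regardless of how many cores exist.
val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092") // placeholder
  .option("subscribe", "events")                    // placeholder topic
  .load()

// Things we plan to check:
// 1. The partition count of the source (e.g. Kafka topic partitions).
// 2. Shuffle parallelism for aggregation stages (defaults to 200):
spark.conf.set("spark.sql.shuffle.partitions", "128")

// 3. Whether an explicit repartition spreads downstream work
//    across all 128 cores:
val widened = df.repartition(128)
```

Is this the right direction, or is there a better way to trace low task counts from the Spark UI?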
- Debugging tools for Spark Structured Streaming Eric Beabes
- Re: Debugging tools for Spark Structured Streaming Artemis User