I'm not sure anyone could even guess at possible reasons - there's simply not enough information here: no driver/executor logs, no job/stage/executor details, no code.
On Thu, Jan 21, 2021 at 8:25 PM Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> I'd look at stages and jobs as it's possible that the only task running is
> the missing one in a stage of a job. Just guessing...
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://about.me/JacekLaskowski
> "The Internals Of" Online Books <https://books.japila.pl/>
> Follow me on https://twitter.com/jaceklaskowski
>
> On Thu, Jan 21, 2021 at 12:19 PM Eric Beabes <mailinglist...@gmail.com>
> wrote:
>
>> Hello,
>>
>> My Spark Structured Streaming application was performing well for quite
>> some time but all of a sudden from today it has slowed down. I noticed in
>> the Spark UI that the 'No. of Active Tasks' is 1 even though 64 Cores are
>> available. (Please see the attached image.)
>>
>> I don't believe there's any data skew issue related to partitioning of
>> data. What could be the reason for this? Please advise. Thanks.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
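
For what it's worth, a single active task across 64 cores often points at a stage that only has one partition to work on - e.g. a source that currently exposes one partition, or parallelism settings that were changed or lost on redeploy. This is only a guess without the logs, but these are the standard knobs to double-check in spark-defaults.conf (or the equivalent --conf flags); the value 64 below is just an example matched to the core count mentioned in the thread, not a recommendation:

```
# Number of partitions used for shuffles in DataFrame/SQL operations
# (stateful Structured Streaming aggregations shuffle through this).
spark.sql.shuffle.partitions    64

# Default parallelism for RDD-level operations.
spark.default.parallelism       64
```

It may also be worth comparing the number of input partitions the source reports (for Kafka, the topic's partition count) against the parallelism you expect, since the stage can never have more tasks than partitions.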