So, can I increase the number of threads by setting it manually in my Spark
code?
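For context, in local mode the number of executor worker threads is fixed by the master URL, not by the data size. A minimal sketch of the two usual knobs (the value 8 and the app/jar names are illustrative, not recommendations):

```
# spark-submit: local[N] runs Spark in a single JVM with N worker threads;
# local[*] uses one thread per available core.
spark-submit --master "local[8]" --class MyApp my-app.jar

# spark-defaults.conf: default number of partitions (tasks) for shuffles,
# which controls how much work is available to run on those threads.
spark.default.parallelism  8
```

Note that raising N above the number of physical cores generally will not make more threads execute simultaneously; the OS just time-slices them.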

On Sat, Feb 7, 2015 at 6:52 PM, Sean Owen <so...@cloudera.com> wrote:

> If you look at the threads, the other 30 are almost surely not Spark
> worker threads. They're the JVM finalizer, GC threads, Jetty
> listeners, etc. Nothing wrong with this. Your OS has hundreds of
> threads running now, most of which are idle, and up to 4 of which can
> be executing.  In a one-machine cluster, I don't think you would
> expect any difference in number of running threads. More data does not
> mean more threads, no. Your executor probably takes as many threads as
> cores in both cases, 4.
>
>
> On Sat, Feb 7, 2015 at 10:14 AM, Deep Pradhan <pradhandeep1...@gmail.com>
> wrote:
> > Hi,
> > I am using the YourKit tool to profile Spark jobs that run in my
> > single-node Spark cluster.
> > When I see the YourKit UI Performance Charts, the thread count always
> > remains at
> > All threads: 34
> > Daemon threads: 32
> >
> > Here are my questions:
> >
> > 1. My system can run only 4 threads simultaneously, so it clearly cannot
> > be executing 34 threads at once. What do the 34 threads mean?
> >
> > 2. I tried running the same job with four different datasets, two small
> > and two relatively big. In the UI the thread count increased by two in
> > every case, irrespective of data size. Does this mean the framework does
> > not scale the number of threads allocated to each job with the size of
> > the data?
> >
> > Thank You
>
