Could anyone shed some light on this?

Thanks,
Boric

On Tue, Jan 19, 2016 at 4:12 PM, Boric Tan <it.news.tre...@gmail.com> wrote:

> Hi there,
>
> I am new to Spark and would like some help understanding whether Spark
> can take advantage of the underlying hardware architecture for better
> performance. If so, how does it do it?
>
> For example, suppose a cluster is built from machines with different
> CPUs. Will Spark check each machine's CPU information and apply
> machine-specific settings to the tasks assigned to that machine? Or does it
> depend entirely on the underlying JVM implementation that runs the JAR
> file, so that the JVM is the place to check whether certain CPU features
> can be used?
>
> Thanks,
> Boric
>
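
For reference, one quick way to see what each executor's JVM actually reports about its host is to run a small probe from the driver. This is only a minimal sketch, assuming a live SparkContext named sc on the cluster in question; the per-host summary string is purely illustrative, and partitions are not guaranteed to land on every node, but it gives a first look:

    // Ask the JVM running each partition what it knows about its machine.
    import java.net.InetAddress

    val report = sc.parallelize(1 to sc.defaultParallelism, sc.defaultParallelism)
      .mapPartitions { _ =>
        val host  = InetAddress.getLocalHost.getHostName        // executor host
        val arch  = System.getProperty("os.arch")               // e.g. amd64
        val cores = Runtime.getRuntime.availableProcessors()    // JVM-visible cores
        val jvm   = System.getProperty("java.vm.name")          // JVM in use
        Iterator(s"$host: arch=$arch, cores=$cores, jvm=$jvm")
      }
      .distinct()
      .collect()

    report.foreach(println)

That at least shows which JVM, architecture, and core count each worker reports, which is a starting point for the question above.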
