Be aware that older Java 8 versions count the number of cores on the host, not
those allocated to the container they run in:
https://bugs.openjdk.java.net/browse/JDK-8140793
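
A rough sketch of how I'd guard against that from executor-side code. The
helper name is made up, and reaching for SparkEnv (a developer API) from
inside an InputPartitionReader is an assumption on my part, not something the
DataSource V2 API guarantees:

    import org.apache.spark.SparkEnv

    // Prefer the value Spark was configured with; only fall back to the JVM's
    // own view, which on older Java 8 inside a container is the host's count.
    def coresForThisExecutor(): Int = {
      val conf = SparkEnv.get.conf   // executor-side SparkConf
      conf.getOption("spark.executor.cores")
        .map(_.toInt)
        .getOrElse(Runtime.getRuntime.availableProcessors)
    }

I believe newer 8u releases also added container-awareness JVM flags (e.g.
-XX:ActiveProcessorCount=<n>), which can be passed via
spark.executor.extraJavaOptions as a workaround.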

On Tue, Jun 18, 2019 at 8:13 PM Ilya Matiach <il...@microsoft.com.invalid>
wrote:

> Hi Andrew,
>
> I tried to do something similar in the LightGBM
> classifier/regressor/ranker in the mmlspark package: I try to use the Spark
> conf, and if it is not configured I get the processor count from the JVM
> directly:
>
>
> https://github.com/Azure/mmlspark/blob/master/src/lightgbm/src/main/scala/LightGBMUtils.scala#L172
>
>
>
> If you know of a better way, please let me know!
>
>
>
>     val spark = dataset.sparkSession
>     try {
>       val confCores = spark.sparkContext.getConf
>         .get("spark.executor.cores").toInt
>       val confTaskCpus = spark.sparkContext.getConf
>         .get("spark.task.cpus", "1").toInt
>       confCores / confTaskCpus
>     } catch {
>       case _: NoSuchElementException =>
>         // If spark.executor.cores is not defined, get the cores per JVM
>         import spark.implicits._
>         val numMachineCores = spark.range(0, 1)
>           .map(_ => java.lang.Runtime.getRuntime.availableProcessors)
>           .collect.head
>         numMachineCores
>     }
>
>
>
> Thank you, Ilya
>
>
>
> From: Andrew Melo <andrew.m...@gmail.com>
> Sent: Tuesday, June 18, 2019 11:32 AM
> To: dev <dev@spark.apache.org>
> Subject: Detect executor core count
>
>
>
> Hello,
>
>
>
> Is there a way to detect the number of cores allocated for an executor
> within a Java-based InputPartitionReader?
>
>
>
> Thanks!
>
> Andrew
>
