Hi Spark Users,
I'm trying to schedule resources in Spark 2.1.0 using the code below, but a single Spark application still captures all the CPU cores, so no other application can start. Could you please help me out? Here is the code:
from pyspark.sql import SparkSession

sqlContext = SparkSession.builder \
    .master("spark://172.26.7.192:7077") \
    .appName(APP_NAME) \
    .config("spark.sql.warehouse.dir", "/tmp/PM/") \
    .config("spark.sql.shuffle.partitions", "6") \
    .config("spark.cores.max", "5") \
    .config("spark.executor.cores", "2") \
    .config("spark.driver.memory", "8g") \
    .config("spark.executor.memory", "4g") \
    .getOrCreate()


Thanks & Best Regards,
Engr. Palash Gupta
WhatsApp/Viber: +8801817181502
Skype: palash2494

