Hello Community,
I have what I hope are a couple of quick questions regarding port control
for Spark when it runs on YARN (cluster and client modes). I'm aware that I
can pin ports by setting spark.driver.port, spark.executor.port, etc. to
specific values, but I'm not sure how (or whether) those settings carry over
when Spark is executed in yarn-cluster and/or yarn-client mode.
Questions I have are:
1) How does spark.yarn.am.port relate to the ports defined within Spark
(driver, executor, block manager, etc.)?
2) Does the spark.yarn.am.port parameter relate only to
spark.driver.port?
3) Is spark.yarn.am.port applicable to yarn-cluster mode, yarn-client mode,
or both?
Ultimately, I'm trying to remove much of the randomness in port assignment
to avoid potential conflicts, whether by specifying a single port per
service or a range of ports.
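For context, this is the sort of explicit configuration I've been experimenting with in spark-defaults.conf (the property names are from the Spark configuration docs; the port numbers themselves are just placeholder values I picked):

```
# spark-defaults.conf -- pin ports instead of letting Spark choose random ones
spark.driver.port        40000   # driver <-> executor RPC endpoint
spark.blockManager.port  40010   # block manager on driver and executors
spark.yarn.am.port       40020   # YARN Application Master (the property I'm asking about)
spark.port.maxRetries    16      # retries N successive ports if the chosen one is taken
```

My understanding is that spark.port.maxRetries effectively turns each fixed port into a small range (port .. port+N), which is why I mention "a specified port or range of ports" above, but I'd appreciate confirmation of how this interacts with the YARN modes.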
Cheers,
Grant
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Port-Control-for-YARN-Aware-Spark-tp25458.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org