Github user gengliangwang commented on the issue:
https://github.com/apache/spark/pull/23049
@vanzin I see your point. I will add a link to
https://spark.apache.org/docs/latest/configuration.html. Thanks for the
suggestion.
In my case, I didn't know where to find or edit `spark-env.sh` at the
time. I tried running `find . -name spark-env.sh` and got nothing.
The script prints `spark-env.sh` without its location only when
`SPARK_ENV_LOADED` is not set.
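For reference, the guard pattern in question looks roughly like the following. This is a simplified sketch, not the exact contents of `bin/load-spark-env.sh`; the fallback to `"${SPARK_HOME:-.}/conf"` here is an assumption standing in for the script's real conf-dir lookup:

```shell
# Sketch: spark-env.sh is sourced at most once per process tree.
# SPARK_ENV_LOADED is exported so child scripts skip this block.
if [ -z "${SPARK_ENV_LOADED}" ]; then
  export SPARK_ENV_LOADED=1
  # Assumed lookup: real script resolves the conf dir more carefully.
  SPARK_CONF_DIR="${SPARK_CONF_DIR:-"${SPARK_HOME:-.}/conf"}"
  if [ -f "${SPARK_CONF_DIR}/spark-env.sh" ]; then
    . "${SPARK_CONF_DIR}/spark-env.sh"
  fi
fi
```

Because the guard only checks whether the variable is set, any message about `spark-env.sh` emitted inside this block is naturally skipped on re-entry, which matches the behavior described above.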