HyukjinKwon commented on a change in pull request #34134:
URL: https://github.com/apache/spark/pull/34134#discussion_r718109786
##########
File path: docs/rdd-programming-guide.md
##########
@@ -241,12 +241,12 @@ For a complete list of options, run `spark-shell --help`.
Behind the scenes,
In the PySpark shell, a special interpreter-aware SparkContext is already created for you, in the
variable called `sc`. Making your own SparkContext will not work. You can set which master the
context connects to using the `--master` argument, and you can add Python .zip, .egg or .py files
-to the runtime path by passing a comma-separated list to `--py-files`. You can also add dependencies
+to the runtime path by passing a comma-separated list to `--py-files`. For thrid-party Python dependencies,
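(As a quick illustration of the flags this hunk documents, a minimal sketch of the shell invocation; the master URL and the file name code.py are placeholders, not values from the patch:

    $ ./bin/pyspark --master "local[4]" --py-files code.py

Here `--master` picks the cluster to connect to and `--py-files` ships the listed Python files to the runtime path.)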
Review comment:
oops, thanks man
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]