Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/20089#discussion_r158797489
--- Diff: python/README.md ---
@@ -29,4 +29,4 @@ The Python packaging for Spark is not intended to replace all of the other use c
## Python Requirements
-At its core PySpark depends on Py4J (currently version 0.10.6), but additional sub-packages have their own requirements (including numpy and pandas).
+At its core PySpark depends on Py4J (currently version 0.10.6), but additional sub-packages might have their own requirements declared as "Extras" (including numpy, pandas, and pyarrow). You can install the requirements by specifying their extra names.
--- End diff ---
Not a big deal anyway. I am actually fine with keeping it as is too if you prefer, @ueshin.
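
For context, an "Extras"-style install like the proposed wording describes would look roughly like the sketch below. The `sql` extra name is an assumption here, used only to illustrate an extra that would pull in pandas and pyarrow.

```bash
# Minimal sketch of installing PySpark with an extra, assuming a "sql"
# extra is declared that pulls in pandas and pyarrow.
pip install "pyspark[sql]"

# Multiple extra names can be combined in one install, e.g. sql plus ml:
pip install "pyspark[sql,ml]"
```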