Github user ueshin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20089#discussion_r158799303
  
    --- Diff: python/README.md ---
    @@ -29,4 +29,4 @@ The Python packaging for Spark is not intended to replace all of the other use c
     
     ## Python Requirements
     
    -At its core PySpark depends on Py4J (currently version 0.10.6), but additional sub-packages have their own requirements (including numpy and pandas).
    +At its core PySpark depends on Py4J (currently version 0.10.6), but additional sub-packages might have their own requirements declared as "Extras" (including numpy, pandas, and pyarrow). You can install the requirements by specifying their extra names.
    --- End diff --
    
    Let's use the simple one you suggested and leave the detailed description for future PRs.
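    
    For context, the "Extras" mechanism the diff refers to is the standard setuptools/pip one. A minimal sketch of how a user would install an extra (the extra name `sql` is an assumption for illustration; the actual names are defined in PySpark's `setup.py`, not in this thread):
    
    ```shell
    # Hypothetical example: install PySpark plus the dependencies declared
    # under an extra named "sql" (assumed name), e.g. pandas and pyarrow.
    # Quoting avoids shell globbing on the square brackets.
    pip install "pyspark[sql]"
    ```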


---
