Github user ueshin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20089#discussion_r158789947
  
    --- Diff: python/README.md ---
    @@ -29,4 +29,4 @@ The Python packaging for Spark is not intended to replace all of the other use c
     
     ## Python Requirements
     
    -At its core PySpark depends on Py4J (currently version 0.10.6), but additional sub-packages have their own requirements (including numpy and pandas).
    +At its core PySpark depends on Py4J (currently version 0.10.6), but additional sub-packages have their own requirements (including numpy, pandas, and pyarrow).
    --- End diff ---
    
    I added some more details. WDYT?
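
    As a side note for anyone following this thread, here is a rough sketch of why pyarrow ends up on the requirements list: the Arrow-backed conversion between Spark and pandas DataFrames is the sub-package feature that needs it. The snippet below is illustrative only and is not part of this diff; the `spark.sql.execution.arrow.enabled` config name reflects Spark 2.3-era behavior.

    ```python
    # Minimal sketch, assuming a Spark 2.3-era PySpark with pandas and pyarrow installed.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("arrow-example").getOrCreate()

    # Config name as of Spark 2.3; later releases rename it to
    # spark.sql.execution.arrow.pyspark.enabled.
    spark.conf.set("spark.sql.execution.arrow.enabled", "true")

    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
    pdf = df.toPandas()  # uses pyarrow for the conversion when Arrow execution is enabled
    print(pdf)

    spark.stop()
    ```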


---
