itholic commented on code in PR #40324:
URL: https://github.com/apache/spark/pull/40324#discussion_r1130421548
##########
docs/index.md:
##########
@@ -49,8 +49,19 @@ For Java 11, `-Dio.netty.tryReflectionSetAccessible=true` is required additional

 # Running the Examples and Shell

-Spark comes with several sample programs. Scala, Java, Python and R examples are in the
-`examples/src/main` directory. To run one of the Java or Scala sample programs, use
+Spark comes with several sample programs. Python, Scala, Java and R examples are in the
+`examples/src/main` directory.
+
+To run Spark interactively in a Python interpreter, use
+`bin/pyspark`:
+
+    ./bin/pyspark --master local[2]

Review Comment:
   Yeah, I think we can just update this to `./bin/pyspark --master local` or simply `./bin/pyspark` while we're here :-) Either way looks fine to me if it works properly (I checked that both work fine in my local environment).
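For reference, a minimal sketch of the two variants mentioned above, as they might appear in the docs snippet (both commands are quoted from the comment; the note about the default master, typically `local[*]` when `--master` is omitted, is an assumption about current defaults rather than something confirmed in this thread):

    # explicit local master with a single worker thread
    ./bin/pyspark --master local

    # or omit --master and let pyspark fall back to its default (typically local[*])
    ./bin/pyspark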