Hello,

I am currently taking a course on Apache Spark via edX
(https://www.edx.org/course/introduction-big-data-apache-spark-uc-berkeleyx-cs100-1x),
and at the same time I am trying to read through the PySpark code. I wanted
to ask: if I would ideally like to contribute to PySpark specifically, how
can I do that? I do not intend to contribute to core Apache Spark any time
soon (mainly because I do not know Scala), but I am very comfortable in
Python.

Any tips on how to contribute specifically to PySpark, without having to
work with the other parts of Spark, would be greatly appreciated.

P.S.: I ask this because there is a small change/improvement I would like
to propose. Also, since I have just started learning Spark, I would like to
read and understand the PySpark code as I learn. :)

Hope to hear from you soon.

Usman Ehtesham Gul
https://github.com/ueg1990
