Hi,
I keep hearing a common theme that serious programming on Spark (we're on
1.5.1) should only be done in Scala, and that real power users use Scala. It
is said that Python is great for exploratory analytics, but in the end the
code should be rewritten in Scala for production. There are a number of
reasons given:

1. Spark is written in Scala, so the Scala API will always be faster than any
other language API built on top of it (see the rough sketch below).
2. Spark releases tend to expose new features in the Scala API first, with the
Python API lagging behind.
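
For what it's worth, here is a rough PySpark sketch of how I understand claim
1 (the SparkContext/SQLContext setup and the people.json file are just
placeholders I made up): the DataFrame query is planned by Catalyst and
executed on the JVM whichever API you use, while the RDD filter with a Python
lambda ships rows out to Python worker processes.

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext("local[*]", "parity-sketch")
    sqlContext = SQLContext(sc)

    # DataFrame route: the filter/groupBy is optimised by Catalyst and runs
    # on the JVM, so Python and Scala should perform roughly the same here.
    df = sqlContext.read.json("people.json")
    df.filter(df["age"] > 21).groupBy("age").count().show()

    # RDD route: the Python lambda runs in Python worker processes, so rows
    # get serialised back and forth; this is where Scala is usually faster.
    adults = df.rdd.filter(lambda row: row.age is not None and row.age > 21).count()
    print(adults)

    sc.stop()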

Is there any truth to the above? I'm a little sceptical.

Thanks
Dan




