Re: Contribution and choice of language

2015-07-14 Thread srinivasraghavansr71
I saw the contribution sections. As a new contributor, should I try to build
patches, or can I add a new algorithm to MLlib? I am comfortable with
Python and R. Are they enough to contribute to Spark?
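
To show the kind of Python I am comfortable with, here is a minimal sketch
against the pyspark.mllib classification API (the class names are from the
1.x Python docs and the toy data is made up, so treat it purely as an
illustration):

from pyspark import SparkContext
from pyspark.mllib.regression import LabeledPoint
from pyspark.mllib.classification import LogisticRegressionWithSGD

sc = SparkContext(appName="mllib-python-sketch")

# Toy training data: (label, [features])
data = sc.parallelize([
    LabeledPoint(0.0, [0.0, 1.0]),
    LabeledPoint(1.0, [1.0, 0.0]),
])

# Train a simple classifier through the Python wrapper; the underlying
# implementation lives in Scala under mllib/.
model = LogisticRegressionWithSGD.train(data, iterations=10)
print(model.predict([1.0, 0.0]))

sc.stop()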






Contribution and choice of language

2015-07-13 Thread srinivasraghavansr71
Hello everyone,
   I am interested in contributing to Apache Spark. I am most
inclined towards algorithms and computational methods for matrices (a small
sketch of the kind of computation I mean follows the questions below). I took
a course on edX where Spark was taught through the Python interface. So my
doubts are as follows:

1. Where should I start working?
2. Language for coding: is using Python okay, or is there a particular
language I should use?
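
As an example of the matrix-oriented work I have in mind, here is a rough
RDD-based sketch in Python (using numpy and made-up toy data, only as an
illustration):

import numpy as np
from pyspark import SparkContext

sc = SparkContext(appName="gramian-sketch")

# Rows of a tall-and-skinny matrix A, stored as an RDD of numpy arrays.
rows = sc.parallelize([
    np.array([1.0, 2.0]),
    np.array([3.0, 4.0]),
    np.array([5.0, 6.0]),
])

# Compute the Gramian A^T A as a sum of per-row outer products.
gramian = rows.map(lambda r: np.outer(r, r)).reduce(lambda a, b: a + b)
print(gramian)

sc.stop()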






Contribution

2015-06-12 Thread srinivasraghavansr71
Hi everyone,
 I am interested in contributing new algorithms and optimizing
existing ones in the areas of graph algorithms and machine learning. Please
give me some ideas on where to start. Is it possible for me to introduce
neural networks into Apache Spark?
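
For example, here is a rough RDD-based PageRank I have experimented with in
Python (a toy sketch over a made-up edge list, not GraphX's own
implementation):

from pyspark import SparkContext

sc = SparkContext(appName="rdd-pagerank-sketch")

# Toy edge list: (src, dst) pairs of a tiny directed graph.
edges = sc.parallelize([("a", "b"), ("a", "c"), ("b", "c"), ("c", "a")])

# Adjacency lists and initial ranks.
links = edges.groupByKey().cache()
ranks = links.mapValues(lambda _: 1.0)

# A few power-iteration steps with a 0.85 damping factor.
for _ in range(10):
    contribs = links.join(ranks).flatMap(
        lambda kv: [(dst, kv[1][1] / len(kv[1][0])) for dst in kv[1][0]])
    ranks = contribs.reduceByKey(lambda a, b: a + b).mapValues(
        lambda s: 0.15 + 0.85 * s)

print(ranks.collect())
sc.stop()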


