A pretty large fraction of users use Java, but a few features are still not 
available in it. JdbcRDD is one of them -- this functionality will likely be 
superseded by Spark SQL once we add JDBC as a data source. In the meantime, to 
use it, I'd recommend writing a class in Scala that exposes Java-friendly 
methods and obtaining the RDD from that class. Basically, the two parameters 
that weren't Java-friendly were the ClassTag and the getConnection and mapRow 
functions.
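For example, a minimal sketch of such a shim might look like the following (the object and method names here are hypothetical, and this assumes the current JdbcRDD constructor, whose SQL string must contain two '?' placeholders for the partition bounds):

```scala
import java.sql.{DriverManager, ResultSet}
import org.apache.spark.SparkContext
import org.apache.spark.api.java.JavaRDD
import org.apache.spark.rdd.JdbcRDD

// Hypothetical helper: hides the ClassTag and the getConnection/mapRow
// function parameters so the RDD can be created from plain Java code.
object JavaJdbcRDD {
  def create(sc: SparkContext,
             url: String,
             sql: String,        // must contain two '?' placeholders
             lowerBound: Long,
             upperBound: Long,
             numPartitions: Int): JavaRDD[Array[Object]] = {
    // Build the Scala function values on this side, so Java never sees them.
    val getConnection = () => DriverManager.getConnection(url)
    val mapRow = (rs: ResultSet) =>
      (1 to rs.getMetaData.getColumnCount)
        .map(i => rs.getObject(i).asInstanceOf[Object]).toArray
    // The implicit ClassTag[Array[Object]] is resolved here, in Scala.
    val rdd = new JdbcRDD(sc, getConnection, sql,
                          lowerBound, upperBound, numPartitions, mapRow)
    JavaRDD.fromRDD(rdd)
  }
}
```

From Java you could then call something like 
JavaJdbcRDD.create(sc, "jdbc:...", "SELECT * FROM t WHERE id >= ? AND id <= ?", 1, 1000, 10) 
and get back a JavaRDD of Object arrays, one per row.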

Subclassing RDD in Java is also not really supported, because that's an 
internal API. We don't expect users to be defining their own RDDs.

Matei

> On Oct 28, 2014, at 11:47 AM, critikaled <isasmani....@gmail.com> wrote:
> 
> Hi Ron,
> Whatever API you have in Scala you can use from Java. Scala is
> interoperable with Java and vice versa. Scala, being both object-oriented
> and functional, will make your job easier on the JVM, and it is more
> concise than Java. Take it as an opportunity and start learning Scala ;).
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Is-Spark-in-Java-a-bad-idea-tp17534p17538.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 

