Thanks.

On Tuesday, July 14, 2015, Shivaram Venkataraman <[email protected]> wrote:
> Both SparkR and the PySpark API call into the JVM Spark API (i.e.,
> JavaSparkContext, JavaRDD, etc.). They use different methods (Py4J vs.
> the R-Java bridge) to call into the JVM, based on the libraries
> available and the features supported in each language. So for Haskell,
> one would need to see what the best way is to call the underlying Java
> API functions from Haskell and get results back.
>
> Thanks
> Shivaram
>
> On Mon, Jul 13, 2015 at 8:51 PM, Vasili I. Galchin <[email protected]> wrote:
>
>> Hello,
>>
>> So far I think there are at least two ways (maybe more) to interact
>> with the Spark Core from various programming languages: the PySpark
>> API and the R API. From reading the code, it seems that the PySpark
>> approach and the R approach are very disparate, with the latter using
>> the R-Java bridge. Regarding Haskell, I am trying to decide which way
>> to go. I realize that, as in any open software effort, approaches have
>> varied based on history. Is there an intent to adopt one approach as
>> the standard? (Not trying to start a war :-) :-(.)
>>
>> Vasili
>>
>> BTW, I guess the Java and Scala APIs are simple, given the nature of
>> both languages vis-a-vis the JVM??
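
To make the mechanism Shivaram describes concrete: Py4J works by running
a GatewayServer inside the JVM; the Python process connects to it over a
socket and manipulates Java objects through proxies. Below is a minimal
sketch of that pattern, independent of Spark. It assumes a JVM with a
py4j GatewayServer already listening on the default port (PySpark starts
one for you when it launches the driver JVM), and java.util.Random is
just a stand-in for any JVM-side class:

    from py4j.java_gateway import JavaGateway

    # Connect to a JVM that is already running a py4j GatewayServer.
    gateway = JavaGateway()

    # Java classes are resolved by fully qualified name; every method
    # call on the resulting proxy is forwarded over the socket.
    random = gateway.jvm.java.util.Random()
    print(random.nextInt(10))

PySpark layers its API on exactly this bridge: a pyspark SparkContext
holds the wrapped JavaSparkContext as sc._jsc and a view of the
gateway's JVM as sc._jvm, which is how RDD operations ultimately reach
the Scala/Java core. A Haskell binding would need an equivalent bridge,
whether socket-based like Py4J or in-process like the R-Java bridge.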
