Hello,

     So far I see at least two ways (maybe more) to interact with the
Spark Core from various programming languages: the PySpark API and the
R API. From reading the code, the PySpark approach and the R approach
seem very disparate ... with the latter using the R-Java bridge. With
that in mind, I am trying to decide which way to go for Haskell. I
realize that, as with any open software effort, the approaches vary
based on history. Is there an intent to adopt one approach as the
standard? (Not trying to start a war :-) :-(.

Vasili

BTW I guess the Java and Scala APIs are simpler, given the nature of
both languages vis-a-vis the JVM??

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
