Hello,

1) I have been rereading the kind email responses to my earlier Spark queries. Thanks.

2) I have also been reading the R code and related documentation:
   1) RDD.R
   2) DataFrame.R
   3) the Spark internals pages => https://cwiki.apache.org/confluence/display/SPARK/Spark+Internals
   4) the Python programming guide => https://spark.apache.org/docs/latest/programming-guide.html

Based on the above, when I see calls to a function "callJMethod" in, e.g., DataFrame.R, I have two questions:

- Is callJMethod calling into the JVM, or more precisely into the Spark Java API? Please answer as precisely as possible, without ambiguity; that will save a lot of back-and-forth.

- If the above is true, how is the JVM "imported" into (made available to) the DataFrame.R code?

I would appreciate it if your answers were interleaved with my questions, to avoid ambiguity.

Thanks,
Vasili
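P.S. For concreteness, the kind of call I am asking about looks roughly like the following, paraphrased from my reading of DataFrame.R (I may be misremembering the exact method, and I am only guessing that the x@sdf slot holds a reference to a Java-side object):

    # Paraphrased sketch of a SparkR DataFrame method, as I understand it:
    setMethod("count",
              signature(x = "DataFrame"),
              function(x) {
                # Is this line crossing into the JVM / the Spark Java API,
                # and if so, how did R obtain the x@sdf object it calls into?
                callJMethod(x@sdf, "count")
              })

In other words, I am trying to pin down exactly what happens between that callJMethod line and the eventual execution of count() on the Java/Scala side.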