The first prerequisite would be that Scala supports them. Beyond that, someone 
would need to redesign the Spark source code to actually take advantage of 
modules. That could be a rather handy outcome: a small but very well designed 
core (core, ml, graph, etc.) around which others write useful modules.
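
To make the idea concrete, here is a rough sketch of what JPMS module
descriptors for such a split might look like. This is purely illustrative:
Spark does not declare module-info descriptors today, and the module and
package names below are invented for the example.

// module-info.java for a hypothetical small "spark.core" module
module spark.core {
    // expose only the stable public API packages to downstream modules
    exports org.apache.spark;
    exports org.apache.spark.rdd;
}

// module-info.java (separate file) for a hypothetical add-on built on that core
module spark.ml {
    requires spark.core;          // depends only on the small core
    exports org.apache.spark.ml;  // offers its own API on top of it
}

Each add-on would then compile against the core's exported packages only,
which keeps the dependency graph explicit and lets the JVM enforce it at
startup instead of discovering breakage at runtime.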

> On 16. May 2018, at 08:20, xmehaut <xavier.meh...@gmail.com> wrote:
> 
> Hello,
> 
> I would like to know what the impacts of Java 10+ on Spark could be. I know
> that Spark is written in Scala, but the latest versions of Java include many
> improvements, especially in the JVM and in the delivery process (modules,
> JIT, memory management, ...) which could benefit Spark.
> 
> regards
> 
