Github user zjffdu commented on the issue:

    https://github.com/apache/zeppelin/pull/2709
  
    @Leemoonsoo The binary package is the same as before: spark/scala-2.11 and 
spark/scala-2.10 are packaged together into 
`spark-interpreter-0.8.0-SNAPSHOT.jar`, which doesn't contain the Scala library. 
The Scala library comes from the Spark distribution, so `SparkInterpreter` will 
load the correct Scala-version implementation (`SparkScala211Interpreter` or 
`SparkScala210Interpreter`) based on the Scala version in your Spark 
distribution (which depends on the `SPARK_HOME` you set).
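
    The dispatch described above can be sketched roughly as follows. This is a 
hypothetical illustration, not Zeppelin's actual code: only the two interpreter 
class names come from the comment above, and the version-string matching is an 
assumption about how the selection could work.

    ```java
    // Hypothetical sketch: pick the interpreter class name matching the
    // Scala version found in the Spark distribution (e.g. "2.11.8").
    public class ScalaVersionDispatch {

        static String interpreterClassFor(String scalaVersion) {
            // Class names taken from the PR comment; prefix matching is assumed.
            if (scalaVersion.startsWith("2.11")) {
                return "SparkScala211Interpreter";
            }
            if (scalaVersion.startsWith("2.10")) {
                return "SparkScala210Interpreter";
            }
            throw new IllegalArgumentException(
                "Unsupported Scala version: " + scalaVersion);
        }

        public static void main(String[] args) {
            System.out.println(interpreterClassFor("2.11.8"));
        }
    }
    ```

    In the real interpreter the chosen class would then be instantiated via 
reflection from the single jar, which is why neither Scala library needs to be 
bundled.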

