Github user zjffdu commented on the issue:

    https://github.com/apache/zeppelin/pull/1404
  
    @Leemoonsoo Finally got the unit tests to pass (the one remaining failure is unrelated).
    
    The test failures were actually caused by several bugs in `SparkInterpreter`. Here are the steps to reproduce the first issue:
    1. Use Spark 2.0 and set `SparkInterpreter` to scoped mode.
    2. Open note1 and run sample code to start a `SparkInterpreter`, then open note2 and run sample code to start another `SparkInterpreter`; you will hit the following issue.
     
    
![image](https://cloud.githubusercontent.com/assets/164491/18614389/4887684c-7dc0-11e6-9898-18fa8274be6d.png)
    
    The root cause of this issue is that `outputDir` must be unique per interpreter instance; otherwise the second `SparkInterpreter` instance cannot find the classes in the `outputDir` of the previous `SparkInterpreter`.
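
    A minimal sketch of the kind of fix, assuming the REPL class-output directory is created when the interpreter opens; the class and method names here are illustrative, not the actual Zeppelin code:

    ```java
    import java.io.File;
    import java.io.IOException;
    import java.nio.file.Files;

    // Illustrative sketch: give every interpreter instance its own REPL class-output
    // directory instead of a shared fixed path, so classes compiled by one
    // SparkInterpreter never hide those of another.
    public class OutputDirSketch {
      public static File createUniqueOutputDir() throws IOException {
        // createTempDirectory appends a random suffix, so concurrent interpreter
        // instances (e.g. one per note in scoped mode) each get a distinct directory.
        File outputDir = Files.createTempDirectory("spark-repl-output").toFile();
        outputDir.deleteOnExit();
        return outputDir;
      }
    }
    ```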
    
    The second bug is that we should also set `sparkSession` to null; otherwise it won't be created for the next `SparkInterpreter`.
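
    A hedged sketch of what that reset might look like, assuming the context and session are held as static singletons that get cleared on close (field and class names are illustrative):

    ```java
    import org.apache.spark.SparkContext;
    import org.apache.spark.sql.SparkSession;

    // Illustrative sketch: clear *both* singletons when an interpreter closes.
    // If sparkSession keeps pointing at the stopped context, the next
    // SparkInterpreter sees a non-null reference and never builds a fresh session.
    public class CloseSketch {
      private static SparkContext sc;
      private static SparkSession sparkSession;

      public static synchronized void close() {
        if (sc != null) {
          sc.stop();
        }
        sc = null;
        sparkSession = null;  // the missing reset behind the second bug
      }
    }
    ```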
    
    The third bug is that we should disable `HiveContext` in `AbstractTestRestApi`; otherwise we hit the issue of multiple Derby instances running, since the embedded Derby metastore database can only be booted by one instance at a time.
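
    A sketch of how the test setup could force a plain `SQLContext`; the `zeppelin.spark.useHiveContext` property name is the usual Spark-interpreter setting, but verify it against the actual test code:

    ```java
    import java.util.Properties;

    // Illustrative sketch: run the test interpreters with HiveContext disabled so
    // each SparkInterpreter falls back to a plain SQLContext. With HiveContext on,
    // every interpreter tries to boot its own embedded Derby metastore, and only
    // one Derby instance can own that database at a time.
    public class TestHiveContextSketch {
      public static Properties sparkTestProperties() {
        Properties props = new Properties();
        props.setProperty("zeppelin.spark.useHiveContext", "false");
        return props;
      }
    }
    ```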


