GitHub user zjffdu commented on the issue:
    @Leemoonsoo Finally got the unit tests to pass (the one remaining failure is
    The test failures are actually caused by several bugs in `SparkInterpreter`.
Here are the steps to reproduce the first issue.
    1. Use Spark 2.0 and make `SparkInterpreter` scoped.
    2. Open note1 and run sample code to start a `SparkInterpreter`, then open
note2 and run sample code to start another `SparkInterpreter`; you will
hit the following issue.
    The root cause of this issue is that the `outputDir` should be unique per
instance; otherwise the second `SparkInterpreter` instance cannot find classes
in the `outputDir` of the previous `SparkInterpreter`.
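The per-instance directory idea can be sketched as below. This is only an illustration, not Zeppelin's actual code: the class and method names are made up, and only the use of `Files.createTempDirectory` to guarantee a distinct REPL class-output directory per interpreter instance is the point.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class OutputDirDemo {
    // Hypothetical sketch: each interpreter instance gets its own REPL
    // class-output directory instead of all instances sharing one path.
    static Path createUniqueOutputDir() throws IOException {
        // createTempDirectory returns a fresh, unique directory on every call
        return Files.createTempDirectory("spark-repl-output-");
    }

    public static void main(String[] args) throws IOException {
        Path first = createUniqueOutputDir();
        Path second = createUniqueOutputDir();
        // Two interpreter instances no longer share generated class files
        System.out.println(!first.equals(second)); // prints "true"
    }
}
```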
    The second bug is that we should also set `sparkSession` to null on close;
otherwise it won't be re-created by the next `SparkInterpreter`.
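The stale-reference problem can be sketched with a minimal holder class. All names here (`SessionHolder`, `getOrCreateSession`, `close`) are stand-ins, not Zeppelin's real API; the point is only that lazy creation plus a missing reset leaves the next interpreter looking at a dead session.

```java
public class SessionHolder {
    // Stands in for the lazily created SparkSession field.
    private Object sparkSession;

    public Object getOrCreateSession() {
        if (sparkSession == null) {
            // Real code would build a SparkSession here.
            sparkSession = new Object();
        }
        return sparkSession;
    }

    public void close() {
        // Without this reset, getOrCreateSession() keeps returning the old
        // reference and a fresh session is never created for the next user.
        sparkSession = null;
    }

    public boolean hasSession() {
        return sparkSession != null;
    }
}
```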
    The third bug is that we should disable `HiveContext` in
`AbstractTestRestApi`; otherwise we hit the issue of multiple Derby
instances running.
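One way the test setup could force this off is sketched below. `zeppelin.spark.useHiveContext` is the Zeppelin interpreter property that controls Hive support; the helper method and class name are hypothetical, and how `AbstractTestRestApi` actually wires properties in may differ.

```java
import java.util.Properties;

public class TestConfigDemo {
    // Hypothetical sketch: the REST-API test base class could disable the
    // HiveContext so each test JVM does not start its own embedded Derby
    // metastore (two Derby instances on one metastore directory conflict).
    static Properties sparkTestProperties() {
        Properties props = new Properties();
        props.setProperty("zeppelin.spark.useHiveContext", "false");
        return props;
    }
}
```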
