MOBIN-F commented on a change in pull request #3523: 
[ZEPPELIN-4326][spark]Spark Interpreter restart failed when no suffic…
URL: https://github.com/apache/zeppelin/pull/3523#discussion_r348349368
 
 

 ##########
 File path: 
zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/LazyOpenInterpreter.java
 ##########
 @@ -32,6 +34,7 @@
     implements WrappedInterpreter {
   private Interpreter intp;
   volatile boolean opened = false;
+  private Lock lock = new ReentrantLock();
 
 Review comment:
   The [create SparkContext thread] and the [restart SparkInterpreter thread] compete for the same synchronized lock, so while the SparkContext has not yet been created, restarting the Spark interpreter blocks behind it. Two symptoms follow:
   1. The restart-interpreter call appears stuck.
   2. Building on step 1, the user may `kill -9` the SparkSubmit process. At that point the synchronized lock is released, the [restart SparkInterpreter thread] acquires it and re-creates the SparkContext. **The effect is that a new Spark app appears right after the old one was killed.**
      Because both getSparkInterpreter() and LazyOpenInterpreter.cancel end up calling LazyOpenInterpreter.open(), I used lock.tryLock() instead of synchronized.
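
   To illustrate the idea (a minimal sketch, not the actual Zeppelin code; class and method names here are hypothetical): with `lock.tryLock()` a concurrent caller such as restart/cancel fails fast instead of queueing behind a monitor held by the thread that is still creating the SparkContext.

   ```java
   import java.util.concurrent.locks.Lock;
   import java.util.concurrent.locks.ReentrantLock;

   // Hypothetical sketch of the tryLock pattern discussed above.
   class LazyOpenSketch {
     private final Lock lock = new ReentrantLock();
     private volatile boolean opened = false;

     /** Returns true if the interpreter is open after this call. */
     boolean tryOpen() {
       if (opened) {
         return true;
       }
       // Non-blocking: if another thread is mid-open (e.g. still
       // creating the SparkContext), give up instead of waiting.
       if (lock.tryLock()) {
         try {
           if (!opened) {
             // ... expensive open(), e.g. creating the SparkContext ...
             opened = true;
           }
           return true;
         } finally {
           lock.unlock();
         }
       }
       return false; // another thread holds the lock; caller can bail out
     }
   }
   ```

   The design point is that a stuck open no longer makes cancel/restart hang: they observe the lock is taken and return instead of blocking.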
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
