Hello, Paul 

Thank you so much for the prompt reply. Our understanding is that each interpreter runs as one application on YARN, so multiple notebooks sharing the same interpreter all run through the same YARN application.

Regarding your comment "… which will kill any other YARN application associated 
with that interpreter": we have never seen more than one YARN application in the 
YARN UI. In our case, I used one interpreter across many notebooks, and no 
matter how many Spark jobs those notebooks ran, I only ever saw one YARN 
application. So I am curious how a single interpreter can be associated with 
more than one YARN application.
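One way to check this mapping empirically is to query the YARN ResourceManager REST API (`/ws/v1/cluster/apps`) and count the running Zeppelin applications: with a shared interpreter the count should stay at 1, while "isolated per note" should show one application per running notebook. A minimal sketch, assuming a ResourceManager reachable at `resourcemanager:8088` and applications whose names contain "Zeppelin" (both are assumptions about your cluster):

```python
import json
from urllib.request import urlopen

# Assumption: adjust to your ResourceManager host and port.
RM_URL = "http://resourcemanager:8088"


def count_zeppelin_apps(apps_json):
    """Count RUNNING YARN applications launched by Zeppelin.

    With a shared interpreter this should return 1 regardless of how many
    notebooks are active; with "isolated per note" it should grow with
    each running notebook.
    """
    apps = (apps_json.get("apps") or {}).get("app") or []
    return sum(
        1
        for app in apps
        if app.get("state") == "RUNNING" and "Zeppelin" in app.get("name", "")
    )


def fetch_running_apps(rm_url=RM_URL):
    # ResourceManager REST API: GET /ws/v1/cluster/apps?states=RUNNING
    with urlopen(f"{rm_url}/ws/v1/cluster/apps?states=RUNNING") as resp:
        return json.load(resp)
```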


Thanks


AL





> On Jan 25, 2017, at 5:39 PM, Paul Brenner <pbren...@placeiq.com> wrote:
> 
> 
> Alec,
> 
> The way we use zeppelin at our company is to set our interpreters to 
> “isolated”. That way each notebook gets its own application on yarn. 
> 
> This mostly works well. The one downside is that if you stop a notebook (e.g. 
> by calling sys.exit in a cell of a spark notebook) it does stop the YARN 
> application most of the time but you can’t restart that notebook until you 
> have restarted the interpreter… which will kill any other YARN application 
> associated with that interpreter.
> 
> So our full setup is that we give each user an interpreter (which is good 
> because we can set each user’s interpreter to have their username in 
> spark.yarn.queue) and set each user’s interpreter to isolated.
> 
> Honestly I still don’t understand what scoped does… maybe that would work as 
> well?
> 
> Paul Brenner
> DATA SCIENTIST
> (217) 390-3033  
> 
> On Wed, Jan 25, 2017 at 8:20 PM Alec Lee <alec.in...@gmail.com> wrote:
> Hi, all 
> 
> 
> Currently we are exploring the features of Zeppelin; we are using YARN to 
> manage our Spark jobs. From our experiments, we conclude that one interpreter 
> corresponds to one application in the YARN cluster, meaning all Zeppelin 
> notebooks using the same interpreter go through a single YARN application. We 
> also found that if our code shuts down the application in YARN, any notebook 
> fails to run after that point, with an error like “can’t call a stopped 
> SparkContext …”. The only solution we have found is to restart the 
> interpreter. How can we get around this without restarting the interpreter? 
> 
> 
> 
> Thanks 
> 
> 
> AL
> 
