Very useful indeed, Rick!

If I have two Zeppelin instances running as two different users against the
same Spark master, I see them as two different applications in the Spark
web UI.

1. Will they each have their own execution context in this case? If I
understand correctly, that would mean closing the Spark context in one
user's Zeppelin has no impact on the other user's Zeppelin environment - is
that right?

On Thu, Sep 24, 2015 at 4:47 PM, Rick Moritz <rah...@gmail.com> wrote:

> 1)
> Zeppelin uses the spark-shell REPL API, so it behaves similarly to the
> Scala shell.
> In the shell you do not write applications in the technical sense;
> instead, you evaluate individual expressions to interact with a dataset.
> You can (manually) export some of the code you find useful in Zeppelin
> into applications, for example to provide batch pre-processing.
> I recommend you look at demos/descriptions of the interactive shell
> functionality to get an idea of what Zeppelin offers over an application.
> Also: you still have to manage most of your imports ;)
>
> 2)
> There are two benefits:
> - You can import and export/share notebooks, which makes it sensible to
> split content.
> - You also reduce the load on the browser by splitting heavy
> visualizations across multiple notebooks. Once you start rendering tens
> of thousands of points, you reach the limits of a browser's capability.
>
> Hopefully this helps you get started.
>
> On Thu, Sep 24, 2015 at 1:04 PM, Hammad <ham...@flexilogix.com> wrote:
>
>> Hi mates,
>>
>> I was struggling with the anatomy of Zeppelin in the context of Spark
>> and could not find anything that answers the questions below:
>>
>> 1. Usually a Scala application is structured as:
>>
>> import org.apache.<whatever>
>>
>> object MyApp {
>>   def main(args: Array[String]) {
>>     // something
>>   }
>> }
>>
>> whereas on Zeppelin we only write // something. Does that mean one
>> Zeppelin daemon is one application? What if I want to write multiple
>> applications on one Zeppelin daemon instance?
>>
>> 2. Related to (1), if the same Spark context is shared across all
>> notebooks, what's the benefit of having multiple notebooks?
>>
>> I would really appreciate it if someone could help me understand these
>> two points.
>>
>> Thanks,
>> Hammad
>>
>
>
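To make Rick's first point concrete, here is a sketch (the names are made up, and a plain Scala collection stands in for an RDD) of the scaffolding a compiled application needs versus the bare expressions you would type into a Zeppelin paragraph:

```scala
// A compiled application needs the full object/main scaffolding, plus its
// own imports and (for Spark) its own SparkContext setup.
object MyApp {
  def main(args: Array[String]): Unit = {
    // Stand-in for an RDD transformation: count word occurrences locally.
    val counts = Seq("a", "b", "a")
      .groupBy(identity)
      .map { case (word, occurrences) => (word, occurrences.size) }
    println(counts)
  }
}

// In a Zeppelin paragraph you would write only the body, because the
// interpreter supplies the surrounding scaffolding (including `sc`),
// e.g. something like:
//   val counts = sc.textFile("...").flatMap(_.split(" "))
//                  .map((_, 1)).reduceByKey(_ + _)
```

This is the same split Rick describes: the shell/notebook evaluates expressions interactively, and the object/main wrapper only matters once you export the code into a standalone application.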


-- 
Flexilogix
Ph: +92 618090374
Fax: +92 612011810
http://www.flexilogix.com
i...@flexilogix.com
