Hi Gavin,

I believe that in standalone mode Spark ships with a simple cluster manager
that makes it easy to set up a cluster. It does not rely on YARN or Mesos.

In summary, from my notes:

   - Spark Local – Spark runs on the local host. This is the simplest setup
   and is best suited for learners who want to understand the different
   concepts of Spark, and for unit testing.

   - Spark Standalone – a simple cluster manager included with Spark that
   makes it easy to set up a cluster.

   - YARN Cluster Mode – the Spark driver runs inside an application master
   process which is managed by YARN on the cluster, and the client can go
   away after initiating the application.

   - Mesos – I have not used it, so I cannot comment.
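As a rough sketch of setting up the standalone mode mentioned above (script
names as per the Spark standalone documentation; `SPARK_HOME` and
`master-host` are placeholders for your own installation path and host
name):

```shell
# Start a master on the current host; it logs a spark://HOST:7077 URL
$SPARK_HOME/sbin/start-master.sh

# On each worker node, start a worker pointing at the master's URL
$SPARK_HOME/sbin/start-slave.sh spark://master-host:7077

# Then connect a shell to the standalone cluster
$SPARK_HOME/bin/spark-shell --master spark://master-host:7077
```

Spark also ships `sbin/start-all.sh`, which starts the master plus the
workers listed in the conf/slaves file in one go.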

In YARN Client Mode, the driver runs in the client process, and the
application master is only used for requesting resources from YARN. Unlike
Local or Spark Standalone modes, in which the master's address is specified
in the --master parameter, in YARN mode the ResourceManager's address is
picked up from the Hadoop configuration. Thus, the --master parameter is
simply yarn.
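To put the modes above side by side, a sketch of the corresponding
invocations (`master-host`, `MyApp` and `my-app.jar` are placeholders; note
that releases before Spark 2.0 spelled the YARN masters as yarn-client and
yarn-cluster rather than using --deploy-mode):

```shell
# Local mode: everything runs in a single JVM on this machine
# ([*] means one worker thread per core)
spark-shell --master local[*]

# Standalone mode: connect to the simple cluster manager bundled with Spark
spark-shell --master spark://master-host:7077

# YARN client mode: the driver runs in this process; the ResourceManager's
# address comes from the Hadoop configuration, not from --master
spark-shell --master yarn --deploy-mode client

# YARN cluster mode: the driver runs inside the application master on the
# cluster (spark-submit only; an interactive shell needs client mode)
spark-submit --master yarn --deploy-mode cluster --class MyApp my-app.jar
```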

HTH




Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 11 June 2016 at 22:26, Gavin Yue <yue.yuany...@gmail.com> wrote:

> Standalone mode stands in contrast to YARN mode or Mesos mode, in which
> Spark uses YARN or Mesos for cluster management.
>
> Local mode is effectively a standalone mode in which everything runs on a
> single local machine instead of on a remote cluster.
>
> That is my understanding.
>
>
> On Sat, Jun 11, 2016 at 12:40 PM, Ashok Kumar <
> ashok34...@yahoo.com.invalid> wrote:
>
>> Thank you, I am grateful.
>>
>> I know I can start spark-shell by launching the shell itself
>>
>> spark-shell
>>
>> Now I know that in standalone mode I can also connect to master
>>
>> spark-shell --master spark://<HOST>:7077
>>
>> My point is: what are the differences between these two start-up modes
>> for spark-shell? If I start spark-shell and connect to the master, what
>> performance gain will I get, if any, or does it not matter? Is it the
>> same as for spark-submit?
>>
>>
>>
>> regards
>>
>>
>> On Saturday, 11 June 2016, 19:39, Mohammad Tariq <donta...@gmail.com>
>> wrote:
>>
>>
>> Hi Ashok,
>>
>> In local mode all the processes run inside a single JVM, whereas in
>> standalone mode we have separate master and worker processes running in
>> their own JVMs.
>>
>> To quickly test your code from within your IDE you could probably use
>> local mode. However, to get a real feel of how Spark operates I would
>> suggest having a standalone setup as well. It's just a matter of
>> launching a standalone cluster, either manually (by starting a master
>> and workers by hand) or by using the launch scripts provided with the
>> Spark package.
>>
>> You can find more on this here:
>> http://spark.apache.org/docs/latest/spark-standalone.html
>>
>> HTH
>>
>>
>>
>> Tariq, Mohammad
>> about.me/mti
>>
>>
>> On Sat, Jun 11, 2016 at 11:38 PM, Ashok Kumar <
>> ashok34...@yahoo.com.invalid> wrote:
>>
>> Hi,
>>
>> What is the difference between running Spark in Local mode or standalone
>> mode?
>>
>> Are they the same? If not, which is best suited for non-prod work?
>>
>> I am also aware that one can run Spark in YARN mode as well.
>>
>> Thanks
>>
>>
>>
>>
>>
>
