Here is a refactored version of the question:

How do you run spark-class for long-running applications? Why doesn't
spark-class launch a daemon?
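One common workaround, sketched below with placeholder names: since spark-class runs the driver in the foreground, background it under nohup and record the PID yourself. The spark-class path and driver class here are assumptions for illustration, not from the docs:

```shell
# Hypothetical launch (path and class are assumptions):
#
#   nohup $SPARK_HOME/bin/spark-class org.example.MyDriver > driver.log 2>&1 &
#   echo $! > driver.pid
#
# The same nohup-and-record-PID pattern, demonstrated with a stand-in
# command so the snippet is self-contained:
nohup sh -c 'sleep 1' > driver.log 2>&1 &
echo $! > driver.pid
wait  # only so this demo exits cleanly; a real driver would keep running
```

The PID file lets a later script stop the driver with `kill $(cat driver.pid)`; the redirect keeps stdout/stderr after the terminal closes.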


On Wed, Jan 8, 2014 at 3:21 AM, Aureliano Buendia <[email protected]> wrote:

> Hi,
>
> The EC2 documents
> <http://spark.incubator.apache.org/docs/0.8.1/ec2-scripts.html> have a
> section called 'Running Applications', but it actually lacks the step
> that should describe how to run the application.
>
> The spark_ec2 script
> <https://github.com/apache/incubator-spark/blob/59e8009b8d5e51b6f776720de8c9ecb09e1072dc/ec2/spark_ec2.py>
> seems to set up a standalone cluster, although I'm not sure why AMI_PREFIX
> points to a mesos AMI list
> <https://github.com/apache/incubator-spark/blob/59e8009b8d5e51b6f776720de8c9ecb09e1072dc/ec2/spark_ec2.py#L44>.
>
> Assuming that the cluster type is standalone, we could run the app with
> the spark-class script. Is this the missing step in the documentation?
>
> The spark-class script does not launch a daemon; is it supposed to be used
> with nohup for long-running applications?
>
> Finally, is the standalone cluster type used for real-world applications,
> or do people use Spark on YARN or Mesos when it comes to production?
>