Cool. I was thinking of waiting a second and then doing ps aux | grep java | grep 
jarname.jar, and I guess checking port 4040 would work as well. Thanks for the 
input.
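
A minimal sketch of that check as an init.d start function, assuming a jar named 
assembly.jar and a $DAEMON variable like the one in the original script below 
(the five-second grace period is an arbitrary choice, and the [a] bracket in the 
grep pattern keeps grep from matching its own process):

start() {
        $DAEMON &
        sleep 5
        if ps aux | grep java | grep -q '[a]ssembly.jar'; then
                echo "OK"
                return 0
        fi
        echo "spark-submit exited before the streaming job came up"
        return 1
}

On systems with pgrep, pgrep -f assembly.jar does the same check in one step.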
Regards,
Ashic.

Date: Sat, 24 Jan 2015 13:00:13 +0530
Subject: Re: Starting a spark streaming app in init.d
From: ak...@sigmoidanalytics.com
To: as...@live.com
CC: user@spark.apache.org

I'd do the same, but add an extra condition that checks whether the job has 
started successfully, by probing the application UI after putting the main 
script to sleep for some time (say, two minutes). Checking that port 4040 is up 
would do; if you want something more elaborate, write a parser for the UI's 
output.

Thanks
Best Regards
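
A minimal sketch of that delayed port probe, assuming the driver runs on the 
same host and serves the application UI on the default port 4040 (nc, the 
two-minute sleep, and the messages are illustrative assumptions; curl -sf 
http://localhost:4040 would work equally well where nc is unavailable):

sleep 120                        # give the driver time to come up
if nc -z localhost 4040; then    # -z: just test that the port is listening
        echo "Application UI is up; job started"
        exit 0
fi
echo "Port 4040 not reachable; assuming the job failed to start"
exit 1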

On Sat, Jan 24, 2015 at 1:57 AM, Ashic Mahtab <as...@live.com> wrote:

Hello,
I'm trying to kick off a Spark Streaming job against a standalone master using 
spark-submit inside of init.d. This is what I have:


DAEMON="spark-submit --class Streamer --executor-memory 500M 
--total-executor-cores 4 /path/to/assembly.jar"

start() {
        $DAEMON -p /var/run/my_assembly.pid &
        echo "OK" &&
        return 0
}

However, this will return 0 even if spark-submit fails. Is there a way to run 
spark-submit in the background and return 0 only if it successfully starts up? 
Or better yet, is there something in spark-submit itself that will allow me to 
do this, perhaps via a command line argument?

Thanks,
Ashic.