Re: Running Spark Job in Background

2018-08-13 Thread Ilya Kasnacheev
Hello! You can invoke `disown' after launching the process with &. Note that & and nohup are quite different; it is very strange if the result is the same. Nohup jobs don't even use the same terminal. Regards, -- Ilya Kasnacheev
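For illustration, a minimal shell sketch of the & + disown approach described above (the jar path and main class are placeholders, not taken from the thread):

    # Start the job in the background, then remove it from the shell's job table
    # so the shell will not send it SIGHUP when the terminal is closed.
    spark-submit --class com.example.StreamToIgnite --master yarn streaming-job.jar &
    disown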

Re: Running Spark Job in Background

2018-08-13 Thread ApacheUser
Thanks Denis, When I submit a Spark job that connects to the Ignite cluster, it creates an Ignite client. The Ignite client gets disconnected when I close the window (Linux shell). Regular Spark jobs run fine with & or nohup, but in the Spark/Ignite case the clients are getting killed and the Spark job stops.

Re: Running Spark Job in Background

2018-08-13 Thread Denis Mekhanikov
This is not really an Ignite question. Try asking it on the Spark user list: http://apache-spark-user-list.1001560.n3.nabble.com/ Running commands with & is a valid approach, though. You can also try using nohup. Denis
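A minimal sketch of the nohup variant suggested here (jar path, class name, and log file are illustrative only):

    # nohup makes the process ignore SIGHUP; redirecting stdout/stderr keeps the
    # output in a known log file instead of nohup.out.
    nohup spark-submit --class com.example.StreamToIgnite streaming-job.jar > spark-job.log 2>&1 &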

Running Spark Job in Background

2018-08-11 Thread ApacheUser
Hello Ignite Team, I have a Spark job that streams live data into an Ignite cache. The job gets closed as soon as I close the window (Linux shell). The other Spark streaming jobs I run with "&" at the end of the spark-submit command, and they run for a very long time until I stop them or they crash for other reasons.
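For context, the pattern described above amounts to something like the following (placeholders only). A plain & only backgrounds the process; the shell typically forwards SIGHUP to its background jobs when the terminal closes, which is why nohup or disown is usually combined with it:

    # Background the job with & alone; it remains in the shell's job table and
    # may receive SIGHUP (and be killed) when the terminal window is closed.
    spark-submit --class com.example.StreamJob streaming-job.jar &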