Thanks for the tip!

On Wed, Jan 23, 2019 at 10:28 AM Moein Hosseini <moein...@gmail.com> wrote:

> In this approach, your application creates a distinct job each time. The
> first time, the driver builds the DAG and runs it with the help of the
> executors; then the job finishes and the driver/application goes to sleep.
> When it wakes up, it creates a new job and DAG, and so on.
> It is somewhat like creating a cron job to submit your single application
> to the cluster each time.
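> A minimal sketch of that loop, assuming a Scala driver and the DataStax
> spark-cassandra-connector (the keyspace, table, and HDFS path below are
> hypothetical):
>
>   import org.apache.spark.sql.SparkSession
>
>   val spark = SparkSession.builder().appName("HourlyCassandraToHdfs").getOrCreate()
>   while (true) {
>     // Each iteration builds a fresh DAG and runs it as a new job on the executors.
>     val df = spark.read
>       .format("org.apache.spark.sql.cassandra")
>       .options(Map("keyspace" -> "ks", "table" -> "events")) // hypothetical keyspace/table
>       .load()
>     df.write.mode("append").parquet("hdfs:///data/events") // hypothetical HDFS path
>     Thread.sleep(60 * 60 * 1000L) // driver sleeps for an hour between jobs
>   }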
>
> On Wed, Jan 23, 2019 at 10:04 AM Kevin Mellott <kevin.r.mell...@gmail.com>
> wrote:
>
>> I’d recommend using a scheduler of some kind to trigger your job each
>> hour, and have the Spark job exit when it completes. Spark is not meant to
>> run in any type of “sleep mode”, unless you want to run a structured
>> streaming job and create a separate process to pull data from Cassandra and
>> publish it to your streaming endpoint. That decision really depends on
>> your use case.
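>> For example, a minimal crontab sketch of the scheduler approach (the
>> spark-submit path and application jar are hypothetical) that submits the
>> job at the top of every hour:
>>
>>   0 * * * * /opt/spark/bin/spark-submit --master yarn --deploy-mode cluster \
>>     /jobs/cassandra-to-hdfs.jar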
>>
>> On Tue, Jan 22, 2019 at 11:56 PM Soheil Pourbafrani <
>> soheil.i...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I want to submit a job to a YARN cluster that reads data from Cassandra
>>> and writes it to HDFS every hour, for example.
>>>
>>> Is it possible to make a Spark application sleep in a while-true loop
>>> and wake every hour to process the data?
>>>
>>
>
> --
>
> Moein Hosseini
> Data Engineer
> mobile: +98 912 468 1859
> site: www.moein.xyz
> email: moein...@gmail.com
> linkedin: https://www.linkedin.com/in/moeinhm
> twitter: https://twitter.com/moein7tl
>
>
