Re: How to programmatically create, submit and report on Spark jobs?

2015-08-10 Thread Ted Yu
For monitoring, please take a look at
http://spark.apache.org/docs/latest/monitoring.html
(especially the REST API section).
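
For example, something along these lines (an untested sketch; it assumes the
driver UI is reachable on localhost:4040, and the history server serves the
same endpoints on port 18080 by default) pulls the list of applications as
JSON, which you could feed into your own view instead of the Spark web UI:

    import scala.io.Source

    object JobStatusPoller {
      def main(args: Array[String]): Unit = {
        // Base URL is an assumption -- point it at your driver UI or history server.
        val base = "http://localhost:4040/api/v1"
        // /applications lists known applications; /applications/[app-id]/jobs
        // gives per-job status (running, succeeded, failed).
        val apps = Source.fromURL(s"$base/applications").mkString
        println(apps)
      }
    }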

Cheers

On Mon, Aug 10, 2015 at 8:33 AM, Ted Yu  wrote:

> I found SPARK-3733, which was marked as a duplicate of SPARK-4924; the latter went into 1.4.0.
>
> FYI
>
> On Mon, Aug 10, 2015 at 5:12 AM, mark  wrote:
>
>> Hi All
>>
>> I need to be able to create, submit and report on Spark jobs
>> programmatically in response to events arriving on a Kafka bus. I also need
>> end-users to be able to create data queries that launch Spark jobs 'behind
>> the scenes'.
>>
>> I would expect to use the same API for both, and to be able to provide a
>> user-friendly view (i.e. *not* the Spark web UI) of all jobs (user and
>> system) that are currently running, have completed, have failed, etc.
>>
>> Are there any tools / add-ons for this? Or is there a suggested approach?
>>
>> Thanks
>>
>
>


Re: How to programmatically create, submit and report on Spark jobs?

2015-08-10 Thread Ted Yu
I found SPARK-3733, which was marked as a duplicate of SPARK-4924; the latter went into 1.4.0.

FYI
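
SPARK-4924 added the spark-launcher module (org.apache.spark.launcher.SparkLauncher),
which lets you submit applications programmatically from 1.4.0 onward. A rough
sketch (the Spark home, jar path, main class and master below are placeholders
for your own values):

    import org.apache.spark.launcher.SparkLauncher

    object ProgrammaticSubmit {
      def main(args: Array[String]): Unit = {
        // All of these values are placeholders -- substitute your own.
        val process: Process = new SparkLauncher()
          .setSparkHome("/opt/spark")
          .setAppResource("/path/to/my-spark-app.jar")
          .setMainClass("com.example.MyJob")
          .setMaster("yarn-cluster")
          .launch()            // forks spark-submit and returns a java.lang.Process
        val exitCode = process.waitFor()
        println(s"spark-submit exited with code $exitCode")
      }
    }

The launch() call just hands back the child process, so reporting on job
progress would still go through the monitoring REST API mentioned elsewhere
in this thread.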

On Mon, Aug 10, 2015 at 5:12 AM, mark  wrote:

> Hi All
>
> I need to be able to create, submit and report on Spark jobs
> programmatically in response to events arriving on a Kafka bus. I also need
> end-users to be able to create data queries that launch Spark jobs 'behind
> the scenes'.
>
> I would expect to use the same API for both, and to be able to provide a
> user-friendly view (i.e. *not* the Spark web UI) of all jobs (user and
> system) that are currently running, have completed, have failed, etc.
>
> Are there any tools / add-ons for this? Or is there a suggested approach?
>
> Thanks
>


How to programmatically create, submit and report on Spark jobs?

2015-08-10 Thread mark
Hi All

I need to be able to create, submit and report on Spark jobs
programmatically in response to events arriving on a Kafka bus. I also need
end-users to be able to create data queries that launch Spark jobs 'behind
the scenes'.

I would expect to use the same API for both, and to be able to provide a
user-friendly view (i.e. *not* the Spark web UI) of all jobs (user and
system) that are currently running, have completed, have failed, etc.

Are there any tools / add-ons for this? Or is there a suggested approach?

Thanks