On Wed, Jun 8, 2016 at 3:42 AM, Jacek Laskowski wrote:
> On Wed, Jun 8, 2016 at 2:38 AM, Mohit Anchlia wrote:
> > I am looking to write an ETL job using Spark that reads data from the
> > source, performs transformations, and inserts it into the destination.
> Is this going to be a one-time job, or do you want it to run at a regular interval?
I am looking to write an ETL job using Spark that reads data from the
source, performs transformations, and inserts it into the destination. I am
trying to understand how Spark deals with failures, but I can't seem to find
the documentation. I am interested in learning about the following scenarios:
1. Source