Re: Dependency Injection and Microservice development with Spark

2017-01-04 Thread darren
We've been able to use the iPOPO dependency injection framework in our PySpark
system and deploy .egg PySpark apps that resolve and wire up all the components
during an initial bootstrap sequence (like a kernel architecture, and also
similar to Spring), then invoke those components across Spark. A rough sketch
of the bootstrap idea follows below.
Just replying for info, since it's not identical to your request but it is in
the same spirit.
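
(iPOPO itself is Python-only; the sketch below is only a loose Scala analogue
of the register-then-bootstrap idea, in the thread's main language. None of
these names come from iPOPO's actual API.)

// Loose, hypothetical analogue of the bootstrap idea: register named
// components once on the driver, then start them all before any Spark
// work is invoked.
trait Component { def start(): Unit }

object Registry {
  private var components = Map.empty[String, Component]

  def register(name: String, c: Component): Unit =
    components += (name -> c)

  def resolve(name: String): Component = components(name)

  // The bootstrap sequence: start every registered component.
  def bootstrap(): Unit = components.values.foreach(_.start())
}

object Boot extends App {
  Registry.register("metrics", new Component {
    def start(): Unit = println("metrics component up")
  })
  Registry.register("source", new Component {
    def start(): Unit = println("source component up")
  })
  Registry.bootstrap() // wire and start everything, then submit Spark jobs
}
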
Darren


Sent from my Verizon, Samsung Galaxy smartphone

 Original message 
From: Chetan Khatri <chetan.opensou...@gmail.com>
Date: 1/4/17 6:34 AM (GMT-05:00)
To: Lars Albertsson <la...@mapflat.com>
Cc: user <user@spark.apache.org>, Spark Dev List <d...@spark.apache.org>
Subject: Re: Dependency Injection and Microservice development with Spark
Lars,
Thank you. I want to use DI for configuring all the properties (the wiring)
for the architectural approach below.
Oracle -> Kafka Batch (Event Queuing) -> Spark Jobs (incremental load from
HBase -> Hive with transformation) -> Spark Transformation -> PostgreSQL
Thanks.
On Thu, Dec 29, 2016 at 3:25 AM, Lars Albertsson <la...@mapflat.com> wrote:

> Do you really need dependency injection?
>
> DI is often used for testing purposes. Data processing jobs are easy
> to test without DI, however, due to their functional and synchronous
> nature. Hence, DI is often unnecessary for testing data processing
> jobs, whether they are batch or streaming jobs.
>
> Or do you want to use DI for other reasons?
>
>
> Lars Albertsson
> Data engineering consultant
> www.mapflat.com
> https://twitter.com/lalleal
> +46 70 7687109
> Calendar: https://goo.gl/6FBtlS, https://freebusy.io/la...@mapflat.com
>
>
> On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
> <chetan.opensou...@gmail.com> wrote:
> > Hello Community,
> >
> > The current approach I am using for Spark job development is Scala + SBT
> > and an uber jar, with a yml properties file to pass configuration
> > parameters. But if I would like to use dependency injection and
> > microservice development, like the Spring Boot features, in Scala, then
> > what would be the standard approach?
> >
> > Thanks
> >
> > Chetan


Re: Dependency Injection and Microservice development with Spark

2017-01-04 Thread Jiří Syrový
Hi,

another nice approach is to use the Reader monad instead, together with a
framework that supports this style (e.g. Grafter -
https://github.com/zalando/grafter). It's lightweight and helps a bit with
dependency issues.
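
A minimal illustration of the Reader approach in plain Scala, without Grafter
(the JobConfig fields here are made up): job logic declares what it needs as a
function of the configuration, and the pieces compose.

// Hand-rolled Reader monad; Grafter and cats ship richer versions of this.
case class Reader[E, A](run: E => A) {
  def map[B](f: A => B): Reader[E, B] = Reader(e => f(run(e)))
  def flatMap[B](f: A => Reader[E, B]): Reader[E, B] =
    Reader(e => f(run(e)).run(e))
}

// Hypothetical configuration for a Spark job.
case class JobConfig(master: String, checkpointDir: String)

object ReaderExample extends App {
  val master: Reader[JobConfig, String] = Reader(_.master)
  val checkpointDir: Reader[JobConfig, String] = Reader(_.checkpointDir)

  // Dependent computations compose; the config is threaded through for us.
  val description: Reader[JobConfig, String] = for {
    m <- master
    c <- checkpointDir
  } yield s"master=$m, checkpoints=$c"

  // The environment is supplied once, at the edge of the program.
  println(description.run(JobConfig("local[*]", "/tmp/checkpoints")))
}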

2016-12-28 22:55 GMT+01:00 Lars Albertsson <la...@mapflat.com>:

> Do you really need dependency injection?
>
> DI is often used for testing purposes. Data processing jobs are easy
> to test without DI, however, due to their functional and synchronous
> nature. Hence, DI is often unnecessary for testing data processing
> jobs, whether they are batch or streaming jobs.
>
> Or do you want to use DI for other reasons?
>
>
> Lars Albertsson
> Data engineering consultant
> www.mapflat.com
> https://twitter.com/lalleal
> +46 70 7687109
> Calendar: https://goo.gl/6FBtlS, https://freebusy.io/la...@mapflat.com
>
>
> On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
> <chetan.opensou...@gmail.com> wrote:
> > Hello Community,
> >
> > The current approach I am using for Spark job development is Scala + SBT
> > and an uber jar, with a yml properties file to pass configuration
> > parameters. But if I would like to use dependency injection and
> > microservice development, like the Spring Boot features, in Scala, then
> > what would be the standard approach?
> >
> > Thanks
> >
> > Chetan
>


Re: Dependency Injection and Microservice development with Spark

2017-01-04 Thread Chetan Khatri
Lars,

Thank you. I want to use DI for configuring all the properties (the wiring)
for the architectural approach below.

Oracle -> Kafka Batch (Event Queuing) -> Spark Jobs (incremental load from
HBase -> Hive with transformation) -> Spark Transformation -> PostgreSQL
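
Something like the following wiring is what I have in mind (all config names
made up): each stage takes its settings through its constructor, and the whole
graph is assembled once at startup, whether or not a DI container does the
assembling.

// Hypothetical configuration holders for the pipeline stages.
case class KafkaConf(bootstrapServers: String, topic: String)
case class HBaseConf(zkQuorum: String, table: String)
case class PostgresConf(url: String, user: String)

// Each stage declares exactly what it needs via its constructor.
class IncrementalLoad(hbase: HBaseConf, kafka: KafkaConf) {
  def describe: String =
    s"load ${hbase.table} from ${hbase.zkQuorum}, events via ${kafka.topic}"
}

class Publisher(pg: PostgresConf) {
  def describe: String = s"write to ${pg.url} as ${pg.user}"
}

// The wiring happens in one place, in the driver's entry point.
object Pipeline extends App {
  val load = new IncrementalLoad(
    HBaseConf("zk1:2181", "events"),
    KafkaConf("broker1:9092", "oracle-batch"))
  val publish = new Publisher(
    PostgresConf("jdbc:postgresql://db/warehouse", "etl"))
  println(load.describe)
  println(publish.describe)
}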

Thanks.

On Thu, Dec 29, 2016 at 3:25 AM, Lars Albertsson <la...@mapflat.com> wrote:

> Do you really need dependency injection?
>
> DI is often used for testing purposes. Data processing jobs are easy
> to test without DI, however, due to their functional and synchronous
> nature. Hence, DI is often unnecessary for testing data processing
> jobs, whether they are batch or streaming jobs.
>
> Or do you want to use DI for other reasons?
>
>
> Lars Albertsson
> Data engineering consultant
> www.mapflat.com
> https://twitter.com/lalleal
> +46 70 7687109
> Calendar: https://goo.gl/6FBtlS, https://freebusy.io/la...@mapflat.com
>
>
> On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
> <chetan.opensou...@gmail.com> wrote:
> > Hello Community,
> >
> > The current approach I am using for Spark job development is Scala + SBT
> > and an uber jar, with a yml properties file to pass configuration
> > parameters. But if I would like to use dependency injection and
> > microservice development, like the Spring Boot features, in Scala, then
> > what would be the standard approach?
> >
> > Thanks
> >
> > Chetan
>


Re: Dependency Injection and Microservice development with Spark

2016-12-30 Thread Muthu Jayakumar
Adding to Lars Albertsson & Miguel Morales: I am hoping to see how well
scalameta's support for macros develops, since it could do away with sizable
DI problems; for the remainder, passing a class type as an argument, as Miguel
Morales mentioned, works.

Thanks,


On Wed, Dec 28, 2016 at 6:41 PM, Miguel Morales 
wrote:

> Hi
>
> Not sure about Spring Boot, but when trying to use DI libraries you'll run
> into serialization issues. I've had luck using an old version of Scaldi.
> Recently, though, I've been passing the class types as arguments with
> default values. Then in the Spark code they get instantiated. So you're
> basically passing and serializing a class name.
>
> Sent from my iPhone
>
> > On Dec 28, 2016, at 1:55 PM, Lars Albertsson <la...@mapflat.com> wrote:
> >
> > Do you really need dependency injection?
> >
> > DI is often used for testing purposes. Data processing jobs are easy
> > to test without DI, however, due to their functional and synchronous
> > nature. Hence, DI is often unnecessary for testing data processing
> > jobs, whether they are batch or streaming jobs.
> >
> > Or do you want to use DI for other reasons?
> >
> >
> > Lars Albertsson
> > Data engineering consultant
> > www.mapflat.com
> > https://twitter.com/lalleal
> > +46 70 7687109
> > Calendar: https://goo.gl/6FBtlS, https://freebusy.io/la...@mapflat.com
> >
> >
> > On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
> > <chetan.opensou...@gmail.com> wrote:
> >> Hello Community,
> >>
> >> The current approach I am using for Spark job development is Scala + SBT
> >> and an uber jar, with a yml properties file to pass configuration
> >> parameters. But if I would like to use dependency injection and
> >> microservice development, like the Spring Boot features, in Scala, then
> >> what would be the standard approach?
> >>
> >> Thanks
> >>
> >> Chetan
> >
> >
>
>
>


Re: Dependency Injection and Microservice development with Spark

2016-12-28 Thread Miguel Morales
Hi

Not sure about Spring Boot, but when trying to use DI libraries you'll run
into serialization issues. I've had luck using an old version of Scaldi.
Recently, though, I've been passing the class types as arguments with
default values. Then in the Spark code they get instantiated. So you're
basically passing and serializing a class name.
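
A rough sketch of that pattern, with hypothetical names (in real Spark code
you would instantiate once per partition, e.g. inside mapPartitions, rather
than once per run):

// Only the class-name String crosses the serialization boundary; the sink
// instance itself is created where the work runs.
trait Sink { def write(record: String): Unit }

class ConsoleSink extends Sink {
  def write(record: String): Unit = println(record)
}

object ClassNameExample extends App {
  // The class name is a plain String argument with a default value.
  def run(records: Seq[String], sinkClass: String = "ConsoleSink"): Unit = {
    // Reflectively instantiate the sink at the point of use.
    val sink = Class.forName(sinkClass)
      .getDeclaredConstructor()
      .newInstance()
      .asInstanceOf[Sink]
    records.foreach(sink.write)
  }

  run(Seq("a", "b", "c"))
}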

Sent from my iPhone

> On Dec 28, 2016, at 1:55 PM, Lars Albertsson <la...@mapflat.com> wrote:
> 
> Do you really need dependency injection?
> 
> DI is often used for testing purposes. Data processing jobs are easy
> to test without DI, however, due to their functional and synchronous
> nature. Hence, DI is often unnecessary for testing data processing
> jobs, whether they are batch or streaming jobs.
> 
> Or do you want to use DI for other reasons?
> 
> 
> Lars Albertsson
> Data engineering consultant
> www.mapflat.com
> https://twitter.com/lalleal
> +46 70 7687109
> Calendar: https://goo.gl/6FBtlS, https://freebusy.io/la...@mapflat.com
> 
> 
> On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
> <chetan.opensou...@gmail.com> wrote:
>> Hello Community,
>>
>> The current approach I am using for Spark job development is Scala + SBT
>> and an uber jar, with a yml properties file to pass configuration
>> parameters. But if I would like to use dependency injection and
>> microservice development, like the Spring Boot features, in Scala, then
>> what would be the standard approach?
>> 
>> Thanks
>> 
>> Chetan
> 
> 




Re: Dependency Injection and Microservice development with Spark

2016-12-28 Thread Lars Albertsson
Do you really need dependency injection?

DI is often used for testing purposes. Data processing jobs are easy
to test without DI, however, due to their functional and synchronous
nature. Hence, DI is often unnecessary for testing data processing
jobs, whether they are batch or streaming jobs.
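
For instance (a hypothetical sketch), if the core transformation is a pure
function, the test is a direct call with no injection or mocking:

// Core logic as a pure function: no SparkSession, no injected services.
object Transform {
  def normalize(rows: Seq[(String, Int)]): Seq[(String, Int)] =
    rows.map { case (key, value) => (key.trim.toLowerCase, value.max(0)) }
}

// The test simply calls the function and asserts on the result.
object TransformTest extends App {
  val out = Transform.normalize(Seq(("  Foo ", -1), ("BAR", 2)))
  assert(out == Seq(("foo", 0), ("bar", 2)))
  println("normalize ok")
}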

Or do you want to use DI for other reasons?


Lars Albertsson
Data engineering consultant
www.mapflat.com
https://twitter.com/lalleal
+46 70 7687109
Calendar: https://goo.gl/6FBtlS, https://freebusy.io/la...@mapflat.com


On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
<chetan.opensou...@gmail.com> wrote:
> Hello Community,
>
> The current approach I am using for Spark job development is Scala + SBT
> and an uber jar, with a yml properties file to pass configuration
> parameters. But if I would like to use dependency injection and
> microservice development, like the Spring Boot features, in Scala, then
> what would be the standard approach?
>
> Thanks
>
> Chetan




Dependency Injection and Microservice development with Spark

2016-12-23 Thread Chetan Khatri
Hello Community,

The current approach I am using for Spark job development is Scala + SBT and
an uber jar, with a yml properties file to pass configuration parameters. But
if I would like to use dependency injection and microservice development,
like the Spring Boot features, in Scala, then what would be the standard
approach?
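
For reference, a minimal sketch of the yml part, assuming SnakeYAML on the
classpath and a flat key-value file (the names are illustrative only):

import java.io.FileInputStream
import org.yaml.snakeyaml.Yaml
import scala.collection.JavaConverters._

// Load a flat YAML properties file into an immutable Scala Map.
object JobConfigLoader {
  def load(path: String): Map[String, String] = {
    val in = new FileInputStream(path)
    try {
      val raw = new Yaml().load(in).asInstanceOf[java.util.Map[String, Object]]
      raw.asScala.map { case (k, v) => k -> String.valueOf(v) }.toMap
    } finally in.close()
  }
}

// Usage: val conf = JobConfigLoader.load("job.yml"); conf("spark.master")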

Thanks

Chetan