RE: Jars directory in Spark 2.0

2017-02-01 Thread Sidney Feiner
Ok, good to know ☺
Shading every Spark app it is, then…
Thanks!




Re: Jars directory in Spark 2.0

2017-02-01 Thread Marcelo Vanzin
Spark has never shaded its dependencies (in the sense of renaming the classes),
with a couple of exceptions (Guava and Jetty), so that behavior is nothing new.
Spark's dependencies themselves pull in a lot of transitive dependencies, so
shading them all would have limited benefit anyway.
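
Since Spark's copies of those dependencies sit on the classpath under their
original package names, a quick way to see which copy of a class actually wins
is to ask the JVM where it loaded the class from. A minimal Scala sketch (the
Jackson class name below is purely illustrative; substitute whichever class
you suspect is conflicting):

    // Print the jar a class was loaded from, to tell whether it came from
    // your application jar or from Spark's bundled jars/ directory.
    object WhichJar {
      def locate(className: String): String = {
        val cls = Class.forName(className)
        Option(cls.getProtectionDomain.getCodeSource)  // null for JDK classes
          .map(_.getLocation.toString)
          .getOrElse("(bootstrap classloader / unknown)")
      }

      def main(args: Array[String]): Unit =
        println(locate("com.fasterxml.jackson.databind.ObjectMapper"))
    }

Running the same check under spark-submit and on your application's own
classpath makes it obvious when Spark's bundled copy is shadowing yours.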



RE: Jars directory in Spark 2.0

2017-01-31 Thread Sidney Feiner
Is this done on purpose? It really makes it hard to deploy applications. Is
there a reason the jars weren't shaded to begin with?



Re: Jars directory in Spark 2.0

2017-01-31 Thread Koert Kuipers
You basically have to keep your dependency versions in line with Spark's, or
shade your own dependencies.

You cannot just replace the jars in Spark's jars folder. If you want to update
them, you have to build Spark yourself with the updated dependencies and
confirm that it compiles, passes tests, etc.
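
For the shading route, a minimal sketch of what this looks like with
sbt-assembly (assuming the sbt-assembly plugin is already enabled in the
build; the Guava pattern and the my.shaded prefix are only illustrations):

    // build.sbt fragment: shade a conflicting dependency with sbt-assembly.
    // Spark itself stays "provided" so the assembly defers at runtime to
    // whatever version the cluster's jars/ directory supplies.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"

    // Relocate your own copy of a conflicting dependency (Guava here, as
    // an example) so it can no longer clash with the copy bundled by Spark.
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.common.**" -> "my.shaded.guava.@1").inAll
    )

The Maven equivalent is a <relocation> element in the maven-shade-plugin
configuration.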

On Tue, Jan 31, 2017 at 3:40 AM, Sidney Feiner wrote:

> Hey,
>
> While migrating from Spark 1.6 to 2.x, I've had many issues with the jars
> that come preloaded with Spark in the "jars/" directory, and I had to shade
> most of my packages.
>
> Can I replace the jars in this folder with more up-to-date versions? Are
> those jars used for anything internal in Spark, meaning I can't blindly
> replace them?
>
> Thanks ☺