Re: spark-packages with maven

2016-07-15 Thread Ismaël Mejía
Thanks for the info, Burak. I will check the repo you mention. Do you know
concretely what 'magic' spark-packages need, or whether there is any
document with info about it?

On Fri, Jul 15, 2016 at 10:12 PM, Luciano Resende wrote:

>
> On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski  wrote:
>
>> +1000
>>
>> Thanks Ismael for bringing this up! I meant to have sent it earlier too
>> since I've been struggling with an sbt-based Scala project for a Spark
>> package myself this week and haven't yet found out how to do local
>> publishing.
>>
>> If such a guide existed for Maven I could use it for sbt easily too :-)
>>
>> Ping me, Ismael, if you don't hear back from the group, so I feel invited
>> to dig into the plugin's sources.
>>
>> Best,
>> Jacek
>>
>> On 15 Jul 2016 2:29 p.m., "Ismaël Mejía"  wrote:
>>
>> Hello, I would like to know if there is an easy way to package a new
>> spark-package with Maven. I just found this repo, but I am not an sbt
>> user.
>>
>> https://github.com/databricks/sbt-spark-package
>>
>> One more question: is there a formal specification or documentation of
>> what you need to include in a spark-package (any special file, manifest,
>> etc.)? I have not found any doc on the website.
>>
>> Thanks,
>> Ismael
>>
>>
>>
>
> I was under the impression that spark-packages was more of a place for
> one to list/advertise their extensions, but when you run spark-submit
> with --packages, it will use Maven to resolve your package, and as long
> as resolution succeeds, it will use it (e.g. you can do mvn clean install
> for your local packages, and use --packages with a Spark server running
> on that same machine).
>
> From sbt, I think you can just use publishTo and define a local
> repository, something like
>
> publishTo := Some("Local Maven Repository" at
>   "file://" + Path.userHome.absolutePath + "/.m2/repository")
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>


Re: spark-packages with maven

2016-07-15 Thread Luciano Resende
On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski  wrote:

> +1000
>
> Thanks Ismael for bringing this up! I meant to have sent it earlier too
> since I've been struggling with an sbt-based Scala project for a Spark
> package myself this week and haven't yet found out how to do local
> publishing.
>
> If such a guide existed for Maven I could use it for sbt easily too :-)
>
> Ping me, Ismael, if you don't hear back from the group, so I feel invited
> to dig into the plugin's sources.
>
> Best,
> Jacek
>
> On 15 Jul 2016 2:29 p.m., "Ismaël Mejía"  wrote:
>
> Hello, I would like to know if there is an easy way to package a new
> spark-package with Maven. I just found this repo, but I am not an sbt
> user.
>
> https://github.com/databricks/sbt-spark-package
>
> One more question: is there a formal specification or documentation of
> what you need to include in a spark-package (any special file, manifest,
> etc.)? I have not found any doc on the website.
>
> Thanks,
> Ismael
>
>
>

I was under the impression that spark-packages was more of a place for one
to list/advertise their extensions, but when you run spark-submit with
--packages, it will use Maven to resolve your package, and as long as
resolution succeeds, it will use it (e.g. you can do mvn clean install for
your local packages, and use --packages with a Spark server running on
that same machine).
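
As a concrete illustration of that flow (a hedged sketch; the com.example
coordinates and my_app.py are hypothetical, substitute your own):

# install your package into the local Maven repository (~/.m2/repository)
mvn clean install

# then let spark-submit resolve it from there by its Maven coordinates
spark-submit --packages com.example:my-spark-lib_2.11:0.1.0 my_app.py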

From sbt, I think you can just use publishTo and define a local repository,
something like

publishTo := Some("Local Maven Repository" at
  "file://" + Path.userHome.absolutePath + "/.m2/repository")



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


Re: spark-packages with maven

2016-07-15 Thread Burak Yavuz
Hi Ismael and Jacek,

If you use Maven for building your applications, you may use the
spark-package command line tool (
https://github.com/databricks/spark-package-cmd-tool) to perform packaging.
It requires you to build your jar using Maven first, and then does all the
extra magic that Spark Package requires.
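
A hedged sketch of that flow (the mvn step is standard; the zip subcommand
below is from memory of the tool's README and should be treated as
hypothetical, so verify against the repo):

# build the application jar with Maven first
mvn clean package

# then run the spark-package tool to produce the Spark Packages artifact
# (hypothetical subcommand; verify against the tool's README)
spark-package zip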

Please contact me directly if you have any issues.

Best,
Burak

On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski  wrote:

> +1000
>
> Thanks Ismael for bringing this up! I meant to have sent it earlier too
> since I've been struggling with an sbt-based Scala project for a Spark
> package myself this week and haven't yet found out how to do local
> publishing.
>
> If such a guide existed for Maven I could use it for sbt easily too :-)
>
> Ping me, Ismael, if you don't hear back from the group, so I feel invited
> to dig into the plugin's sources.
>
> Best,
> Jacek
>
> On 15 Jul 2016 2:29 p.m., "Ismaël Mejía"  wrote:
>
> Hello, I would like to know if there is an easy way to package a new
> spark-package with Maven. I just found this repo, but I am not an sbt
> user.
>
> https://github.com/databricks/sbt-spark-package
>
> One more question: is there a formal specification or documentation of
> what you need to include in a spark-package (any special file, manifest,
> etc.)? I have not found any doc on the website.
>
> Thanks,
> Ismael
>
>
>


Re: spark-packages with maven

2016-07-15 Thread Jacek Laskowski
+1000

Thanks Ismael for bringing this up! I meant to have sent it earlier too
since I've been struggling with an sbt-based Scala project for a Spark
package myself this week and haven't yet found out how to do local
publishing.

If such a guide existed for Maven I could use it for sbt easily too :-)

Ping me, Ismael, if you don't hear back from the group, so I feel invited
to dig into the plugin's sources.

Best,
Jacek

On 15 Jul 2016 2:29 p.m., "Ismaël Mejía"  wrote:

Hello, I would like to know if there is an easy way to package a new
spark-package with Maven. I just found this repo, but I am not an sbt user.

https://github.com/databricks/sbt-spark-package

One more question: is there a formal specification or documentation of what
you need to include in a spark-package (any special file, manifest, etc.)?
I have not found any doc on the website.

Thanks,
Ismael


Re: spark packages

2015-05-24 Thread Debasish Das
Yup, netlib's LGPL code right now is activated through a profile... if we
can reuse the same idea, then csparse could also be added to Spark behind
an LGPL flag. But again, as Sean said, it's tricky. Better to keep it on
spark packages for users to try.
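
For reference, profile-gated activation looks like this at build time (a
minimal sketch; -Pnetlib-lgpl is the profile name the Spark build uses for
the LGPL netlib natives):

mvn -Pnetlib-lgpl -DskipTests clean package

The LGPL dependency stays out of default builds and is pulled in only when
the profile is explicitly activated.
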
On May 24, 2015 1:36 AM, Sean Owen so...@cloudera.com wrote:

 I don't believe we are talking about adding things to the Apache project,
 but incidentally LGPL is not OK in Apache projects either.
 On May 24, 2015 6:12 AM, DB Tsai dbt...@dbtsai.com wrote:

 I thought LGPL is okay but GPL is not okay for an Apache project.

 On Saturday, May 23, 2015, Patrick Wendell pwend...@gmail.com wrote:

 Yes - spark packages can include non-ASF licenses.

 On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com
 wrote:
  Hi,
 
  Is it possible to add GPL/LGPL code to spark packages, or must it be
  licensed under Apache as well?
 
  I want to expose Professor Tim Davis's LGPL library for sparse algebra
  and the GPL-licensed ECOS library through the package.
 
  Thanks.
  Deb

 -
 To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
 For additional commands, e-mail: dev-h...@spark.apache.org



 --
 Sent from my iPhone




Re: spark packages

2015-05-24 Thread Sean Owen
I don't believe we are talking about adding things to the Apache project,
but incidentally LGPL is not OK in Apache projects either.
On May 24, 2015 6:12 AM, DB Tsai dbt...@dbtsai.com wrote:

 I thought LGPL is okay but GPL is not okay for an Apache project.

 On Saturday, May 23, 2015, Patrick Wendell pwend...@gmail.com wrote:

 Yes - spark packages can include non-ASF licenses.

 On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com
 wrote:
  Hi,
 
  Is it possible to add GPL/LGPL code to spark packages, or must it be
  licensed under Apache as well?
 
  I want to expose Professor Tim Davis's LGPL library for sparse algebra
  and the GPL-licensed ECOS library through the package.
 
  Thanks.
  Deb

 -
 To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
 For additional commands, e-mail: dev-h...@spark.apache.org



 --
 Sent from my iPhone



Re: spark packages

2015-05-23 Thread Patrick Wendell
Yes - spark packages can include non-ASF licenses.

On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com wrote:
 Hi,

 Is it possible to add GPL/LGPL code to spark packages, or must it be
 licensed under Apache as well?

 I want to expose Professor Tim Davis's LGPL library for sparse algebra and
 the GPL-licensed ECOS library through the package.

 Thanks.
 Deb

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: spark packages

2015-05-23 Thread DB Tsai
I thought LGPL is okay but GPL is not okay for an Apache project.

On Saturday, May 23, 2015, Patrick Wendell pwend...@gmail.com wrote:

 Yes - spark packages can include non-ASF licenses.

 On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com
 wrote:
  Hi,
 
  Is it possible to add GPL/LGPL code to spark packages, or must it be
  licensed under Apache as well?
 
  I want to expose Professor Tim Davis's LGPL library for sparse algebra
  and the GPL-licensed ECOS library through the package.
 
  Thanks.
  Deb

 -
 To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
 For additional commands, e-mail: dev-h...@spark.apache.org



-- 
Sent from my iPhone


Re: spark packages

2015-05-23 Thread Reynold Xin
That's the nice thing about Spark Packages. It is just a package index for
libraries and applications built on top of Spark, not part of the Spark
codebase, so it is not restricted to ASF-compatible licenses.


On Sat, May 23, 2015 at 10:12 PM, DB Tsai dbt...@dbtsai.com wrote:

 I thought LGPL is okay but GPL is not okay for an Apache project.


 On Saturday, May 23, 2015, Patrick Wendell pwend...@gmail.com wrote:

 Yes - spark packages can include non-ASF licenses.

 On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com
 wrote:
  Hi,
 
  Is it possible to add GPL/LGPL code to spark packages, or must it be
  licensed under Apache as well?
 
  I want to expose Professor Tim Davis's LGPL library for sparse algebra
  and the GPL-licensed ECOS library through the package.
 
  Thanks.
  Deb

 -
 To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
 For additional commands, e-mail: dev-h...@spark.apache.org



 --
 Sent from my iPhone