Announcing Spark Packages

2014-12-22 Thread Xiangrui Meng
Dear Spark users and developers,

I’m happy to announce Spark Packages (http://spark-packages.org), a
community package index to track the growing number of open source
packages and libraries that work with Apache Spark. Spark Packages
makes it easy for users to find, discuss, rate, and install packages
for any version of Spark, and makes it easy for developers to
contribute packages.

Spark Packages will feature integrations with various data sources,
management tools, higher-level domain-specific libraries, machine
learning algorithms, code samples, and other Spark content. Thanks to
the package authors, the initial listing of packages includes
scientific computing libraries, a job execution server, a connector
for importing Avro data, tools for launching Spark on Google Compute
Engine, and many others.

I’d like to invite you to contribute and use Spark Packages and
provide feedback! As a disclaimer: Spark Packages is a community index
maintained by Databricks and (by design) will include packages outside
of the ASF Spark project. We are excited to help showcase and support
all of the great work going on in the broader Spark community!

Cheers,
Xiangrui

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Announcing Spark Packages

2014-12-22 Thread Andrew Ash
Hi Xiangrui,

That link is currently returning a "503 Over Quota" error message. Would you
mind pinging back out when the page is back up?

Thanks!
Andrew

On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng men...@gmail.com wrote:




Re: Announcing Spark Packages

2014-12-22 Thread Patrick Wendell
Xiangrui asked me to report that it's back up and running :)

On Mon, Dec 22, 2014 at 3:21 PM, peng pc...@uowmail.edu.au wrote:
 Me 2 :)







Re: Announcing Spark Packages

2014-12-22 Thread Hitesh Shah
Hello Xiangrui, 

If you have not already done so, you should look at 
http://www.apache.org/foundation/marks/#domains for the policy on use of ASF 
trademarked terms in domain names. 

thanks
— Hitesh

On Dec 22, 2014, at 12:37 PM, Xiangrui Meng men...@gmail.com wrote:






Re: Announcing Spark Packages

2014-12-22 Thread Patrick Wendell
Hey Nick,

I think Hitesh was just trying to be helpful and point out the policy
- not necessarily saying there was an issue. We've taken a close look
at this and I think we're in good shape here vis-à-vis this policy.

- Patrick

On Mon, Dec 22, 2014 at 5:29 PM, Nicholas Chammas
nicholas.cham...@gmail.com wrote:
 Hitesh,

 From your link:

 "You may not use ASF trademarks such as 'Apache' or 'ApacheFoo' or 'Foo' in
 your own domain names if that use would be likely to confuse a relevant
 consumer about the source of software or services provided through your
 website, without written approval of the VP, Apache Brand Management or
 designee."

 The title on the packages website is "A community index of packages for
 Apache Spark." Furthermore, the footnote of the website reads "Spark
 Packages is a community site hosting modules that are not part of Apache
 Spark."

 I think there's nothing on there that would confuse a relevant consumer
 about the source of software. It's pretty clear that the Spark Packages
 name is well within the ASF's guidelines.

 Have I misunderstood the ASF's policy?

 Nick








Re: Announcing Spark Packages

2014-12-22 Thread Nicholas Chammas
Okie doke! (I just assumed there was an issue since the policy was brought
up.)

On Mon Dec 22 2014 at 8:33:53 PM Patrick Wendell pwend...@gmail.com wrote:
