[repost to mailing list]

I don't know much about Spark Packages myself, but have you heard of the
sbt-spark-package plugin?
Looking at the code, specifically
https://github.com/databricks/sbt-spark-package/blob/master/src/main/scala/sbtsparkpackage/SparkPackagePlugin.scala,
might give you insight into the details of package creation. Package
submission is implemented in
https://github.com/databricks/sbt-spark-package/blob/master/src/main/scala/sbtsparkpackage/SparkPackageHttp.scala
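
For what it's worth, the package-creation side roughly corresponds to an
sbt build like the sketch below. The plugin coordinates, version, resolver
URL, and all values are placeholders from memory that I haven't verified,
so double-check them against the plugin README:

  // project/plugins.sbt -- wires in the plugin; version and resolver
  // are assumptions on my part
  resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven/"
  addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.3")

  // build.sbt -- minimal Spark Package settings (all values are placeholders)
  spName := "myorg/my-spark-package"  // must match the GitHub "org/repo" name
  sparkVersion := "1.5.1"             // Spark version to compile against
  sparkComponents += "sql"            // adds spark-sql as a provided dependency

If I read the code right, running "sbt spPublish" is what then bundles
and submits the release.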

At a quick first glance, it seems packages are bundled as Maven artifacts
and then posted to http://spark-packages.org/api/submit-release.
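
In case a concrete picture helps, here is a rough, self-contained Scala
sketch of what such a POST could look like. The endpoint is the one above,
but the form fields are pure guesses on my part; the real payload is
assembled in SparkPackageHttp.scala:

  import java.net.{HttpURLConnection, URL}
  import java.nio.charset.StandardCharsets

  // Hedged sketch of a release submission; the field names below are
  // hypothetical, only the endpoint comes from the plugin source.
  object SubmitReleaseSketch {
    def main(args: Array[String]): Unit = {
      val body = "git_commit_sha1=abc123&version=0.1.0" // hypothetical fields
      val conn = new URL("http://spark-packages.org/api/submit-release")
        .openConnection().asInstanceOf[HttpURLConnection]
      conn.setRequestMethod("POST")
      conn.setDoOutput(true)
      conn.setRequestProperty("Content-Type",
        "application/x-www-form-urlencoded")
      conn.getOutputStream.write(body.getBytes(StandardCharsets.UTF_8))
      println("HTTP " + conn.getResponseCode) // server's verdict on the release
      conn.disconnect()
    }
  }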

Hope this helps with your last question.

On 16 October 2015 at 08:43, jeff saremi <jeffsar...@hotmail.com> wrote:

> I'm looking for any form of documentation on Spark Packages.
> Specifically, what happens when one issues a command like the following:
>
>
> $SPARK_HOME/bin/spark-shell --packages RedisLabs:spark-redis:0.1.0
>
>
> Something like an architecture diagram.
> What happens when this package gets submitted?
> Does this need to be done each time?
> Is that package downloaded each time?
> Is there a persistent cache on the server (the master, I guess)?
> Can these packages be installed offline with no Internet connectivity?
> How does a package get created?
>
> and so on and so forth
>
