+1 I am looking forward to it too.

On Fri, 02 Dec 2022 23:12:36 -0800, j...@nanthrax.net wrote:

Hi,

yes, I think docker (and possibly k8s yaml files or a helm chart) is way better.

I think we could include it in the project and update the getting started
guide accordingly.
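
For the getting started guide, it could end up being as simple as
something like this (just a sketch; the image name and Dockerfile
location are assumptions until something is actually in the repo, and
8998 is Livy's default server port):

  # build the dev image from a Dockerfile checked into the Livy repo (hypothetical path)
  docker build -t livy-dev -f dev/Dockerfile .
  # run it, exposing the Livy server's default port
  docker run -it -p 8998:8998 livy-dev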

Regards
JB

On Sat, Dec 3, 2022 at 2:44 AM larry mccay <lmc...@apache.org> wrote:
>
> Yes, the docker-based approach is a great way to go.
> @Damon - this sounds terrific - looking forward to it next week!
>
>
>
> On Fri, Dec 2, 2022 at 8:20 PM Damon Cortesi <dac...@apache.org> wrote:
>
> > Coming back to this as I get my dev environment up and running, there's
> > definitely a tangle of dependencies between Spark, Python, and R that
> > I'm still working out.
> >
> > For example, when I try to start sparkR I get an error message that
> > "package ‘SparkR’ was built under R version 4.0.4", but locally I have R
> > version 3.5.2 installed. Spark 3.3.1 says you need R 3.5+. That said, I
> > think my version of R works with Spark 2 (at least the tests indicate
> > that...)
> >
> > It'd be great to have a minimum viable environment with specific versions
> > and I hope to have that in a Docker environment by early next week. :)
> >
> > Currently I'm just basing it off a Debian image with Java 8, although
> > there are Spark images that could be useful...
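> >
> > As a very rough sketch, the provisioning steps could look something like
> > the below, whether they end up as RUN steps in a Dockerfile or as a plain
> > setup script (every version here is still an assumption to be pinned
> > properly):
> >
> >   # assuming a Debian base where a JDK 8 package is available (exact source TBD)
> >   apt-get update && apt-get install -y openjdk-8-jdk maven python3 python3-pip r-base
> >
> >   # grab a Spark release and point SPARK_HOME at it (3.3.1 is just a placeholder)
> >   curl -sL https://archive.apache.org/dist/spark/spark-3.3.1/spark-3.3.1-bin-hadoop3.tgz \
> >     | tar -xzf - -C /opt
> >   export SPARK_HOME=/opt/spark-3.3.1-bin-hadoop3
> >
> >   # build Livy itself from the source checkout, skipping tests for a first pass
> >   mvn package -DskipTests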
> >
> > Damon
> >
> > On 2022/11/20 18:55:35 larry mccay wrote:
> > > Considering there is no download for anything older than 3.2.x on the
> > > referenced download page, we likely need some change to the README.md
> > > to reflect a more modern version.
> > > We also need more explicit instructions for installing Spark than just
> > > the download. Whether we detail this ourselves or point to Spark docs
> > > that are sufficient is certainly a consideration.
> > >
> > > At the end of the day, we are missing any sort of quick start guide
> > > that lets devs successfully build and/or run tests.
> > >
> > > Thoughts?
> > >
> > > On Sat, Nov 19, 2022 at 6:23 PM larry mccay <larry.mc...@gmail.com> wrote:
> > >
> > > > Hey Folks -
> > > >
> > > > Our Livy README.md indicates the following:
> > > >
> > > > To run Livy, you will also need a Spark installation. You can get Spark
> > > > releases at https://spark.apache.org/downloads.html.
> > > >
> > > > Livy requires Spark 2.4+. You can switch to a different version of
> > > > Spark by setting the SPARK_HOME environment variable in the Livy
> > > > server process, without needing to rebuild Livy.
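> > > >
> > > > In practice I read that as something like the following (just a sketch,
> > > > with an arbitrary Spark version/path as the example):
> > > >
> > > >   # point Livy at whichever Spark installation you want it to use
> > > >   export SPARK_HOME=/opt/spark-3.3.1-bin-hadoop3
> > > >   ./bin/livy-server start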
> > > >
> > > > Do we have any variation on this setup at this point in the real world?
> > > >
> > > > What do your dev environments actually look like, and which versions
> > > > of Spark are you installing as a dependency (and how)?
> > > >
> > > > Thanks!
> > > >
> > > > --larry
> > > >
> > >
> >
