Thank you both
Steve, that's a very interesting point. I have to admit I had never thought
of doing analysis over time on the tests, but it makes sense, as the failures
over time tell you quite a bit about your data platform.
Thanks for highlighting! We are using PySpark for now so I hope some
I agree the reporting is an important aspect. SonarQube (or a similar tool) can
report over time, but it does not support Scala (only indirectly via JaCoCo). In
the end, you will need to think about a dashboard that displays results over
time.
> On 14 Mar 2017, at 12:44, Steve Loughran wrote:
>
> On 13 Mar 2017, at 13:24, Sam Elamin wrote:
Hi Jorn
Thanks for the prompt reply. Really, we have two main concerns with CD: ensuring
tests pass and linting the code.
I'd add "providing diagnostics when tests fail", which is a
Hi Jorn
Thanks for the prompt reply. Really, we have two main concerns with CD:
ensuring tests pass and linting the code.
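For what it's worth, both checks can run cheaply in CI if the transformation logic is kept in plain Python functions: pytest can exercise them without a SparkSession, and the same modules can be linted. A minimal sketch (the function name and values below are made up for illustration, not from any real job):

```python
# Hypothetical helper from a PySpark job: pure logic factored out of the
# DataFrame code so it can be unit-tested in CI without a cluster.

def normalize_amount(amount_pence: int) -> float:
    """Convert an integer amount in pence to pounds."""
    return round(amount_pence / 100.0, 2)


def test_normalize_amount():
    # Runs under plain pytest in the CI job; no SparkSession needed.
    assert normalize_amount(1999) == 19.99
    assert normalize_amount(0) == 0.0
```

In the actual job the helper would be wrapped in a UDF or applied per row, while CI only ever imports and tests the pure function.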
I think all platforms should handle this with ease; I was just wondering
what people are using.
Jenkins seems to have the best Spark plugins, so we are investigating that
as
Hi,
Jenkins also now supports pipeline as code and multibranch pipelines, so you
are not as dependent on the UI and you no longer need a long list of jobs
for different branches. Additionally, it has a new UI (beta) called Blue Ocean,
which is a little bit nicer. You may also check out GoCD.
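To make "pipeline as code" concrete: with a Jenkinsfile checked into the repository, a multibranch pipeline job discovers and runs it on every branch automatically. A minimal declarative sketch (the stage names and shell commands are assumptions for a PySpark project, not from this thread):

```groovy
// Hypothetical Jenkinsfile: pipeline as code for a PySpark project.
// A multibranch pipeline job picks this file up on each branch, so no
// per-branch jobs need to be configured in the Jenkins UI.
pipeline {
    agent any
    stages {
        stage('Lint') {
            steps {
                sh 'flake8 src/'    // assumed lint command
            }
        }
        stage('Test') {
            steps {
                sh 'pytest tests/'  // assumed test command
            }
        }
    }
}
```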
Hi Folks
This is more of a general question: what's everyone using for their CI/CD
when it comes to Spark?
We are using PySpark but are potentially looking to move to Spark with Scala
and SBT in the future.
One of the suggestions was Jenkins, but I know the UI isn't great for new
starters, so I'd rather