+1

I would also strongly suggest that people try out the release against their
own codebases. This has the benefit of ensuring the release won't break
your own code when it goes out, and it stress-tests the new code against
real-world pipelines. (Ideally our own tests all pass and this validation
is automated as much as possible, though confirming that the release
matches our documentation and works in a clean environment still has
value; there's also a lot of code and usage out there that we don't have
access to during normal Beam development.)
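
For anyone who hasn't done this before, here is a minimal sketch of one way
to validate a candidate against your own code. The version number is
hypothetical (check the vote thread for the actual RC under vote), and the
test command is whatever your own project uses:

```shell
# Create a clean environment so the validation isn't affected by
# previously installed packages.
python -m venv beam-rc-test
. beam-rc-test/bin/activate

# Install the release candidate. 2.52.0rc1 is a placeholder version;
# substitute the RC named in the vote email.
pip install "apache-beam==2.52.0rc1"

# Run your own project's test suite / pipelines against the candidate.
pytest tests/
```

The clean environment matters: it also checks that the candidate installs
and imports correctly on its own, not just that it works alongside whatever
happens to be on your machine.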

On Tue, Oct 17, 2023 at 8:21 AM Svetak Sundhar via dev <dev@beam.apache.org>
wrote:

> Hi all,
>
> I’ve participated in RC testing for a few releases and have observed a bit
> of a knowledge gap in how releases can be tested. Given that Beam
> encourages contributors to vote on RC’s regardless of tenure, and that
> voting on an RC is a relatively low-effort, high leverage way to influence
> the release of the library, I propose the following:
>
> During the vote for the next release, voters can document the process they
> followed on a separate document, and add the link on column G here
> <https://docs.google.com/spreadsheets/d/1qk-N5vjXvbcEk68GjbkSZTR8AGqyNUM-oLFo_ZXBpJw/edit#gid=437054928>.
> One step further could be recording a screencast of the test run and
> attaching a link to that.
>
> We can keep repeating this through releases until we have documentation
> for many of the different tests. We can then add these docs into the repo.
>
> I’m proposing this because I’ve gathered the following feedback from
> colleagues who are tangentially involved with Beam: they are interested in
> participating in release validation, but don’t know how to get started.
> Happy to hear other suggestions to address the above, if there are any.
>
> Thanks,
>
>
> Svetak Sundhar
>
>   Data Engineer
> svetaksund...@google.com
>
>
