Thanks Florian for taking the time to create this transcript! :)

On Tue, Dec 15, 2015 at 7:42 PM Florian Bruhin <m...@the-compiler.org> wrote:

> We had an (experimental) Google Hangout meeting with some core
> developers today - here are the notes I took, completely unedited
> because I want to go to sleep ;)
>
> If I missed or misrepresented something, please let me know - sorry!
>
> pytest hangout meeting
> ======================
>
> people
> ------
>
> - Bruno/nicoddemus
> - Holger/hpk
> - Florian/The-Compiler
> - Floris/flub
> - abusalimov (pytest-catchlog contributor)
> - Ronny/ronny
>
> ronny: meta-organization
> ------------------------
>
> ronny proposes a meta-organization with an issue tracker and wiki to
> collect tasks/information involving more than one pytest-dev project.
> Something similar to metaflask.
>
> Meta-things like how pytest-dev works, how to release its projects, etc.
>
> Also collect information about the members of the project.
>
> -> Task workflow for management tasks which would feel clunky on a ML
>    https://github.com/pocoo/metaflask
>
>
> Bruno sees some value as there's a plugin index thing there.
>
> flub asks if this isn't basically the same as using a prefix on PyPI.
>
> ronny/bruno: There's much more information there than you could have on PyPI;
> you could add information like compatibility, etc.
>
> flub: Let's try it - worried that it's another data source that could get
> out of date.
>
> ronny: We should re-evaluate it in 6-12 months and see how well it worked.
>
> hpk: plugincompat is already there and interesting - why not just add a
> 'people' page on the website?
>
> flo: I think having a machine-readable representation of plugins and people
> doesn't make sense; even metaflask is a graveyard.
>
> flub: We already have an overview of recommended plugins in the docs; as for
> logging, let's just clarify things there.
>
> ronny: But what about cross-project issues? I don't want those in the pytest
> tracker, e.g. a plugin moving to github. I'd like to see a separate bug
> tracker for ecosystem tasks.
>
> hpk: Why not just a 'cross' label for pytest issues? I don't think it's worth
> having a repo just for issue tracking. The issues are also more visible the
> way they are now.
>
> flo: I don't see why pytest (as an ecosystem) is concerned about other
> projects
> moving.
>
> bruno: Other than movements, what do you think should be there?
>
> ronny: e.g. maintainer applications or moving to pytest-dev?
>
> hpk: I think that's fine on the mailing list (flo points out it's documented
> that way already).
>
> ronny: okay, let's do labels for issues then and use the ML for other
> stuff.
>
>
> hpk: sprint
> -----------
>
> What about a pytest sprint in spring and doing crowd-funding for travel?
>
> ronny: he'd like to ask one of the Plone maintainers who's interested in
> pytest
> and experienced in setting up those sprints.
>
> everyone likes the idea!
>
> ronny: moving other projects to github
> --------------------------------------
>
> ronny is currently working on a better import from bitbucket to github and
> would like to migrate stuff from bb to gh.
>
> flub: the maintainer of that would definitely need to agree, ronny agrees.
>
> -> move every project where the maintainers agree to github
>
> ronny: automated releases
> -------------------------
>
> ronny: pushing a signed tag to github -> the release gets uploaded to PyPI
>
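As a rough illustration of the tag-triggered idea (my own sketch, not anything
decided in the meeting - the script name and the twine invocation are
assumptions, though TRAVIS_TAG really is set by Travis on tag builds):

    # release_if_tag.py - hypothetical final step of a CI build (sketch only;
    # a real version would also verify the tag's GPG signature first)
    import glob
    import os
    import subprocess

    def maybe_release():
        tag = os.environ.get("TRAVIS_TAG")  # Travis CI sets this for tag builds
        if not tag:
            print("not a tag build, skipping upload")
            return
        subprocess.check_call(["python", "setup.py", "sdist", "bdist_wheel"])
        # credentials would come from the CI environment (e.g. a .pypirc
        # written during the build), not from this script
        subprocess.check_call(["twine", "upload"] + glob.glob("dist/*"))

    if __name__ == "__main__":
        maybe_release()
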
> hpk: that's the golden aim, but regendoc is still not completely automated.
>
> ronny: I want to work on that, with Travis creating a PR every time
> something
> changes via regendoc.
>
> florian/ronny: Only signed tags should cause releases!
>
> hpk: I'm still very sceptical about that - it often takes a couple of
> tries to
> release something.
>
> ronny: but you can create rc tags, etc. etc. until things look right.
>
> hpk: I like to create an artefact, test that, and release *exactly that*.
> With
> a tag workflow you lose that ability.
>
> ronny: But it would work with reproducible builds. The build system should
> create the same artefact.
>
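To make that exchange concrete: hpk wants to publish exactly the artefact he
tested, and ronny's answer is that reproducible builds let you verify that,
e.g. by hashing two independent builds of the same tag (file names below are
made up for the example):

    # sketch: a reproducible build lets you check that the artefact you tested
    # and the artefact rebuilt from the tag are byte-identical
    import hashlib

    def sha256(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # hypothetical paths - one from the tested build, one rebuilt from the tag
    assert sha256("tested/pytest-2.8.5.tar.gz") == sha256("rebuilt/pytest-2.8.5.tar.gz")
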
> hpk: Let's do a PR against howtorelease. I want an automated process, I
> just
> don't think tagging is the right way to do that.
>
> -> The work is a good idea either way, no matter whether we trigger the
> process
> via tags or another mechanism.
>
> bruno: At work we create a release branch, that branch generates artefacts
> which aren't published yet. I can *manually* publish and tag it then.
>
> flo: I'd like it most if *testing* was automated.
>
> bruno: What about using devpi via travis?
>
> ronny: We could have a release branch and then make travis do something
> different there.
>
> general idea: PR -> release branch -> travis uploads a release somewhere
> and
> tests it -> you can publish that artifact
>
> holger: I'm doing releases for 7-8 projects - we do want to be sure wheels
> are working properly. Wheels have already been broken in the past.
>
> flub: But that works already with an option? I manually build with
> setup.py and
> use tox to test those?
>
> ==> we can all agree on automating steps and the general idea
>
> holger: other issues are the changelog or merging features/master. Usually
> when
> I merge something into master I merge it into features as well.
>
> holger: What about doing auto-merges of master -> features via a bot?
>
> flo: I think this is too noisy, what about weekly or so?
>
> holger: We don't really have a way to do this systematically so far.
>
> bruno: Why not just before every master/bugfix release? If we fix a bug
> we'll
> do a release anyways, so why not do it then?
>
> (holger leaves)
>
> ronny: subtests
> ---------------
>
> a new concept introduced in unittest with py3 - you can have sub-tests,
> which means the test still continues when an assertion fails.
>
> this would allow creating "sections" with proper names (and reporting) for
> larger tests.
>
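For those who haven't seen it: this is the unittest.TestCase.subTest API added
in Python 3.4. A failing assertion inside the with-block is recorded as a
separate failure, but the surrounding test keeps running, e.g.:

    import unittest

    class TestEven(unittest.TestCase):
        def test_even(self):
            for i in range(4):
                with self.subTest(i=i):
                    # failures for i=1 and i=3 are reported individually,
                    # but the remaining iterations still run
                    self.assertEqual(i % 2, 0)

    if __name__ == "__main__":
        unittest.main()
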
> flub: So what's the challenge about that?
>
> [I couldn't really follow, so no logs here, sorry]
>
> ronny: capturing
> ----------------
>
> ronny: We currently replace the sys.std... objects with capturing ones. I'd
> like to add a kind of capturing that captures the whole test run.
>
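Roughly what "replacing the sys.std... objects with capturing ones" means - a
stripped-down sketch of the general technique, not pytest's actual
implementation:

    import io
    import sys

    class CaptureStdout:
        """Swap sys.stdout for an in-memory buffer while a test runs."""
        def start(self):
            self._old = sys.stdout
            sys.stdout = self._buf = io.StringIO()

        def stop(self):
            sys.stdout = self._old
            return self._buf.getvalue()

    cap = CaptureStdout()
    cap.start()
    print("output produced by the test")
    assert cap.stop() == "output produced by the test\n"
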
> [I couldn't really follow, so no logs here, sorry]
>
> --> let's move it to the ML because it's better when written
>
> deferred until later
> --------------------
>
> - unittests in pytest testsuite
> - DI framework
> - 3.x plans
>
> -- Florian
>
> --
> http://www.the-compiler.org | m...@the-compiler.org (Mail/XMPP)
>    GPG: 916E B0C8 FD55 A072 | http://the-compiler.org/pubkey.asc
>          I love long mails! | http://email.is-not-s.ms/
> _______________________________________________
> pytest-dev mailing list
> pytest-dev@python.org
> https://mail.python.org/mailman/listinfo/pytest-dev
>