#17365: Extend test discovery to include unittest2 test suite runner
-----------------------------------+------------------------------------
Reporter: jezdez | Owner: nobody
Type: New feature | Status: new
Component: Testing framework | Version: SVN
Severity: Normal | Resolution:
Keywords: | Triage Stage: Accepted
Has patch: 0 | Needs documentation: 0
Needs tests: 0 | Patch needs improvement: 0
Easy pickings: 0 | UI/UX: 0
-----------------------------------+------------------------------------
Comment (by carljm):
Replying to [comment:12 russellm]:
> +1 to everything Jacob said, with one caveat: There is a historical
> reason for looking for tests in models.py -- that's the way you could
> easily find doctests on any of your model methods. I'm not arguing that
> this is a reason to include models.py, but given that there was method in
> the madness of including models.py in the search path in the first place,
> if we choose to remove models.py, we need to make sure the change is
> documented.
Certainly I think all changes in behavior, including models.py no longer
being included in test discovery, need to be well documented in the
release notes, and the testing documentation needs to be updated to match
the new behavior.
> I'm also interested to know how this would impact any plans to
> introduce better support for "suites"; I've had a couple of discussions
> in recent times, as well as a long-running desire to provide a way to
> separate true unit tests (e.g., check that contrib.auth works) from
> integration tests (e.g., checking that contrib.auth is deployed
> correctly) from acceptance tests (e.g., testing that contrib.auth works
> in practice against a browser test suite). I haven't given any specific
> thought to how this would be implemented, but if we're going to make a
> change to test discovery, it would be good to know that the idea has
> been considered, even if we don't deliver on the actual capability.
So my best idea of how this could be implemented would be to make use of
unittest2's skipIf/skipUnless decorators, in combination with an option to
`manage.py test` that sets some arbitrary "flags" the skip decorators can
check - for instance, a decorator that lets you annotate a test with
`@skipUnlessFlag("integration")`, paired with `python manage.py test
--flag=integration`. This is just a rough idea with very little research
behind it - I'd want to take a closer look at how e.g. nose and pytest
handle test annotations, and whether unittest2 already provides something
in this direction that I'm not aware of. Open questions also include
whether it should be a generic "test annotation/flag" system, or something
more specifically targeted at splitting "unit" from "integration" tests
(which IMO would only make sense if it also came with some automatic
settings-isolation features for the unit tests).
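To make that concrete, here's a minimal sketch of what such a decorator
could look like. To be clear, `skipUnlessFlag`, the `--flag` option, and
the `TEST_FLAGS` environment variable are all names invented for this
sketch - nothing like them exists in Django or unittest2 today, and a real
patch would have `manage.py test` populate the active flag set rather than
reading the environment directly:
{{{
#!python
import os
import unittest2


def _active_flags():
    # Hypothetical plumbing: a real patch would have `manage.py test
    # --flag=integration` populate this set; here we fake it with an
    # environment variable so the sketch is self-contained.
    return set(filter(None, os.environ.get("TEST_FLAGS", "").split(",")))


def skipUnlessFlag(flag):
    # Build on the stock skipUnless decorator: the test runs only if
    # the named flag was activated for this test run.
    return unittest2.skipUnless(
        flag in _active_flags(),
        "test requires the %r flag" % flag)


class AuthIntegrationTests(unittest2.TestCase):

    @skipUnlessFlag("integration")
    def test_auth_deployed(self):
        pass  # would check that contrib.auth is deployed correctly
}}}
With this sketch, `TEST_FLAGS=integration python manage.py test` would run
the test, while a plain `python manage.py test` would report it as
skipped.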
Anyway, I think that's all pretty much orthogonal to test discovery, which
is what this patch is about. I'd be opposed to any solution for splitting
unit and integration tests that relied on putting the different types of
tests in magical locations; that's the only type of solution I can think
of that would intersect with the concerns of this ticket.
--
Ticket URL: <https://code.djangoproject.com/ticket/17365#comment:14>
Django <https://code.djangoproject.com/>
The Web framework for perfectionists with deadlines.
--
You received this message because you are subscribed to the Google Groups
"Django updates" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/django-updates?hl=en.