Paul,

Some pages on the wiki cover general guidelines for Dev, but there is no
page with detailed guidelines for test developers.
I would like to add a new page in the QA section for this. I will send a
separate mail requesting write access to the wiki.

Daan,

I will have to check on this. I will find out how we can include static
analysis for tests, similar to what is in place for dev code (FindBugs).

Regards,
Gaurav

On Wed, Mar 18, 2015 at 8:22 PM, Daan Hoogland <daan.hoogl...@gmail.com>
wrote:

> Good write-up, Gaurav. I hope that most of these can be
> validated/verified checkstyle-style; pep8 can, of course. Some others
> must remain judged by humanoids, like the one where functions are
> pulled up to higher abstraction layers. Maybe you could shine your
> light on what we can automate? I.e., can pyflakes be added to a target?
>
> > -----Original Message-----
> > From: Gaurav Aradhye [mailto:gaurav.arad...@clogeny.com]
> > Sent: 18 March 2015 07:16
> > To: dev@cloudstack.apache.org
> > Subject: Guidelines for test developers
> >
> > Hello all,
> >
> > Last year, after improving the Marvin framework, we started
> > continuously spending time improving old test cases that are written
> > in the old style, don't abide by certain guidelines, or don't use the
> > new functions available in Marvin. Often a test developer adding a
> > test case for the first time, or a feature developer adding Basic
> > Validation Tests (BVTs), will copy-paste code from an existing test
> > case, modify it to fit the feature, and commit it. This adds to the
> > inconsistencies.
> >
> > Whenever I touch a file to fix an issue or to add or edit something,
> > I try to apply the guidelines below and improve the existing code.
> > Still, a few test files are not up to the mark. Our final goal is to
> > have good code in every file, so I am writing this mail to consolidate
> > a few rules that everyone adding tests to Marvin should know and
> > consider. Also, if you touch a file, feel free to remove any
> > inconsistencies that are already present in it.
> >
> > *1. Import * should always be avoided.* When I started two years back
> > and tried to understand the code flow and test cases, I could not
> > easily tell where a particular module was imported from. Imports must
> > be specific.
> >
> > When imports are specific, you eliminate the possibility of a test
> > case failing because a name it silently depended on is removed from a
> > dependent module.
> >
> > E.g. suppose your test case has the following import:
> >
> > from A import *
> >
> > And it uses the time module, which is not imported explicitly in the
> > test case but comes from module A. Then the test case will start
> > failing when "import time" is removed from module A. You certainly
> > don't want this to happen.
> >
> > *2. Maintain pep8 standards for Python code.*
> >
> > Code is read more often than it is written. The pep8 standards
> > improve the readability of the code by making its style consistent.
> > There is a tool named "*autopep8*" which you can install with pip;
> > then you can run the following command on your test file:
> >
> > autopep8 -i -a -a testFile.py
> >
> > This will make the file pep8-consistent and will also remove trailing
> > whitespace. But some issues need human intervention and can't be
> > fixed by the tool. For those, list the remaining issues with "pep8
> > testFile.py" and fix them manually.
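> >
> > As a rough illustration, this is the kind of thing the tool
> > normalizes (a made-up Marvin-style snippet, before and after):
> >
> > # Before: runs fine, but violates pep8 spacing rules
> > def deploy_vm( self,services ):
> >     vm=VirtualMachine.create(self.apiclient,services)
> >     return vm
> >
> > # After "autopep8 -i -a -a": whitespace normalized automatically;
> > # naming and structural problems still need a human.
> > def deploy_vm(self, services):
> >     vm = VirtualMachine.create(self.apiclient, services)
> >     return vm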
> >
> > *3. Keep only the imports that are used* in the test suite and remove
> > unused ones.
> >
> > *4. Keep all the configuration parameters* (such as the data passed
> > to the API while creating network offerings, service offerings,
> > accounts, etc.) *in the tools/marvin/marvin/config/test_data.py
> > file*. Don't include them in the test suite itself.
> >
> > Many of the dictionaries are reusable, so if you are adding a new
> > test, there are only a few dictionaries you will have to add to the
> > file.
> >
> > If any of the data contains URLs, or anything else that must change
> > with the setup/environment, then put that dict in the
> > "*configurableData*" section of test_data.py. This makes it easier to
> > identify which data needs to vary with the setup and which data
> > doesn't need to be touched when the environment changes.
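> >
> > The pattern looks roughly like this (a sketch; the exact keys and
> > values are illustrative, assuming the usual
> > getParsedTestDataConfig() accessor on the test client):
> >
> > # In tools/marvin/marvin/config/test_data.py:
> > test_data = {
> >     "service_offering": {
> >         "name": "Tiny Instance",
> >         "displaytext": "Tiny Instance",
> >         "cpunumber": 1,
> >         "cpuspeed": 100,
> >         "memory": 128,
> >     },
> >     "configurableData": {
> >         # env-specific values such as URLs belong here
> >         "upload_volume": {"url": "http://<your-server>/volume.vhd"},
> >     },
> > }
> >
> > # In the test suite, read the parsed config instead of hard-coding:
> > cls.services = cls.testClient.getParsedTestDataConfig()
> > offering = ServiceOffering.create(
> >     cls.apiclient, cls.services["service_offering"])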
> >
> > *5. Before committing a test case, run it* against the latest code of
> > the branch to which you are adding it, and attach the results to the
> > Pull Request.
> > If the change is very small and doesn't need a run, then at least
> > check for syntax errors with the python command (e.g. python -m
> > py_compile testFile.py) and with tools such as pyflakes.
> >
> > 6. If you add a new function in your test case and you think it can
> > be used by other test cases in the future, then please *add that
> > function to the common or utils file* in Marvin. Don't keep it local
> > to the test case. This prevents multiple contributors from each adding
> > the same function to their own test cases to achieve a particular
> > goal.
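> >
> > For example (a sketch; wait_for_vm_state is a made-up helper name
> > used only for illustration):
> >
> > # Instead of defining a general-purpose helper inside one suite,
> > # put it in marvin/lib/common.py (or utils.py) next to existing
> > # helpers like get_zone and get_template, then import it:
> > from marvin.lib.common import wait_for_vm_state
> >
> > wait_for_vm_state(self.apiclient, vm.id, "Running")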
> >
> > *7. Please make sure all the resources created by the test cases are
> > deleted* when test case execution completes, even when the test case
> > fails.
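> >
> > A minimal sketch of the usual pattern, assuming the standard
> > cleanup_resources helper from marvin.lib.utils:
> >
> > from marvin.cloudstackTestCase import cloudstackTestCase
> > from marvin.lib.base import VirtualMachine
> > from marvin.lib.utils import cleanup_resources
> >
> > class TestExample(cloudstackTestCase):
> >     def setUp(self):
> >         self.apiclient = self.testClient.getApiClient()
> >         self.services = self.testClient.getParsedTestDataConfig()
> >         self.cleanup = []
> >
> >     def tearDown(self):
> >         # runs whether the test passed or failed, so nothing leaks
> >         cleanup_resources(self.apiclient, self.cleanup)
> >
> >     def test_deploy_vm(self):
> >         vm = VirtualMachine.create(
> >             self.apiclient, self.services["virtual_machine"])
> >         # register immediately, before any assertion can fail
> >         self.cleanup.append(vm)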
> >
> > 8. If the same test case is to be run with different configurations
> > or settings, you can *make use of the ddt library*. For example, if
> > you have added a test case for isolated networks and need to run the
> > same code for shared and VPC networks, you don't need to add three
> > test cases. Just add the relevant tags to the test case and you are
> > good to go, although you will need to write the code that handles
> > those tags. It is already used in a few test cases; a simple grep over
> > the component folder will show you how.
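> >
> > In short, it looks something like this (a sketch; the class name and
> > tag values are made up):
> >
> > from ddt import ddt, data
> > from marvin.cloudstackTestCase import cloudstackTestCase
> >
> > @ddt
> > class TestNetworkOps(cloudstackTestCase):
> >
> >     @data("isolated", "shared", "vpc")
> >     def test_deploy_vm_in_network(self, network_type):
> >         # one body runs three times, once per tag; branch on the
> >         # tag only for the steps that differ per network type
> >         if network_type == "vpc":
> >             pass  # e.g. create the VPC and a tier first
> >         # ...common deploy/verify steps...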
> >
> > This blog explains how it works.
> >
> > https://technomilk.wordpress.com/2012/02/12/multiplying-python-unit-test-cases-with-different-sets-of-data/
> >
> > I will check whether this is on any wiki page currently and edit it,
> > or I will add a new page.
> > I hope everyone adding test cases follows the above guidelines. Feel
> > free to add more.
> >
> > Regards,
> > Gaurav
>
> --
> Daan
>
