> - improve simulator for runtime testability
> - customize to model and inject failures
> - make a habit of writing tests around bug reports (I started tagging tests
>   since api_refactoring on JIRA already; look for the integration-test label)
> - make integration testing easier using factories and DSLs (from
>   Chiradeep)

Sorry for the long thread again. My overall opinion is "don't get distracted
by tools; improving Marvin should be good enough". Below are points that you
can simply skip.

I once created a DSL in Python, after studying "Domain-Specific Languages" (by
Martin Fowler). From my experience, I feel a DSL is not suitable for CloudStack
integration testing. Here are my points:

1. Syntax puzzle
The DSL I created is what is known as an "internal DSL", which builds on the
syntax of a host language. An internal DSL saves you the time of writing a
parser, but it is limited by the host language's abilities. For example:

Target(
    name='deploy',
    actions=[
        Action(id='zone', name='createZone', args=IN(name='testZone')),
        Action(id='pod', name='createPod',
               args=IN(name='testPod', zoneUuid=O('zone').uuid)),
    ],
    depends=['deploydb'],
)

RunTarget('deploy')

This is an example of my DSL applied to a CloudStack test; you can see a few
constructs that are not intuitive. With a well-defined library, the same case
could be as simple as:

import actions
from nose.tools import timed

@timed(600)  # fail if the test takes longer than 600 seconds
def test_deploy():
    zone = actions.createZone(name='testZone')
    pod = actions.createPod(name='testPod', zoneUuid=zone.uuid)

# no explicit call needed; the nose runner discovers test_* functions

which is very Pythonic; every Python programmer can easily understand it.
Maybe I am an inexperienced DSL writer, but the syntax puzzle is a common
problem for DSLs, and I really think we gain little by introducing such a
thing into our test framework.

2. Capability
A DSL, unlike a general-purpose language, usually lacks loops, conditional
statements, and sometimes even variables. But these constructs are very
important to our tests. If you strip these abilities from your DSL, I believe
you will eventually add them back, ending up with yet another general-purpose
language, and by then you will have been thoroughly distracted from writing
test cases.
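
For example (a hypothetical sketch; the actions module and its queryVm call
are made-up names following the example above, not a real CloudStack API):
polling a resource until it reaches a state needs a loop and a condition,
which is trivial in plain Python but painful in a stripped-down DSL:

import time

import actions  # hypothetical helper library, as in the example above

def wait_for_vm_running(vm_id, timeout=600, interval=5):
    # loop + condition: poll until the VM is Running or we time out
    deadline = time.time() + timeout
    while time.time() < deadline:
        vm = actions.queryVm(id=vm_id)  # made-up call, for illustration only
        if vm.state == 'Running':
            return vm
        time.sleep(interval)
    raise AssertionError('VM %s not Running within %ss' % (vm_id, timeout))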

3. It's not our business right now
The most urgent need for CloudStack is a set of reliable test cases, which I
think can be met by improving the current Marvin. Building our own DSL
distracts us at this point.

4. You don't even need to search for a test framework for the time being
When I was studying DSLs, I looked into some well-known test frameworks. For
behavior-driven testing I looked at Cucumber (Ruby); for model-driven testing
I looked at fpmt (the name may be wrong; it was written by a French developer).
They both use DSLs. My conclusion is that the benefit of a DSL lies in things
that have nothing to do with the test logic itself: for example, driving test
cases, producing documents/reports, scheduling random test case
combinations ...

Those are nice to have, but not urgent. So for our purposes, I would suggest
using Python's nose, which is a unit test framework but can still be used for
integration tests. Simple, quick, and easy.
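
To show what I mean (a minimal sketch; the actions module is the same
hypothetical helper library as above): nose discovers any function named
test_* and supports plain module-level fixtures, which is already enough
structure for an integration test:

# test_zone.py
import actions  # hypothetical helper library, as in the examples above

def setup_module():
    # nose runs this once before the tests in this module
    actions.deployDb()  # made-up call, for illustration only

def test_create_zone():
    zone = actions.createZone(name='testZone')
    assert zone.uuid is not None

Running it is just "nosetests test_zone.py", no extra machinery required.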


> (part 1 of this work was started on this in the marvin-refactor
> branch)
> 
> More comments inline
> 
> On Wed, Mar 06, 2013 at 12:12:11AM +0530, Alex Huang wrote:
> > Hi All,
> >
> > As most of you are aware, the master branch keeps getting broken by
> > checkins for various reasons.  Committers need to be more responsible
> > about their checkins but I don't think we can depend on that
> > happening.  There are various reasons.  The most obvious to me is that
> > granting committership is not based on code competency.
> > (And I don't think it should.)  Given that, we need to build a BVT
> > system to ensure that checkins do not break the branch.
> >
> > Here's my proposal:
> >
> > Existing components that we'll use.
> >
> > -          Citrix has contributed its testing to Apache.
> >
> > -          Apache CloudStack already has a simulator that's been used for
> > scale testing.
> >
> > -          Marvin
> >
> > -          DevCloud-kvm
> >
> > Work Proposal:
> >
> > -          Convert the Citrix testing into three phases:
> >
> > o   Setup
> >
> > o   Test
> >
> > o   Verify
> 
> I do build, package, setup, test, and verify in my integration test
> pipeline; a similar pipeline for developers, combined into easily runnable
> maven profiles/lifecycles/goals, would be great to have.
> 
> > -          Add a Setup and Verify phase for the simulator
> > -          Add all of the agent commands necessary for the simulator to
> > pass the testing.
> > -          Add a Setup and Verify phase for devCloud-kvm
> 
> The setups exist as configs in the marvin sandbox already. We just need to
> hook them up with mvn. For verify, there is a simple python script,
> testSetupSuccess.py, that already checks two things:
> 
> - system VMs are up
> - built-in templates are downloaded
> 
> This should be a good start IMO.
> 
> Currently devcloud-kvm is a bit hard to run from a developer environment.
> But it's great to have in a continuous environment backed by a KVM host
> with a Linux 3.0 kernel. Marcus has written up some pretty good
> documentation for this. If someone can help bring up that setup I can assist
> in bringing up the tests using my devcloud-ci scripts. I'm bringing up
> devcloud after Kelven 'alerted' us to the memory fix.
> 
> >
> > -          Add two more profiles to pom
> > o   Checkin-test-with-simulator: Runs the testing against the simulator
> > o   Checkin-test-with-devCloud: Runs the testing against devcloud
> >
> > -          All of the profiles will attempt to also check the merge list
> > that Chip has proposed.
> 
> > -          We will also change marvin to easily add zones with
> > actual hardware.  It will be based on a data driven document to do the
> > setup.
> This is currently partly doable using a properties file in the sandbox:
> $ python advanced_env.py -i xen.properties -o xen.cfg
> which gives out an advanced zone config for the properties specified.
> Is data-driven similar, or are you talking about the DSL that Edison
> mentioned at CCC12?
> 
> >
> > For a developer to checkin:
> >
> > -          S/he must write marvin tests for their feature and add
> > them to the BVT.
> I made some progress on this in my marvin-refactor branch and will collect
> my thoughts in an FS that I am drafting. The marvin tests are difficult to
> write in their current form. I'm following Chiradeep's recommendation of
> providing factories and, once those are baked, a DSL, so devs can quickly
> mock up tests against a simulator using the above-mentioned mvn profiles.
> 
> >
> > -          S/he must run the checkin tests to verify everything works.
> > -          If the feature contains a hardware/vpx component, simulation
> > must be added.
> >
> > At this point, everything is about the developer writing in their feature
> branch and merging in.
> >
> > On infrastructure side:
> >
> > -          We'll setup continuous BVT based on the simulator.
> I brought this up on the IRC and the list y'day, so +1 - happy to help
> >
> > -          I again push that we must use Gerrit to test the code
> > before it gets merged into the branch, but I'll leave that for someone
> > else to do.
> >
> > Let me know what you guys think.  I'll probably break out a bvt branch
> > to work on this.  Anyone want to join me?
> 
> Count me in!
> 
> --
> Prasanna.,
