Agreed, it is good to have end-to-end tests to define something as
"done". My thought was that these tests are often a low priority and
become technical debt. Seems like a good task for post-iteration work;
someone else mentioned technical debt as well. I wish I didn't ever
have any technical debt, but such a project hasn't come my way. :)

Erick

On Wed, Dec 22, 2010 at 3:48 PM, Ade Miller <[email protected]> wrote:
> Hi Erick,
>
>
>
> You can run some end-to-end integration tests as part of CI. You might want
> to automatically deploy your product to a test environment as part of this
> process. You may still require additional testing, automated or manual, but
> the implication is that I can always build my product and have a clear idea
> of what’s “done”. This means that I can start testing while I’m still
> working on other features during the iteration. There is very little waiting
> until the end to see if it all works.
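>
> As a rough illustration, a CI smoke step might look like the Python sketch
> below. Everything here is hypothetical: deploy.sh, the package path, and the
> /health URL are stand-ins for whatever your own pipeline provides.
>
> import subprocess
> import sys
> import urllib.request
>
> def deploy_to_test(package_path):
>     # Placeholder: invoke whatever deployment script your team uses.
>     subprocess.check_call(["./deploy.sh", package_path, "--env", "test"])
>
> def smoke_test(base_url):
>     # One cheap end-to-end check: the deployed app answers over HTTP.
>     with urllib.request.urlopen(base_url + "/health", timeout=30) as resp:
>         return resp.status == 200
>
> if __name__ == "__main__":
>     deploy_to_test("build/output/product.zip")
>     if not smoke_test("http://test.example.com"):
>         sys.exit("End-to-end smoke test failed; the build is not done.")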
>
>
>
> I’m not saying there aren’t going to be exceptions to this rule. There are
> some things my teams leave “until the end”, but in general anything you
> leave off your “done” list until the end is an unknown and so represents
> risk.
>
>
>
> Ade
>
>
>
>
>
> From: [email protected] [mailto:[email protected]]
> On Behalf Of Erick Thompson
> Sent: Wednesday, December 22, 2010 3:26 PM
> To: [email protected]
> Cc: [email protected]
> Subject: Re: After the Code is Done - Strategies to Ensure Quality
>
>
>
> What about integration testing? TDD and CI help you build solid code, but
> there is also a need to test everything together. Perhaps it is UI whitebox
> testing, or some other style of tests that doesn't tend to get developed in
> the normal course of an iteration.
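>
> (By UI testing I mean something like the Selenium sketch below; the URL and
> element ids are made up, but it shows the flavor of test I have in mind.)
>
> from selenium import webdriver
> from selenium.webdriver.common.by import By
>
> driver = webdriver.Firefox()
> try:
>     driver.get("http://test.example.com/login")
>     driver.find_element(By.ID, "username").send_keys("demo")
>     driver.find_element(By.ID, "password").send_keys("secret")
>     driver.find_element(By.ID, "submit").click()
>     # Did the whole stack work end to end?
>     assert "Dashboard" in driver.title
> finally:
>     driver.quit()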
>
>
>
> Erick
>
> Sent from my iPhone
>
> On Dec 22, 2010, at 1:11 PM, Anne Wax <[email protected]> wrote:
>
> Hi all and thank you.  There's lots to think about here, and I will reread
> later tonight.
>
>
>
> I wanted to answer some of your questions and key points.
>
>
>
> By end of an iteration I meant the end of a sprint, the end of a set of
> features that addresses a business need and takes longer than one sprint,
> and the point when a version of the software is released for production use
> (vs. for testing/pilot use). We'll see if these designations open up a whole
> new can of worms regarding what is 'agile' or not 'agile'.
>
>
>
> re: the real world and the ideal - it is not about giving up, but rather
> having processes that build in the ideal (refactor as you go) and recognize
> that sometimes you have to deliver first and finish afterwards. For example,
> on a larger project you might have multiple sprints to build a significant
> piece of functionality, but then, as you are finishing, you think of a
> better way to do it. The schedule does not allow the immediate refactoring,
> but it is needed. Or what about installation or training documentation that
> is hard to write as you go, before the product has been developed?
>
>
>
> I am coming from the perspective of a large enterprise system used by over
> 20 customers with set delivery schedules/dates. Yet we are using many agile
> practices such as Scrum, daily builds, teams of testers and developers
> working together on each sprint, unit and web testing, refactoring, etc. And
> we are trying to do a better job all the time.
>
>
>
> Thanks for all your input.  I enjoy these discussions and am often watching
> the conversations.
>
>
>
> Anne
>
> On Wed, Dec 22, 2010 at 7:52 AM, Ade Miller <[email protected]>
> wrote:
>
> Hmm...
>
>
>
> So the title of your message concerns me. Don't think in terms of coding
> and testing; think engineering. Ensuring quality at the end is doomed to
> failure.
>
>
>
> Assure functional quality during development. For example:
>
> - TDD or unit testing tends to get rid of lots of defects before they are
> even committed to the codebase (see the sketch below).
> - Continuous integration ensures that your product builds as a whole
> throughout development.
> - Having a clear definition of "done" means that acceptance testing can
> start on finished stories during the iteration.
> - An iteration/sprint should be long enough to engineer (build and test) a
> useful piece of functionality; don't run coding sprints followed by test
> sprints.
>
> All this adds up to not having a lot of quality debt at the end of an
> iteration.
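>
> To make the first point concrete, here is a minimal test-first sketch using
> Python's unittest (the Cart class and its API are invented for
> illustration). The tests are written before the code they exercise, so the
> defect never lands in the codebase:
>
> import unittest
>
> class Cart:
>     # Written only after the tests below existed and failed.
>     def __init__(self):
>         self._items = []
>
>     def add(self, price):
>         if price < 0:
>             raise ValueError("price must be non-negative")
>         self._items.append(price)
>
>     def total(self):
>         return sum(self._items)
>
> class CartTest(unittest.TestCase):
>     def test_total_sums_added_prices(self):
>         cart = Cart()
>         cart.add(3)
>         cart.add(4)
>         self.assertEqual(cart.total(), 7)
>
>     def test_negative_price_is_rejected(self):
>         self.assertRaises(ValueError, Cart().add, -1)
>
> if __name__ == "__main__":
>     unittest.main()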
>
>
>
> Specifically, on what you do at the end of an iteration: typically the team
> gets together and asks, "What went well?", "What went not so well?" and
> "What could we try to improve next iteration?" These may be related to the
> quality of the product but might equally concern any other aspect of
> building software. The team picks a couple of the top things they think they
> could improve and works on them during the next iteration; over time these
> small improvements add up. This is reflection and adaptation, but it applies
> to all aspects of what you do, not just product quality.
>
>
>
> Thanks,
>
> Ade
>
>
>
> ________________________________
>
> From: [email protected] [[email protected]] on
> behalf of Anne Wax [[email protected]]
> Sent: Wednesday, December 22, 2010 7:39 AM
> To: [email protected]
>
> Subject: After the Code is Done - Strategies to Ensure Quality
>
>
>
> What do you do at the end of a sprint or a release cycle to ensure quality?
>
>
>
> We've seen some blogs that talk about stop-reflect-adapt and
> review-reflect-repeat.  What do you all do when you have completed an
> iteration or a release cycle to ensure your product's excellence?  Do you
> step back to review and improve before moving on to the next cycle or
> project?
>
>
>
> What happens in "real life" and what is the ideal?
>
>
>
> Thank you,
>
>
>
> Anne
>
>
>
> http://www.agileweboperations.com/stop-reflect-adapt-the-3-steps-to-stop-writing-bad-code
>
>
>
> http://www.agilejournal.com/blogs/blogs/all-about-agile/704-how-to-implement-scrum-in-10-easy-steps-step-10-review-reflect-repeat
>