Sunday, December 12, 2010

"Where does the QA fit into Scrum" - common suboptimal implementation of Scrum

Here is a quite common and very suboptimal implementation of Scrum ("Where does the QA fit into Agile"). I have seen this multiple times in multivendor projects where "testing" is externalized to one company and development to yet another.

I commented the post in the following way:

Sorry to say, but this is a common suboptimal "implementation of Scrum"; some refer to it as "scrumfall" and some think it is not Scrum at all. You simply get feedback an iteration too late, and your software is never done when "exiting the development phase". A telltale smell of this implementation is a definition of done that says "ready to be tested".

Anyway, here is how to make it better:
  • Form a cross-functional team with developers AND testers in it - their definition of done is a "releasable product/project increment" that is tested
  • It might not make sense to treat analysis as an "iteration" - analysis is a continuous flow

How does the tester then participate in the development sprint to get the items you highlighted done?
  • New feature testing
    • In sprint planning the testers help the team figure out the acceptance criteria for the incoming feature. They might even express those as executable acceptance tests, or they might do that within the sprint.
    • During the sprint, testers do exploratory testing to find the corner cases of the feature and see "how it fits into the whole".
    • In the demonstration, testers help the product owner see that the acceptance criteria pass, and they are also there to make sure that if something is not done, it won't be demoed
  • Feature integration testing
    • Integration in the large: testers may work on emulating the external systems or testing these cases (by automating the regression suite)
  • Regression testing
    • This is the "product" that the testers build step by step by automating the acceptance tests
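As a minimal sketch of what "expressing acceptance criteria as an executable acceptance test" could look like (the feature, the function names, and the criteria here are all hypothetical, not from any real project - teams of this era often used tools like FitNesse or Robot Framework for the same purpose):

```python
def apply_discount(order_total, customer_is_member):
    """Hypothetical production code under test: members get 10% off
    on orders of 100 or more, everyone else pays full price."""
    if customer_is_member and order_total >= 100:
        return order_total * 0.9
    return order_total

# Acceptance criteria agreed in sprint planning, written as executable
# checks that can run in every build:
def test_member_gets_discount_at_threshold():
    assert apply_discount(100, customer_is_member=True) == 90.0

def test_non_member_pays_full_price():
    assert apply_discount(100, customer_is_member=False) == 100

def test_small_order_has_no_discount():
    assert apply_discount(50, customer_is_member=True) == 50
```

The point is not the tooling but the conversation: the test names record the criteria the team and the product owner agreed on before development started.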
It might sound like testers need to be developers. Testing, however, is a collaborative activity taken on by the whole team. If testers need something to be developed, the developers will develop it. Pairing helps a lot here. Sometimes testers even help the developers with unit testing, because a tester has a different point of view, which is usually very beneficial.
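"Emulating the external systems" mentioned above is one concrete thing developers can build for testers. A hedged sketch, assuming a hypothetical third-party payment gateway with a single charge() call (the names are illustrative, not a real vendor API):

```python
class PaymentGatewayStub:
    """Stub emulating an external payment gateway so integration tests
    can run without the real third-party system. Records charges
    instead of calling out over the network."""

    def __init__(self):
        self.charges = []

    def charge(self, account, amount):
        if amount <= 0:
            return {"status": "rejected", "reason": "invalid amount"}
        self.charges.append((account, amount))
        return {"status": "accepted"}

def checkout(gateway, account, amount):
    """Hypothetical code under test - it talks to whichever gateway
    it is given, real or stubbed."""
    result = gateway.charge(account, amount)
    return result["status"] == "accepted"

# The stub lets the team assert both the outcome and the interaction:
gateway = PaymentGatewayStub()
assert checkout(gateway, "acct-1", 25) is True
assert checkout(gateway, "acct-1", 0) is False
assert gateway.charges == [("acct-1", 25)]
```

Building and maintaining such stubs is a natural pairing task between a developer and a tester.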

Many teams (including ours) have successfully managed to build security and performance testing suites that can be executed as part of continuous integration/deployment, and have continuous monitoring in place (to improve these suites). Still, there is testing (usability comes to mind) that might be very hard to fit into sprints .. even by taking one small piece at a time.

Edit: Ron Jeffries has also written a reply on his blog.
