
A failed Sprint Review

Do your sprints look anything like this:
  • In Sprint Planning 1, the team commits to implementing 8 stories and fixing 2 bugs (open issues identified in the previous Sprint Review).
  • During the week (our sprints last one week), everything seems very quiet from the Product Owner's perspective. The virtual task board shows no movement. No builds are forthcoming. No calls or questions either.
  • On the Friday before the Sprint Review, there is some pickup in activity.
  • Monday, one hour before the Sprint Review, a build becomes available. Theoretically, everything is done. I download the build and start to test the functions. Crash. Try again. Another crash. Hmm. What can I test without causing a crash?
  • In the Sprint Review, we go through the 10 stories and bugs. The Review lasted two and a half hours for a one-week sprint. The bugs had been fixed. Two of the eight stories were implemented and done. The remaining six stories were not done and had to go back on the backlog (and yes, I did put them into the next sprint). The new crashes also went onto the backlog.
This had been our pattern for the last several weeks. As Product Owner for this project, I found this very frustrating, particularly because I had been asking for continuous delivery for several sprints and the team still wasn't doing it. As a Scrum Trainer, I found it very embarrassing to be having problems getting things done. My team was embarrassed too. They are a good team. We're better than this. How did we get into this trap? More importantly, how do we get out?

Step 1 is listening (see From a Blame Culture to Fearless Trust). I believe that when people really listen to each other, great things can happen. My personal challenge was putting aside my frustration so we could listen and discuss effectively. Juliano and I sat down on Skype and Hangout and talked and listened to each other. We identified several things:
  1. I was concerned that I was doing too much of the thinking about how to solve the problems.
  2. Juliano felt that giving the team responsibility for creating the how-to-demo section had been a good thing, as it encouraged them to really think about the issues.
  3. Our definition of Done has included continuous integration with a Jenkins build since the beginning of the project, but for various reasons we had never actually implemented it, so creating and publishing a build was a lot of work (see Sample Definition of Done).
  4. The team was doing overtime every sprint.
  5. The team felt pressure to deliver as many stories as possible, so they were committing to the maximum they could. As a result, they had no time to work on quality issues, like setting up the build server or responding to feedback that arrived during the sprint.
  6. The definition of Done was being applied at the Sprint level, not to each individual story, so all the stories were getting delivered shortly before the end of the sprint, with no time for feedback (see It's Done! (or is it?)).
These last two items in particular were causing the long Sprint Reviews. After discussing these points, we agreed to emphasize quality over quantity. Now we had a shared commitment that getting some things really done is more important than getting everything sort-of "done."

As a result, we put one new item in the Product Backlog.
"As Product Owner, I would like a new build when every feature is completed, so I can confirm that stories are really done as soon as possible."
We prioritized this as number 1 for the next sprint, even higher than the unfinished work from the previous sprint. The team focused first on getting the build server working, then on getting stories done one after the other, in the order prioritized. They also reviewed the definition of Done and how they could better ensure that things were really done.
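We didn't publish our build configuration with this post, but to make the mechanics concrete, here is a minimal sketch of how "a new build when every feature is completed" can be wired up: a small script (or a repository webhook) asks the build server for a fresh build whenever a feature is merged. This is a hypothetical Python example against the standard Jenkins remote API; the server URL, job name, and credentials are placeholders, not our actual setup.

```python
# Hypothetical sketch: queue a new Jenkins build of the client whenever a
# feature is merged. URL, job name, and credentials are placeholders.
import requests

JENKINS_URL = "https://jenkins.example.com"   # placeholder
JOB_NAME = "client-build"                     # placeholder
USER = "build-bot"                            # placeholder
API_TOKEN = "changeme"                        # placeholder Jenkins API token


def trigger_build() -> None:
    """Ask Jenkins to queue a new build of the client job."""
    resp = requests.post(
        f"{JENKINS_URL}/job/{JOB_NAME}/build",
        auth=(USER, API_TOKEN),  # API-token auth; some setups also need a CSRF crumb
        timeout=30,
    )
    # Jenkins answers 201 Created; the Location header points at the queue item.
    resp.raise_for_status()
    print("Build queued:", resp.headers.get("Location"))


if __name__ == "__main__":
    trigger_build()
```

In practice the same effect is usually achieved by letting Jenkins watch the repository directly, so that every merge triggers a build without any extra scripting; the point is simply that a finished feature, not the end of the sprint, is what produces a build.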

As I write this, tomorrow is the Sprint Review, so this is a work in progress. Some changes are already visible:

  1. The features have been coming one by one, which makes it easier to review them and give feedback. The team has delivered three releases, and another will probably come before the review. So we have quadrupled our output of potentially releasable software.
  2. We discover problems more quickly on the way to Done: "Hey, this story has no 'how-to-demo' defined. How do we know if it works?" or "This doesn't install on my iPad 4."
  3. We are thinking much harder about how to ensure that stories are really Done. We use TargetProcess to manage our work, so we are exploring how to use its test suite and task creation plugins to automate the routine parts of identifying what has to happen to get a story to Done (see the sketch after this list).
  4. The quality of the delivered stories has gotten much better. It seems that taking off the pressure to deliver quantity enables the team to do better work.
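We haven't settled on the automation yet, but the idea is to attach the same Definition-of-Done checklist to every new story so that nobody has to remember it by hand. Here is a hypothetical sketch, assuming the Targetprocess v1 REST endpoint for creating tasks; the account URL, access token, story id, and checklist items are placeholders rather than our real configuration.

```python
# Hypothetical sketch: add a standard Definition-of-Done checklist as tasks
# on a user story via the Targetprocess REST API (v1). The account URL,
# token, story id, and task names below are placeholders.
import requests

TP_URL = "https://example.tpondemand.com"  # placeholder account
TOKEN = "changeme"                         # placeholder access token

DOD_TASKS = [                              # example checklist, not our actual DoD
    "How-to-demo written",
    "Automated tests green on the build server",
    "Build published and installed on a test device",
    "Product Owner feedback requested",
]


def add_dod_tasks(story_id: int) -> None:
    """Create one task per Definition-of-Done item on the given user story."""
    for name in DOD_TASKS:
        resp = requests.post(
            f"{TP_URL}/api/v1/Tasks",
            params={"access_token": TOKEN},
            json={"Name": name, "UserStory": {"Id": story_id}},
            timeout=30,
        )
        resp.raise_for_status()


if __name__ == "__main__":
    add_dod_tasks(12345)  # placeholder story id
```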
Is really Done functionality being produced for every story and every release? Has the team gotten faster? Is this approach sustainable? Let's just say: we still need to do retrospectives, there is still room for improvement, and for some questions only time will tell. But as Product Owner, I am really pleased with what I have seen up till now, and I look forward to the Sprint Review with a smile.

