The customer was a bit skeptical about Scrum, but given the problems to date and the time pressures, he was willing to play the guinea pig.
During Sprint Zero, we got ready for our first Sprint. Scrum makes very clear what needs to be done before you can start Sprinting.
Scope, Quality, Cost, Time. These are the factors that any project manager has to have under control, and they are defined very precisely for each Sprint.
On one level, getting ready for Scrum means getting an initial definition of the project parameters. On the other level, it means educating the people involved (team and customer) so they know what is expected of them, what they can expect from the other players, and how they can influence the process.
We had many feature lists, bug lists, and wish lists, and absolutely no idea what was really important. So the first step was to define the total scope. I collected all the feature lists, wish lists, and bug lists I could find and sent them to the customer with instructions to consolidate the list and prioritize it on a scale of 1 to 3. The consolidated list had over 100 entries, and the customer himself realized that the list of Priority 1 items was so long that he had to introduce a Priority 0 to mark the items that were really important to him. This consolidated list became our product backlog.
During Sprint Zero, the team estimated all the Priority Zero items and a good chunk of the Priority One items (the twos and threes we left for a later date, which turned out to be never).
The team's first task was to take an hour or two and come up with a definition of done. Here's what they came up with:
- Tested (Unit Tests, Test Cases passed)
- Deployed on the Acceptance Test Server
- Documented (JavaDoc)
- Checked into Subversion
- Successful build
- Successful Checkstyle run
- Completed code review
- "Usable" for the User, approved by our usability guru
The team was also introduced to Scrum during this period, but that is a topic for another post.
How long should a Sprint last? Two weeks? Four weeks? Originally my thinking was true to the book: one month. But we had a serious trust problem with the customer and substantial pressure to get a release out by the end of the year, so the customer wanted to work closely with us.
Not knowing what a release really entailed, but knowing that a release took about 3 weeks (and with only 7 weeks left in 2006), I suggested we do a two-week Sprint with a dry-run release to exercise the process, then another two-week Sprint to get all the urgent functionality finished for the December release.
This idea was accepted. But the pressure from the customer's customer to get out a new release with bug fixes and missing functionality was intense, so we ended up installing our dry-run release as well. This proved to be an important learning experience.
In our case, cost is defined by the number of developers assigned to the project and the duration of the Sprint: summed over your team, Availability × Duration × Daily Rate = Cost. Actually, it's a cost ceiling. Under Scrum there's not much overtime, so the time box puts an upper limit on the costs.
The actual cost will be lower, because people provide support, get sick, take a day off, get called urgently into another project, whatever. (Real availability of the development staff for actual development work can be a very hot topic with the product owner).
Communicating the upper limit proved to be a very good trust building measure. Our actual costs were always a bit lower, so the customer was always pleasantly surprised by the actual invoice.
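The ceiling calculation above is simple enough to sketch in a few lines. Here is a minimal illustration in Python; the team members, availability fractions, and daily rates are hypothetical, not figures from the actual project:

```python
def sprint_cost_ceiling(team, sprint_days):
    """Upper bound on Sprint cost: sum over the team of
    availability * duration * daily rate."""
    return sum(availability * sprint_days * daily_rate
               for availability, daily_rate in team)

# (availability as a fraction of full time, daily rate) -- made-up numbers
team = [
    (1.0, 800),   # full-time developer
    (0.8, 800),   # developer also doing support
    (0.5, 1000),  # senior developer shared with another project
]

# Ceiling for a two-week Sprint (10 working days)
print(sprint_cost_ceiling(team, sprint_days=10))  # -> 19400.0
```

The actual invoice comes in under this number whenever real availability drops below the planned fractions, which is exactly the pleasant surprise described above.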
Once we had the team prepared, the product backlog in place, and agreement on the Sprint Duration, we were ready for our first planning meeting.