Continuous improvement is an essential aspect of agile development. It doesn't happen automatically. It involves each team, no matter how smoothly things are running, taking time every week to identify the most important area for improvement, discussing and making some change, and tracking what happens to see if the change helped or hurt.
In agile, the meeting or portion of a meeting to do this is called a retrospective (The Agile Samurai, section 10.5). Retrospectives can be hard to do well.
Agile retrospectives are a good way to test how well you, as an individual, understand and can apply agile thinking (values, principles, common practices) to real team and product development problems. Agile thinking means:
- detecting common threats to product and team development
- proposing changes to address those threats that are SMART (specific, measurable, achievable, relevant, time-bound) and compatible with agile values.
Therefore, every week, every team member will submit a new retrospective, as described below. I will enter comments and scores before the meeting, if possible. These reports will be part of our coaching meetings.
Scores are from 0 (no entry) to 5 (really impressive insight). Expect terrible scores early on, be happy if you get a 4, post on Facebook if you get a 5. Your top 4 retrospective scores at the end of the quarter will be added to your team score.
You may think AngularJS coding or configuring push notifications is hard, but I have seen all students struggle far more with retrospective analysis than anything else. It's a form of critical reasoning that requires frank self-assessment and clear evidence-based analysis.
Retrospective Report Process
There are two parts to the retrospective reports:
- Team-generated and maintained cause-effect diagrams (below)
- A retrospective report (below) with individually-generated discussions of a relevant process change to address a team development issue in that causal analysis
Team Cause-Effect Diagrams
Critical to useful retrospection and improvement is understanding what's going on. If your team keeps releasing buggy code, saying "be more careful" is useless. You need to figure out:
- What the real problems are. A real problem causes visible damage to the project. It affects a metric that matters.
- Why this is happening. What are the causes and the causes of the causes? Especially focus on ones that are internal to the team. You can't change a client's schedule, but you can adapt to it.
- From the causal analysis, where can a simple change in process be made to avoid, reduce, or mitigate the problem?
There's a really nice description of how to do a good causal analysis here (PDF).
Every team will create and maintain team cause-effect diagrams in the Google Slides file in their Retrospectives folder on Google.
The team decides when to generate a new diagram slide. You should create a new diagram when you:
- Focus on a new primary problem. If one diagram was about low velocity, but now the focus is buggy code, make a new diagram.
- Want to redo an analysis, removing items referenced in previous retrospectives.
When making a new diagram, feel free to copy from a previous one.
No individual diagrams. The team needs to agree on the causal analysis. If some people don't think X is causing Y, then evidence needs to be gathered and reported on. The team does not need to agree on what problem to attack first, or what solution to try. That discussion can happen in coaching.
The Retrospective Spreadsheet
To structure your reports, we use the 394 Retrospective Workbook.
Every team has their own workbook in their Retrospectives folder on Google. Each workbook has a sheet for each week's report. Each sheet has a column for each member.
Once you choose a column, use the same one every week.
The focus is on "given this causal analysis, here's a simple agile-friendly relevant change to try."
In the causal diagram row, put in a clickable link to the specific slide with the diagram you're talking about.
In the change row, specify which of the changes in the diagram you're talking about.
In the issue row, specify which of the metrics that matter you're focusing on. Don't repeat the causal analysis in the spreadsheet. Focus on a proposed change, e.g., "move the meeting", what that change might affect and why, how you would measure, and why the change is "agile-friendly."
Many process changes look good on paper but have very bad results, and are definitely not agile-friendly:
- "Give penalty points to people who are more than five minutes late to the swarm." Blaming is not agile-friendly. It is divisive, i.e., hurts team morale. It doesn't address any underlying causality.
- "Document all code changes on a Google doc to reduce review time at meetings." Documentation is not agile's first choice. Redundant work that tools like git already collect is especially not agile-friendly. Creating documents can take more more total time from the team than was spent in reviews at the meetings.
Filling in the retrospective spreadsheet
- Use the tabs in the spreadsheet provided in each team's retrospectives folder.
- Each team member individually fills in his or her column.
- Put in a link to the cause-effect diagram you're talking about.
- Identify the process change in that diagram you're talking about. It can be the same as or different from another team member's, but every entry should be in your own words.
- Pick just one change.
- Do not delete or modify previous worksheets. I need those for review.
- The team sends an email to my Northwestern address:
- Subject: 394 team name Retrospective
- CC: the team
- Include a link to the new worksheet. Use the link for the current report, not for the whole spreadsheet or some previous report.
- You can add other comments in the email but they aren't necessary.
- The deadline for the email is 24 hours before our weekly meeting time.
A few ground rules:
- Use your own words. No copying and pasting from anyone else, or from yourself in a previous report.
- Exception: the first two weeks can be done by the team or subgroups in the team. If team members A, B, and C have a group answer, put it in the column for A, and put "same as A" in the columns for B and C.
- You can focus on the same issue or proposed change as someone else, or yourself in a previous report, but there should be substantively new insights in each new retrospective.
- Be specific. The most common critiques I give are about vague answers. Don't say there was buggy code, say what the bug was.
- Be agile. This is a test of how well you can adapt and apply agile ideas to specific situations.
- Be brief. If you go past 3 sentences in a box, you've written too much.
- Keep a journal. During the week, jot down things you observe that seem relevant, e.g., an hour wasted in a swarm, a communication failure with the client. Then review and pick one to discuss for your retrospective.
An Example Analysis
People are late for meetings.
Attendance is the number one issue most teams raise in classes with team projects. And probably the most common fix I see teams propose is some kind of "shaming" for lateness, e.g., keeping a lateness board, making late-comers buy pizza, making late-comers wear a "late" hat, etc.
The most important thing to realize is punctuality / attendance is not a problem in and of itself.
Not doing a process is not a problem. Problems are failures to achieve goals. A process per se should never be a goal.
Imagine a team whose members were always late to meetings, but that produced high quality code week after week and enjoyed doing it. What effect do you think those proposed process changes would have on such a team?
Further analysis is required.
- Extend the causal chain, in both directions.
- What real problem, such as low velocity, poor code quality, or unhappy client, might lateness to meetings lead to?
- What's causing the problem? Are the late-comers coming from farther away? Do they have a class or meeting just before the team meeting that they are not able to leave when it runs over? Are they coming in by mass transit?
- How bad is the problem really? What's the actual lost time, aggregated? Is it randomly distributed among members and days, or does it vary by day and person? Was there one really bad day, and otherwise lateness is pretty minor?
- What actual level of harm is being done to a metric that matters? If you think it's causing coding tasks to be overlooked, measure that. How often is that happening, and what's a realistic target level?
- If the above analysis validates the concern, propose a process change that fits the causal chain.
- "Be on time" is not a change. It's just a restatement of the goal.
- "Make late-comers pay $1 for every 5 minutes late" is a harmful change, if the cause of the problem is something the late-comer can't control. (In fact, it's hard to think of any situation where that's a good process change.)
- Measure the process change. Don't try for perfection. You will never achieve perfect on-time attendance. What's a reasonable goal that the team can agree on?
- Measure the costs. Any process will have some negative effects, e.g., more effort, more bookkeeping, possible team friction. What's an acceptable cost to pay, if the target benefit can be reached?
- Measure how the metrics that matter change. This is the only measurement that really matters. If attendance improves but coding tasks are still being missed, then attendance was clearly not the sole cause, and perhaps not even relevant.
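The "how bad is it really?" questions above are easy to answer if someone on the team logs lateness during the week. Here is a minimal sketch, using a hypothetical log of (member, day, minutes-late) entries invented for illustration, that answers the questions directly: total lost time, variation by person, and whether one bad day dominates.

```python
from collections import defaultdict

# Hypothetical lateness log: (member, day, minutes late).
# In practice, jot these observations down in your journal.
log = [
    ("ann", "mon", 2), ("bob", "mon", 20),
    ("ann", "wed", 3), ("bob", "wed", 25),
    ("cal", "wed", 40),
]

def lateness_summary(log):
    """Aggregate lost minutes in total, by member, and by day."""
    by_member = defaultdict(int)
    by_day = defaultdict(int)
    for member, day, minutes in log:
        by_member[member] += minutes
        by_day[day] += minutes
    total = sum(minutes for _, _, minutes in log)
    return total, dict(by_member), dict(by_day)

total, by_member, by_day = lateness_summary(log)
print(total)      # aggregate lost minutes
print(by_member)  # does it vary by person?
print(by_day)     # was there one really bad day?
```

Even this much data changes the discussion: if one member accounts for most of the minutes, or one day does, the causal chain and the fix look very different than if lateness is spread evenly.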