Continuous improvement is an essential aspect of agile development. It doesn't happen automatically. It involves each team, no matter how smoothly things are running, taking time every week to identify the most important area for improvement, discussing and making some change, and tracking what happens to see if the change helped or hurt.
In agile, the meeting or portion of a meeting to do this is called a retrospective (The Agile Samurai, section 10.5). Retrospectives can be hard to do well.
Agile retrospectives are a good way to test how well you, as an individual, understand and can apply agile thinking (values, principles, common practices) to real team and product development problems. Agile thinking means
- detecting common threats to product and team development
- proposing changes to address those threats that are SMART and compatible with agile values.
Therefore, every week, every team member will submit a new retrospective, as described below. I will enter comments and scores before the meeting, if possible. These reports will be part of our coaching meetings.
Scores are from 0 (no entry) to 5 (really impressive insight). Expect terrible scores early on. Be happy if you get a 4, post on Facebook if you get a 5. The sum of your top 4 retrospective scores at the end of the quarter will be added to your team score.
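The "top 4" scoring rule above is easy to compute. A minimal sketch (Python used only for illustration; the list of weekly scores is hypothetical):

```python
def retrospective_bonus(scores):
    """Sum of a member's four highest weekly retrospective scores (0-5 each)."""
    return sum(sorted(scores, reverse=True)[:4])

# Hypothetical example: ten weeks of scores for one team member
weekly_scores = [0, 2, 3, 3, 4, 2, 5, 3, 4, 1]
print(retrospective_bonus(weekly_scores))  # top four are 5, 4, 4, 3 -> 16
```

Note that only your best four weeks count, so one bad early retrospective does not hurt your final total.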
You may think AngularJS coding or configuring push notifications is hard, but I have seen all students struggle far more with retrospective analysis than anything else. It's a form of critical reasoning that requires frank self-assessment and clear evidence-based analysis.
Retrospective Report Process
Every week the team submits an updated report, using the retrospective workbook described below.
An email with a link to the new sheet in the workbook is due the day before our weekly coaching meeting.
I try to review, comment on, and score your reports before we meet.
Setting Up Your Retrospective Workbook
Every team has a retrospective workbook on Google Drive. See Canvas for the link.
Put your names on the first row of the first page of the workbook. This will populate the column headings for the other pages in the workbook. This is the only editable section of the first page.
Each week, the team fills in a new page in the workbook. Each team member individually fills in his or her column, answering the questions posed for each row.
- Pick just one problem and one process change to address it.
- Pick what you think is the most important issue and change. It can be the same as or different from another team member's.
- Write your own entries, even when agreeing with someone else. Do not copy and paste. Do not say "same as column D."
Do not delete or modify previous pages. I need those for review.
The Progress row is special. One entry here for the entire team.
When the report is ready, the team sends an email to my Northwestern address:
- Subject: 394 team name Retrospective
- CC: the team
- Include a link to the new page -- not the whole workbook or some previous page.
- You can add other comments in the email but they aren't necessary.
Due: 24 hours before our coaching meeting. Earlier is always fine. Just email when ready, but don't forget to email.
How to Do Well
The focus is on specific issues (metrics that matter), that have arisen for specific reasons, with a relevant, agile-friendly change to try, and specific ways to tell if the change has a positive net benefit.
Many process changes look good on paper but have very bad results, and are definitely not agile-friendly:
- "Give penalty points to people who are more than five minutes late to the swarm." Blaming is not agile-friendly. It is divisive, i.e., hurts team morale. It doesn't address any underlying causality.
- "Document all code changes on a Google doc to reduce review time at meetings." Documentation is not agile's first choice. Why repeat work that tools like git already handle. Creating documents can take more more total time from the team than was spent in reviews at the meetings.
- Be specific. The most common critiques I give are about vague answers. Don't say there was buggy code, say what the bug was.
- Be agile. This is a test of how well you can adapt and apply agile ideas to specific situations.
- Be brief. Many answers can be just a phrase or two, such as common issues like "low bus factor", and common specific agile practices, like "daily standup". See the Agile Themes slides for some ideas.
- Many ideas are used by agile but not specific to agile. You'll find these when hunting agile sites for things other developers have tried. An example here is dot voting.
- Be careful! Not all agile ideas are specific. "Slice" for example is too vague to be operational. More detail is needed.
- Be careful! Agile is trendy. Many articles are written by people who re-label old ideas as agile. Look at the discussion commentary, if the article has any, and search to see if the same idea is described on other agile sites.
- Write analyses backwards in time. "A was caused by B, which was caused by C, which was caused by ..."
- Keep a journal. During the week, jot down things you observe that seem relevant, e.g., an hour wasted in a swarm, a communication failure with the client. Then review and pick one to discuss for your retrospective.
An Example Analysis
People are late for meetings.
Attendance is the number one issue most teams raise in classes with team projects. And probably the most common fix I see teams propose is some kind of "shaming" for lateness, e.g., keeping a lateness board, making late-comers buy pizza, making late-comers wear a "late" hat, etc.
The most important thing to realize is punctuality / attendance is not a problem in and of itself.
Not doing a process is not a problem. Problems are failures to achieve goals. A process per se should never be a goal.
Imagine a team whose members were always late to meetings, but that produced high quality code week after week and enjoyed doing it. What effect do you think those proposed process changes would have on such a team?
Further analysis is required.
- Extend the causal chain, in both directions.
- What real problem, such as low velocity, poor code quality, or unhappy client, might lateness to meetings lead to?
- What's causing the problem? Are the late-comers coming from farther away? Do they have a class or meeting just before the team meeting that they are not able to leave when it runs over? Are they coming in by mass transit?
- How bad is the problem really? What's the actual lost time aggregated? Is it randomly distributed among members and days, or does it vary by day and person? Was there one really bad day, and otherwise lateness is pretty minor?
- What actual level of harm is being done to a metric that matters? If you think it's causing coding tasks to be overlooked, measure that. How often is that happening, and what's a realistic target level?
- If the above analysis validates the concern, propose a process change that fits the causal chain.
- "Be on time" is not a change. It's just a restatement of the goal.
- "Make late-comers pay $1 for every 5 minutes late" is a harmful change, if the cause of the problem is something the late-comer can't control. (In fact, it's hard to think of any situation where that's a good process change.)
- Measure the process change. Don't try for perfection. You will never achieve perfect on-time attendance. What's a reasonable goal that the team can agree on?
- Measure the costs. Any process will have some negative effects, e.g., more effort, more bookkeeping, possible team friction. What's an acceptable cost to pay, if the target benefit can be reached?
- Measure how the metrics that matter change. This is the only measurement that really matters. If attendance improves but coding tasks are still being missed, then clearly attendance was not the sole cause, and perhaps not even relevant.
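To make "how bad is it really?" concrete, the aggregate and per-person/per-day breakdown can be computed directly from a simple lateness journal. A minimal sketch (the log entries are invented, purely for illustration):

```python
from collections import defaultdict

# Hypothetical journal entries: (member, meeting day, minutes late)
lateness_log = [
    ("Ana", "Mon", 12), ("Ben", "Mon", 3),
    ("Ana", "Wed", 2),  ("Cam", "Wed", 25),
    ("Ben", "Fri", 4),
]

# Aggregate lost time, then break it down two ways
total = sum(minutes for _, _, minutes in lateness_log)
by_member = defaultdict(int)
by_day = defaultdict(int)
for member, day, minutes in lateness_log:
    by_member[member] += minutes
    by_day[day] += minutes

print(f"Total minutes late: {total}")     # is the aggregate even worth fixing?
print(f"By member: {dict(by_member)}")    # one chronic late-comer, or everyone?
print(f"By day: {dict(by_day)}")          # one really bad day, or every meeting?
```

In this invented data, one person on one day accounts for over half the lost time, which would point toward a cause specific to that person's Wednesday schedule rather than a team-wide discipline problem.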