Metrics in an Agile World
Summary
James Shore and Rob Myers help you examine the role of metrics on Agile teams. They take a broad survey of metrics being used on Agile projects, both traditional and innovative, and look at the value they offer and the dangers they pose to the success of the team. They also look at how the simple act of measuring can itself be harmful, and when it is well justified.
Bio
Rob Myers is lead instructor and co-founder of Agile Institute. Rob has been training and coaching teams in Agile practices and object-oriented programming since 1999. James Shore is an XP/Agile consultant and practitioner who has been leading Agile teams in success and failure since 1999.
About the conference
Agile 2009 is an exciting international industry conference that presents the latest techniques, technologies, attitudes and first-hand experience, from both a management and development perspective, for successful Agile software development.
Community comments
Re: Metrics in an Agile World
by Ameer Hussein Gaafar
If there was a point, I didn't get it! The evil side of metrics is neither new nor specific to software development. For instance, measuring students' performance through exams can lead some of them to study for the assessment rather than for learning. Does that mean schools should stop trying to measure students' performance?
The title led me to believe that the presentation would be about the metrics that fit well with an Agile mindset and how to use them. Did I expect too much?
Re: Metrics in an Agile World
by Rob Myers
You did not expect too much, and I think it's a fair criticism that we didn't spend as much time on possible solutions to the metrics dysfunctions as we would have liked. Jim summarized the approach nicely, but we would have enjoyed examining many of the metrics that the audience brought up, looking for alternatives or for ways to use the "measure up" approach to mitigate dysfunction.
Moving performance metrics up to the level where individuals (or the teams measured) cannot directly alter the measure is the basic approach. Anonymous reporting and aggregation of the values prevent that dysfunction and turn a motivational metric into an informational one. It can still be used to measure performance, but at a higher level, and it then resembles something we would really prefer to optimize, rather than setting up an opportunity for local optimization (gaming the system).
We're still exploring these techniques, and I'm hoping to shift the balance of the talk away from so many examples of dysfunction and toward practical techniques for removing it.