
Laurie Williams: Getting to Comparative Agility
Interview with Laurie Williams by Shane Hastie on Dec 22, 2010 |
16:04

Bio: Laurie Williams is an Associate Professor in the Computer Science Department at NC State University. Her research focuses on software security, particularly in relation to healthcare IT; agile software development practices and processes; software reliability, software testing and analysis; open source software development; and broadening participation and increasing retention in computer science.

The Agile 2010 conference is created by a production team of highly respected Agile experts and practitioners to present a program that spans the whole spectrum of agile practice. The Agile conference series is organized as a program of the Agile Alliance, a non-profit organization dedicated to uncovering better ways of developing software, inspired by the values and principles of the Manifesto for Agile Software Development.

   

1. Laurie, would you mind very briefly introducing yourself for our viewers?

I’m Laurie Williams. My current position is Associate Professor of Computer Science at North Carolina State University. I also do Agile coaching and training, particularly in the Research Triangle Park area of North Carolina. We have a lot of companies there, and working with them on coaching and training helps me be a better researcher and teacher, I believe. Before I worked in academia, I worked at IBM, starting out as an engineer and eventually moving into software development.

   

2. Quite a while back, in terms of the Agile world, you did a lot of the early research into pair programming.

I worked at IBM and then I got my PhD, and my dissertation research involved pair programming. I got started in that in 1998, which is really when Extreme Programming was emerging, and pair programming emerged as a practice of Extreme Programming. People immediately had a bias that pair programming must take twice as much effort. We ran an extensive 16-week study with students on pair programming, and what we found was that pair programmers do take a little bit of extra time, maybe 10% longer in total effort. That would mean that if someone working alone might spend 10 hours on a task, a pair might spend a little more than 5 hours of elapsed time. So it takes a little longer in total effort, but they produce higher-quality code. That was the net of that study.
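The effort arithmetic behind that finding can be sketched as follows; the numbers are illustrative, not taken from the study itself:

```python
# Sketch of the pair-programming effort arithmetic (illustrative numbers only).
solo_hours = 10.0                                # elapsed time for one developer working alone
overhead = 0.10                                  # ~10% extra total effort reported for pairs
pair_total_effort = solo_hours * (1 + overhead)  # combined person-hours: about 11
pair_elapsed = pair_total_effort / 2             # wall-clock time for the pair: about 5.5 hours
```

So the pair finishes in roughly half the calendar time at the cost of slightly more total person-hours, which is the trade-off the study quantified.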

   

3. And that ended up in a book?

Yes, it did. I ended up writing a book called Pair Programming Illuminated, which embodied that research as well as other experiences with pair programmers.

   

4. Recently, you’ve been doing quite a lot of research into the relevance of the principles of the Agile Manifesto. Could you give us a little bit of background on that? What’s happened there?

The Agile Manifesto and the principles behind the Manifesto were authored in February 2001, so we’re coming up on 10 years. What I wanted to see is what people think about the principles now that 10 years have gone by, and how relevant they are based upon what’s happened. I ran an extensive survey on SurveyMonkey and had 335 responses from all over the world, primarily Europe and the US, but there were respondents from all over the world, from a variety of domains and sizes of company. I asked two sets of questions: one looked specifically at the Agile principles that were authored in 2001 and how important respondents thought they were.

Also I had a list of all of the Agile practices that I could think of and I asked if they used those practices. The net of the survey was really an assessment of the importance of the principles and the practices.

   

5. Some interesting results or things you can share with us about what were people saying?

All in all, I’d say there was a lot of support for the principles, so the authors did a good job 10 years ago of embodying the most important things about Agile. On a scale of 1 to 5, I think the least supported principle scored 3.8, so even that was still pretty well supported. The principles that turned out to be the most important were those that involved delivering software to customers frequently in order to get feedback. There were two principles that involved that. One is "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software." That was highly ranked. Tied with it was "Deliver working software frequently, from a couple of weeks to a couple of months, with a preference for the shorter timescale." Those two were the highest.

Then there was one tied for last place, which was "The best architectures, requirements, and designs emerge from self-organizing teams." That was the least supported. Just to highlight the feedback we got on the principles: some people felt that the principles were biased towards developers. A number of the principles used the word "developer," but following up with the original authors, they meant to say the whole team, not just the developers - developers, testers, UI, database. So there was some pushback on the principles because of that. There has also been a bit of emergence of the concept of Kanban in Agile.

Kanban would say that rather than having strict iteration deadlines, where everything is due on a certain date, you instead always just reach back and pull something into the current work, with a maximum number of things that can be worked on at once. Because Kanban doesn’t have the notion of an iteration, there was a bit of a lack of support for some of the principles, because they connote iterations. People also gave feedback that there was a need for more vision before the product begins: more of a release-plan vision, rather than just an emerging architecture and requirements.
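The pull mechanism described above can be sketched in a few lines; this is a hypothetical illustration, with names of my own choosing, not a tool from the interview:

```python
# Minimal sketch of a Kanban-style pull system with a work-in-progress (WIP) limit.
from collections import deque

class KanbanBoard:
    def __init__(self, wip_limit):
        self.wip_limit = wip_limit   # maximum number of items in progress at once
        self.backlog = deque()       # ordered queue of waiting work
        self.in_progress = []

    def add(self, item):
        self.backlog.append(item)

    def pull(self):
        """Pull the next item only if the WIP limit allows it."""
        if len(self.in_progress) < self.wip_limit and self.backlog:
            item = self.backlog.popleft()
            self.in_progress.append(item)
            return item
        return None                  # at the limit (or nothing waiting): no pull

    def finish(self, item):
        self.in_progress.remove(item)

board = KanbanBoard(wip_limit=2)
for story in ["login page", "search API", "billing report"]:
    board.add(story)
board.pull()   # "login page"
board.pull()   # "search API"
board.pull()   # None: WIP limit of 2 reached until something finishes
```

The point is that there is no iteration boundary anywhere in the model; work flows continuously, constrained only by the WIP limit, which is exactly why some of the iteration-flavored principles sit awkwardly with it.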

The principles don’t really say you should have a vision, so people mentioned that as well. There is not as much mention of high-quality software as some people wanted. They were thinking that you could just continue to develop and develop, and it could be junk, but as long as you were delivering it constantly you would be fulfilling the principles. They wanted more of a quality statement, something like "a potentially shippable product, all the time," and that’s the main thing. Another point: the principles mention face-to-face communication, and currently there are a lot of distributed teams, so face-to-face is a little impractical.

There is more focus now on synchronous communication, where people are actually talking on the phone or doing instant messaging so they can converse back and forth, even if not face-to-face. There is a principle about maintaining a pace indefinitely, and if you just have iteration after iteration, that constant (even if realistic) pressure means people can get burnt out. So maybe there is a need for some downtime every now and then. I guess those are the highlights.

   

6. On the practices side anything?

Some of the higher-ranking practices... Continuous integration was one of the highest, so developers are constantly putting their code into the codebase. Short iterations ranked high. The notion of "done" criteria was actually at the top. "Done" criteria answer the question, "For a feature to be considered done, what do we need to do?" We need to pass the acceptance tests, we need to have automated unit tests, we need to have testing complete, we need to fix defects. What does the team need to be able to demonstrate for a feature to be considered done? It was surprising that this practice came out at the top of the list, because it wasn’t around before.

Ten years ago no one had that notion. But I think that through the years the idea emerged that if you just keep putting out feature after feature without considering quality, you could be in trouble. Once it became clear that you need good "done" criteria, it shot right to the top of the list. That was high, automated testing was high, iteration reviews and demonstrations were high, and having potentially shippable features (which goes along with the "done" criteria), whole team, synchronous communication, and visible customer stories were all high.
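A team's "done" criteria amount to a checklist that every feature must pass in full. A minimal sketch, using the criteria mentioned above (the helper name is hypothetical):

```python
# Illustrative "done" checklist; the criteria come from the interview,
# the function and variable names are made up for this sketch.
DONE_CRITERIA = [
    "acceptance tests pass",
    "automated unit tests written",
    "testing complete",
    "known defects fixed",
]

def is_done(feature_status):
    """A feature is done only when every agreed criterion is satisfied."""
    return all(feature_status.get(c, False) for c in DONE_CRITERIA)

status = {c: True for c in DONE_CRITERIA}
is_done(status)                       # True: all criteria met
status["known defects fixed"] = False
is_done(status)                       # False: one unmet criterion blocks "done"
```

The all-or-nothing check is the whole idea: a feature that merely "works in the demo" but fails any agreed criterion does not count as done.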

Some of the lower ones: pair programming (it breaks my heart), burn-down charts, and code and design inspections, which are not necessarily an Agile practice, so that’s not that surprising. The use of Planning Poker for estimation was low. That’s another emerging practice that wasn’t around 10 years ago; teams who use it usually like it, but I don’t think it’s really out there as a practice - there’s not a lot of awareness of it.

Stabilization iterations were low. A stabilization iteration says that from time to time you have an iteration that focuses on things like performance, integration, and stress testing, and on cleaning up your defect backlog. That’s an emerging practice, but it was low. And then finally, Kanban, as I mentioned before, was low, but that’s an emerging practice too.

   

7. Looking back, any trends that you can see emerging out of that?

I think I see a trend towards caring more about quality. Scrum has been the most popular methodology for a number of years, and you can be a Scrum team without caring about product quality - you can skip automated testing entirely. I think teams who follow just plain Scrum are now realizing they need to bring in some engineering practices so they can get product quality. That’s a trend, and then Kanban seems to be an emerging trend. I’m going to go to a lot of Kanban sessions at this conference to find out what people are saying. My first degree is in industrial engineering, so from an intellectual standpoint Kanban and software don’t seem to go together. So I’m interested to hear what people have to say.

   

8. At the conference you’re talking on Comparative Agility. Isn’t it all or nothing?

No, no two Agile teams are the same. Based upon what the project is, what the team make-up is, how safety-critical it is, and what the "ilities" of the project are, we determine which Agile practices are best for the team. So, as I said, each team is different. Comparative Agility is an assessment tool. A team or an individual can go online and answer questions. The current version of the survey has 125 questions, so there are quite a few. We’re in the process of revising it, and we’ll have far fewer questions, but you go in and answer what your practices are.

At the end, you immediately get a bar chart that tells you how you are doing versus the industry, so you could find out, "We’re not doing test-driven development and everyone else is, so we’re down on the scale on that." Or you may find out that no one else is doing it either. So it gives you some information about what everyone is doing.

   

9. I suspect then it’s a tool that gets better with the more people that use it.

The more people that answer it, the better the feedback you get. One of the reasons we did the survey I talked about earlier was that the Comparative Agility survey didn’t necessarily align with what the community says Agile is. I’m working with Mike Cohn and Kenny Rubin to revise Comparative Agility to be in line with current trends in Agile.

   

10. This survey is freely available - people can take the Comparative Agility survey themselves? We’ll make sure that the link is included in the text that accompanies this interview.

The survey is at www.comparativeagility.com, and the new survey is at http://feedback.comparativeagility.com; that’s where you can take a look at the revised questions in the short term. In the long term that site will go away and the revised questions will be on www.comparativeagility.com. It’s just a web link: you go in, you start answering questions, and at the end you immediately get feedback on your answers. Teams who want to take the survey together and collect the responses can contact Kenny Rubin, who will create a customized collector for them, so they can look at how their team answered. They can assess what their team should be doing, so that they can create process improvement plans based upon that.

   

11. Sounds like a good tool. What’s next? Where are you heading in terms of your research and your work, given that you’ve been doing a lot more coaching out there in the wild, so to speak?

In a number of areas. As I mentioned, doing the coaching and the training makes me listen to what things people are skeptical about and what kinds of problems they are having, and then I think about the research results related to those. If there aren’t any, that’s a ripe opportunity. One of the things I’ve noticed that I’d like to research is people’s resistance to letting go of a very thorough requirements document. They think that they’re never going to get those details out unless they have that kind of document.

When I compare the level of detail of a traditional requirements document to stories plus acceptance tests, the artifacts that emerge from Agile, I hope to let people realize that the details do come out. They just come out later, so people can let go of that requirements document. Also, iteration zero: having an iteration before the product actually starts in order to do some architecture, some database design, some UI development, just to get started. That’s not really covered in the literature, nor are its benefits. That’s something as well. Planning Poker is a collaborative estimation process, and there is not a lot of research showing that it’s a good thing to do.

My experience with Planning Poker is that it is an estimation process, and yet the estimate is almost a minor detail of the meeting’s benefit. If you have a Planning Poker meeting where the team collaboratively comes up with the estimate, then in order to produce that estimate the team has to discuss the requirement in such detail that the real benefit is clarifying what you actually want and aligning that with the estimate. I think it’s a really good technique, but there is not a lot out there about it.
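The mechanic that forces that discussion can be sketched simply; this is a hypothetical illustration of one Planning Poker round, not a tool mentioned in the interview:

```python
# Sketch of one Planning Poker voting round. Everyone reveals a card at once;
# agreement yields an estimate, disagreement triggers discussion and a re-vote.
FIB_CARDS = [1, 2, 3, 5, 8, 13, 21]  # typical card deck of rough story-point sizes

def poker_round(votes):
    """Return a consensus estimate if all votes agree, else None (discuss and re-vote)."""
    if max(votes) == min(votes):
        return votes[0]              # consensus reached
    return None                      # wide spread: talk through the story first

poker_round([5, 5, 5])   # 5: the team agrees
poker_round([2, 5, 13])  # None: the spread itself surfaces hidden assumptions
```

The second case is where the real value lives: a 2 and a 13 on the same story means two people understand the requirement very differently, and the ensuing conversation is what Williams describes as the main benefit of the meeting.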

   

12. Sounds like there are some interesting opportunities. Just to wrap up - what are you looking to get out of this conference?

Like everyone else, to see what are the latest things. Every year when I come, things have emerged, people have tried different practices, so just to get an idea of what’s to come.
