Bio: Laurent Bossavit is mainly known as an Agilist and was a recipient of the 2006 Gordon Pask Award from that community. He still likes to code, though he no longer does so full-time, dividing his attention between assisting Agile teams on technical and organizational matters and the Institut Agile project, which collects empirical evidence on the benefits and limitations of Agile practices.
Each year the Agile Alliance brings together attendees, speakers, authors, luminaries, and industry analysts from around the world in a one-of-a-kind conference. The conference is considered the premier Agile event of the year, providing a week-long opportunity to engage in wide-open interaction, collaboration and the sharing of ideas with peers and colleagues within the global Agile community.
Shane: Good day, this is Shane Hastie with InfoQ. We are here at Agile 2013 with Laurent Bossavit. Welcome, and thank you very much for taking the time to come and talk to us today. You are a board member of the Agile Alliance, a long-time Agilista, and you currently also have a part-time role as the Archivist for the Agile Alliance. Could we start by talking a little about your engagement with the Alliance and what the role of the Archivist is?
I served on the board for about four years. We were hashing out policy and vision back then, which, in retrospect, I don't think really suited me. When I decided to get involved I was more interested in actually doing stuff, and since I'm a programmer, one of the things I thought I might be doing was writing code for the Agile Alliance. I didn't do any of that for four years, but after a while an opportunity came along to help the organization with some very unglamorous housekeeping: having a site where we could pull together the session materials and descriptions from all of the conferences. That didn't exist back then, so we had the problem, which I think many conferences still have now, where a new conference comes along, people get excited about it and forget about the previous years, the domain name goes down, maybe someone didn't keep backups, and stuff that is precious to some people just vanishes, basically. I wanted to prevent that from happening, so I came up to Phil [Brock] one day and said: "Would you like me to be the Agile Alliance Archivist?" And that is where it started.
Every session abstract, session type and author name is there: if you are interested in seeing what talks were given by Bob Martin, Andrea Freeze or me, you can just bring up the search box and get that list. You can also search within session descriptions and get the slides. It's very basic stuff, but we didn't have it.
One of the reasons for doing that was the people I met. My day job is as a consultant, and clients and practitioners were always asking: "Where can I find some kind of basic reference material, a glossary? I need to know what this strangely-named practice is about in a very short space; I don't have time to read the book." It seemed to me that it was the Agile Alliance's job to put that kind of thing out. So I spent some time working on that, and the result was the Guide to Agile Practices. That's one thing, and now we are trying to see what kind of supplementary material would be a good complement to it, so we now have a blog where we are inviting some people to come in as guest bloggers, and we are slowly building up that corpus of content.
Shane: One of the things that I recall seeing on there is a very interesting "subway map". Tell us a little bit about that, what looks like a subway map.
All right, so to switch tracks, because it's actually related to the work I'm doing with the Agile Alliance, but in a very roundabout way; maybe we'll get to bringing up the relation. That talk is a session that I have been doing at various venues several times this year, so that is kind of a change for me, because so far I have liked to do different talks at different conferences and mix things up a bit. This is a topic that I'm really passionate about. It came out of my interest in the research of a fellow named Philip Tetlock, who wrote about how experts get things wrong, even people who are professional experts, people who are paid to make predictions. I found that research very interesting, and he set up a research project to investigate that with normal people, people who are not paid to act as experts and to make prognostications, basically.
So I enrolled in that program and I learned a lot of things. I'm not going into the details of the sessions, but it's about making predictions about future, and therefore unknown and uncertain, events. That turns out to be a very interesting topic, closely related to a number of things which are important in software development, chief among them estimation. Estimation is about quantifying some kind of prediction about a future event, specifically how long it's going to take to develop some piece of software. That having long been a pain point for me in my practice of software development, I wanted to investigate that area. Over a couple of years I ended up spending many, many hours, countless hours, really invested in finding out how you could improve at forecasting things within a fairly specific technical framework. So that took on the aspect of deliberate practice, and I thought that having invested so much time in it, I might as well at least get a talk out of it. So I started putting together some slides, a presentation, and I thought it would be of interest to software people.
I was wondering if it could have broader applicability, and I got some interesting opportunities to test that. The first time I actually did the session was at a Scrum meeting in Paris, back in September last year. It was one of those Indian summers, the weather was still very nice, and we had a meeting on a Saturday, so we decided to have it in the woods of Fontainebleau. Because the weather was nice, one of us said: "Can we bring our kids?" and so a bunch of us decided to come with our kids. Quite unexpectedly, I ended up doing that session with an audience of about ten grown-ups and ten kids, including three of my own. If you think you've seen a tough, stressful audience, wait until you've had that kind of setup. I was completely unsure how they would take it, so I took some time at the beginning to say: "You won't hurt my feelings if you walk away or go have fun somewhere." But they actually ended up staying the whole hour, from six years old to sixty years old. I had to improvise on motivating the topic of the talk, so I started with: "Have you ever seen your parents fight?" and yes, a bunch of them raised their hands, and I addressed the parents. Has that happened? Sure, it happens. Why does it happen? Because a lot of the time the culprit is black-and-white thinking: both people in a fight are typically convinced, absolutely convinced, that they are each right and the other wrong. So that was the starting point, and the session built on that and introduced the Art of Being Wrong.
No, it’s very much training in thinking in shades of grey.
Shane: Another thing that you've been busy with lately is a book that you are putting together and publishing incrementally on Leanpub, The Leprechauns of Software Engineering. I've read what is there so far and really enjoyed it, but would you mind telling our listeners, our audience, about it?
Yes. To start with, I love Leanpub; I think it's a very cool concept. It's one of those things that are kind of obvious in retrospect, but until someone comes up with it, it's not so obvious. What I like about it is that I'm able to have my book out there; people are buying it, downloading it, giving me feedback, which goes into giving me new directions to think about. Meanwhile I have absolutely no deadline pressure. There are 160 pages out there; I'm writing more stuff, but I'm putting that mostly on my blog and waiting for the right time to collate and organize the material. It's really way better than the one time before when I had a book deal with a publisher, and there were all those deadlines and I needed to estimate how long it would take me to write a chapter; I had no idea, and it was very stressful. So this is much more my speed, but the book is out there and people can get it, and there is a 100% money-back guarantee if anyone is not happy with the state of the book.
So it’s a collection of things that basically debunk facts or beliefs that are very well established within the Software Engineering Community but witch turn out on closer examination not to be true. And I want to say I did not send out to be a debunker, didn’t wake some day and think: “There is all this misinformation or lies out there, I need to set the record straight”. The project came out of sense of curiosity, that’s how that relates back to the Archivist activity. I was trying to reconstruct history of ideas within the Agile Community, I wanted to give people some kind of accurate impression of how we came to adopt this or that idea, or this or that practice within the body of knowledge that the Agile Community is responsible for now, is a custodian for.
So I started investigating the history of Agile. As part of that I also started investigating the history of what came before: the history of the traditional models, the waterfall model, and the precursors to Agile, things such as the Spiral Model, that is Barry Boehm's, or Tom Gilb's Evo model, which I think came first. As I was investigating, I came across things that I had long taken for granted, and I went back to the source material. I wanted to be able to say: "That is the exact thing that was said by this person at that time, that is how the idea came to life, and that is how it was validated." And it turns out that in some cases, including some very important beliefs, what I call the Grand Truths of software engineering, the evidence just didn't line up with the claims.
So one example that I go on about at some length is the claim that it costs exponentially more to fix a defect as you move through the phases of the software lifecycle; that is the original formulation. Of course today we don't believe so much in the lifecycle anymore, but we still have this idea kicking around that a bug costs more to fix, and the cost rises exponentially, as time moves on. I looked at the articles and papers that are cited in support of that claim, and in many cases there are actually no numbers in the papers, so it's very hard to say that the papers support the claim. Some of the papers are about something else entirely; one of them is Fagan's original paper on inspections, but it has almost nothing about the cost of fixing defects. For one of the papers that has the most data, actual numbers, I took the initiative to put the numbers into a Google spreadsheet and make a graph, and instead of a nice, smoothly rising exponential, what I found was a kind of mountain curve. What you would actually take from that curve is that it is usually much cheaper to fix a bug in maintenance than in the previous stages, which completely contradicts the original claim.
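The spreadsheet check he describes, laying out a paper's per-phase defect-fix costs and seeing whether they actually rise phase over phase, can be sketched in a few lines. The numbers below are hypothetical placeholders for illustration only, not figures from any real study:

```python
# Sketch of the sanity check described above: given per-phase costs to fix
# a defect, test whether they rise strictly phase over phase, as the
# "exponential cost of defects" claim would require.
# NOTE: these numbers are hypothetical placeholders, not data from any study.
phases = ["requirements", "design", "coding", "testing", "maintenance"]
cost_per_defect = {
    "requirements": 1.0,
    "design": 4.0,
    "coding": 9.0,
    "testing": 12.0,
    "maintenance": 5.0,   # a "mountain" shape: cheaper again in maintenance
}

costs = [cost_per_defect[p] for p in phases]
# An exponential rise implies each phase costs strictly more than the last.
strictly_rising = all(a < b for a, b in zip(costs, costs[1:]))
print("consistent with an exponential rise?", strictly_rising)
```

With a mountain-shaped series like this one, the check prints `False`; that is the kind of contradiction the graph made visible at a glance.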
When I write about that, some people kind of shrug, or I imagine them shrugging from the content of their comments, saying: "How is that a big deal? People have been publishing misleading statistics for ages." I think it actually is kind of a big deal, because that one claim, for instance, is the underpinning of many of the fundamental principles in traditional, for lack of a better word, software engineering. That is where you get, for instance, the idea that you should focus on requirements: if it is very much cheaper to fix a defect in the requirements phase than later on, then obviously you want to invest a lot of effort in requirements. So as I was finding more and more of these things, I came to question more and more of those Grand Truths. I even found quite a few Agile leprechauns, things the Agile community believes to have been well established which in fact turn out not to be true. One of my major finds is what I call the "37 Billion Dollar Bluff": there is supposed to have been a survey of US Defense Department projects back in 1999 that found that 75% of those projects were failing, and that has no basis in fact that I've been able to locate. There was a study back in the late 70s, in 1979, which found the exact same percentages that are claimed for the '99 study. I have no formal background in statistics, but I have consulted with statisticians, and the numbers match so precisely that one of those sets of numbers has to be a copy of the other, a memetic copy if you will: somebody took those numbers from one study, and somehow people attributed them to the later survey. I don't know exactly what went on there.
Shane: And that has been accepted as fact?
It's been quoted by Craig Larman, for instance. I've discussed that with Craig, so I'm not outing him with any malice, and he has admitted that at the time he didn't actually check the original sources. He got fooled; many of us got fooled by many of those things. So in the book I also go on about things that I sincerely believed to be true at one point and on which I've changed my mind, and part of that was the training that I got over the past few years in the Art of Being Wrong. That is the tie-in.
Shane: So being able to acknowledge that these fundamental truths might not actually be so truthful.
Being able to question what you believe to be true, being able to change your mind. One of my arguments in the sessions is that this is actually a key skill, a life skill, a professional skill, so it's something that is well worth developing, and the book was kind of a side effect of that practice.
Shane: Laurent, thank you very much for taking the time to talk to us today. I encourage people to go to Leanpub and have a look at The Leprechauns of Software Engineering; you certainly opened my eyes with that book. I appreciate it and look forward to hearing your talk.
Thank you for having me, Shane!