
Evidence Based Management with Scrum.org

In this podcast, Shane Hastie, Lead Editor for Culture & Methods, spoke to Patricia Kong of Scrum.org about Evidence Based Management.

Key Takeaways

  • Collect empirical evidence about the current state before adopting any new way of working or framework
  • Organizational transformation implies an end state, which doesn't actually exist; rather, take a continuous improvement approach
  • The four key value areas are current value, unrealized value, time to market, and ability to innovate, and each has a different set of metrics that should be tracked
  • Shifting from project to product thinking is a step in the right direction
  • Technical leaders need to set useful goals and ensure they align with business outcomes rather than focusing on outputs and activities

Transcript

Shane Hastie: Good day folks. This is Shane Hastie, the InfoQ Engineering Culture podcast. Today, I have the privilege of sitting down across many miles with Patricia Kong. Patricia, welcome. Thanks for taking the time to talk to us today.

Introductions [00:51]

Patricia Kong: Thank you for having me Shane across many, many, many, many miles. I'd rather be in-person.

Shane Hastie: Me too. It would've been great to be able to get together in-person. You are the product owner for Enterprise Agility at Scrum.org. First of all, I suppose, tell us a little bit about you and what brought you to that space and then what does product owner for Enterprise Agility do?

Patricia Kong: Oh, that's a good question. That might be the hardest question you're going to ask me throughout this podcast, isn't it? So backing into where I am now, what's interesting is I studied organizational behavior in business school. And so I've always had this interest in how people work. I don't know if I like people that much, but how they work is very interesting to me. And what that led me into... This is a little bit dodgy, but I was really persuaded to go into a corporate financial career, right? That's the typical Asian American going into finance. That's what I did, very corporate. And then that led me into different areas. Once I had enough of that, I was going through research, really coming into the marketing strategy and IT space. And I actually ended up moving to France where I was working in startups and looking at product management and product development.

And like most people, my back was against the wall. And then we go, "What's this agile stuff?" Well, we need an audit on all of the code of our product. How do we do that effectively? We were working with different teams around the world. And so I was doing that, doing a lot of different things, and then found myself coming back to the Boston area where I'm based and connected up with Ken Schwaber, who owns Scrum.org and who's the co-creator of Scrum. The company's not big now, but at that time it was much smaller in scale. And so we were looking at things around just streamlining the business, but also thinking about professionalism and learning. And he was working on something called the continuous improvement framework at that time. This is probably about 10 years back.

A continuous improvement framework [02:46]

Patricia Kong: And it almost seemed, when we were talking about it with the industry, that it was a little bit too early. And so he was trying different potential ways to think about this. How could we take something like Scrum and work it at the enterprise level, so that we could work on continuous improvement together? And to be honest, because there was a lot of just taking different attempts and looking at different methodologies, only two things landed solidly from that. One was this notion that you should inspect and adapt at the organization. So what evidence would you use to manage that? And the other thing was the Nexus framework. So Nexus is the scaling Scrum framework.

And the reason that came up was because of the whole rush for companies to get to scaling. And he said, "Wait a second, if you want to scale, let's talk about just a few teams working together. What would that look like? What is that core that we have to address there?" And then the other one is to say, "Hey, show me the evidence that agile is actually working for you, so that you know how to progress." So that's a really long story, but I think in a nutshell, that's how I am where I am now. And so I think about those things, those boundaries. You're using agile, you're doing things. What does it mean when you hit that boundary and it's not working anymore because of the enterprise?

Hitting the organizational boundaries [03:59]

Shane Hastie: This concept of Scrum or one of these approaches hitting the organizational boundaries is something we see and hear about a lot. If I think of the audience who are listening to this, probably a number of them will have hit that boundary and bounced. What's some advice when we want to be able to bring in some of the empirical approaches that agile methods and Scrum in particular talk about, and yet we're hitting, whether it's the annual budgeting process or the HR recruitment process? How do I influence those changes?

Patricia Kong: I think about this one from, I think, a soft skills standpoint. There's this notion of the way that we've seen this rush to agile, rush to Scrum, rush to scaling, rush to that. And the question of why we even need to be agile gets lost. Why are we trying to pursue that? What was it in the first place? And when it's really pointed to, it's fine, it actually makes sense. When an executive says, "That's really working well," and says the whole organization should do it: let's get the training, let's get it on the training schedule. There's a lot that's lost in understanding when you set up all of that training and all that stuff. When we come back and say, "Was it worth it? Was my money worth it?" We have nothing to show. We might just say, we have 20 teams, we have a hundred teams.

This is organized this way. And so there's no real evidence. And there are two things here. When the leadership says we're going to do something like an agile transformation, or there's an interest from the people that you're listening to, the question that I would have back is: which way did you bounce? Did you profit? Did you gain power from this kind of transformation and it's looking good for you? Or is this something that you've really struggled for and you are championing? Because there are two things that happen here. There are a lot of times when change happens and leadership sees themselves above the change; the change is not being done to them. And so that requires coaching and a lot of serious conversations. And then the other thing is that we invest in things where, sadly, we just have to say, we're going to say that this works.

Using empirical evidence to guide decision making [06:07]

Patricia Kong: And if not, in four or five years I'm out anyways; it's a revolving door. So it's really: which way are you bouncing? What we've been working on with this notion of evidence-based management is saying, "I don't want you to even come in and say Scrum is the way, and just bring Scrum to all the teams." That doesn't work either, right? You're going to have a waterfall team and a Scrum team, and everybody's going to get very confused. We think about the dynamics of organizational practices, product practices, technical practices. So what I want to know first is, I would look at something like EBM, where we've said, "Let's get into an outcome-based way of thinking," right? So really talking about customers: why are we doing these things, so that we can get more lean about the work that we pursue? And then, when we think about value, Scrum.org has defined that in four key value areas: unrealized value, current value, time to market, and ability to innovate.

Four key value areas and associated metrics [06:56]

Patricia Kong: Unrealized value, current value: it's really looking at the marketplace and saying, "What's that future potential value that's out there?" That's unrealized value, right? So what are those satisfaction gaps? And current value is: where are we now? And then ability to innovate and time to market: can we do it? And how long would it take us to learn from something? What we find from organizations is that they're just focused in one area. So let's have a conversation about that. That's where I would encourage the people who are listening to think about where they are really focused, and whether that puts them at risk of ignoring some other stuff that could help them make better decisions.

Shane Hastie: You touched on a really important factor around metrics. It's really hard to identify the value. And personally, I really struggle with the idea of organizational transformation that implies an end state. If we take a Kaizen continuous improvement approach, I would hope that organizations are constantly looking to get better and better. And when we take that approach, what are the metrics? What are the measures we should be looking at around improving as an organization and how do they relate to me as a technical influencer, or a technical team leader?

Patricia Kong: And that's the million dollar question where people try to nail me down and say, "Patricia, tell me exactly what to measure." So the way that I would want to talk about this from an organizational perspective is, I agree with you 100%, right? Not only does a transformation denote that there's a beginning and an end state, it makes it look like there was no journey, right? It was really quick. It's like a before-and-after picture of a person who went on a very long diet or something, right? We don't see the struggle in between, and the consistency and the hard work that they have to do to maintain it. When I think about the measures: for each one of these key value areas, we suggest different metrics called key value measures. And there are going to be ones that are quite obvious, especially from the technical perspective.

So from a software perspective, for ability to innovate, we're going to obviously look at things like technical debt that's holding you back. We're going to be thinking about the installed version index and make decisions, hopefully, by looking at, "Hey, how many of your users are actually on your current version?" We're going to look at time to market. People should know their lead time and cycle time and those different things. But what we want people to think about is that having an understanding of your flow measures, or different measures there, is good if you understand, and can communicate to your organization, the value they are in pursuit of. So that's one way. I would use these different key value areas as a lens for thinking about where we can improve. So for an organization where they say, we've released one time a year, and you go, is that good?

That's about context, because maybe before it was three years, right? And now if they're saying, hey, four times a year, that's great. In what context, what value is that in pursuit of? So that's where we really focus on the holistic thing. And a lot of the time when I'm talking to my colleagues or associates, there's still this surprise of, "Patricia, we go into organizations and it kills me to see that software teams, the developers, are still being goaled on exactly what I was goaled on, which was velocity, to deem that as progress." And so the other conversation here is to not only look at the key value areas, but really understand, and have it communicated, from an agility perspective: what is that outcome-focused, customer-focused goal that we're trying to get to? And how would you actually know if that was successful?

So the whole thing about inspection and adaptation: what we don't see, and why I feel so passionate about EBM, is that the adaptation isn't there. And I might even say the adaptation isn't there because a lot of the time we don't say, how will we know when this goal is met, and what would we do to adapt from it? And so we break down goals, thinking usually about them from a time perspective, and looking at not only what capability you have as an organization, but whether you are really on that journey of understanding how you're improving. Right, exactly like what you're saying about Kaizen; EBM is very inspired by the Kanban model, things like that.
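
To make measures like these concrete, here is a minimal sketch of how a team might compute a few of the key value measures Patricia mentions (installed version index, lead time, cycle time) from data it probably already has. The field names and sample numbers are illustrative assumptions, not taken from the EBM guide.

```python
from datetime import datetime
from statistics import mean

# Hypothetical active-user counts per released product version.
active_users_by_version = {"4.2": 1800, "4.1": 650, "3.9": 550}

# Installed version index: share of active users on the current version.
current_version = "4.2"
installed_version_index = active_users_by_version[current_version] / sum(
    active_users_by_version.values()
)

# Hypothetical work items with timestamps a team might already track.
work_items = [
    {"requested": datetime(2022, 3, 1), "started": datetime(2022, 3, 8), "released": datetime(2022, 3, 15)},
    {"requested": datetime(2022, 3, 2), "started": datetime(2022, 3, 4), "released": datetime(2022, 3, 21)},
]

# Lead time: request to release. Cycle time: work started to release.
lead_time_days = mean((i["released"] - i["requested"]).days for i in work_items)
cycle_time_days = mean((i["released"] - i["started"]).days for i in work_items)

print(f"Installed version index: {installed_version_index:.0%}")
print(f"Average lead time: {lead_time_days:.1f} days, cycle time: {cycle_time_days:.1f} days")
```

The arithmetic is the easy part; the point Patricia makes is communicating which goal each number is in pursuit of.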

Shane Hastie: One of the most common shifts that we talk about with organizations in this space, and it's an area I'm personally passionate about and have actually written a book on, is the move from project to product. How do we inspire that sort of shift in organizations?

Shifting from project to product thinking [11:18]

Patricia Kong: We've been trying for so many years. I don't know. We just keep trying. What is this? Over two decades? I think there's going to be this pressure that if you don't, it's not going to work. So there's a slight sense of that. And there's so much industry data, right? When we think about just the way that companies make decisions, and it just gets so siloed, there are actually a lot of decisions that get made from an organizational perspective that either hurt performance or don't do anything. And so they're not really taking a look, like you said, at the product perspective. I think we went from project to product; I wonder if there's something else there, right? If it's going to be something that's a lot more nimble. But the thing that I think about is, if you really understand value, then you're going to start thinking about how you invest for value, which is how I think about how you would invest, understanding your goals.

The problem is, when there are product implications, some people will say, when we're scaling, we're just scaling, not because of a product and what that needs, but just because of the amount of people we have. And I don't think that we're right now in an economic situation that will tolerate that for much longer. And I don't even mean the people, I just mean the market. So what we're trying to do is encourage people to think about, again, those outcomes, the gaps, the personas, all that good stuff, and then say, "You don't actually have a scaling problem. You have a product problem. You are trying to architect your products around an ancient system from years ago. And I understand that that used to be optimized for cost, but how do we have those conversations?" So those are all the other things that come into play.

But I think what we've seen with EBM, when we were able to do that, right, just start where we can, is that when we started to use the measures to show progress around the goals of what we were doing, it created a true alignment, and this is where it needed really top-down support and bottom-up. We started to have a conversation. And what was amazing was the developers, the teams, were saying, "I understand why I'm doing this now. And this is how we're making progress toward that. And not only that, here are some ideas and solutions that I would suggest, because I'm very close."

Measuring and setting goals around outcomes, not outputs [13:31]

Patricia Kong: So everybody is talking in that same language. We've seen, in that same structure for this product, the middle management, which we usually call the frost layer, right? All of a sudden, they're not being goaled on output. They're being goaled on that same outcome. So now they're really expressing qualities of servant leadership. How can I remove the impediments? What are you struggling with? And we even saw it up to the executive level, where, when they're looking at these charts and these graphs and they're having conversations, it's no longer about the progress of a project, but of a product. And that was really powerful, to understand that, and that behavior starts to slowly change some of the things. But honestly, I think we'll be having this conversation still in five years.

Current value and unrealized value [14:12]

Shane Hastie: So let's explore some of the factors that you were talking about, the elements of value, as a starting point: current value and unrealized value. Current value makes sense to most of us; unrealized value is the potential. Then there's ability to innovate. Why is that a business value metric?

Patricia Kong: The way we actually look at this is: if you have anything you want to pursue, then the ability to innovate is going to tell you, from a value perspective, because these things for us really all contribute to value, how we can maximize the organization's ability to deliver new capabilities, right? So we're going to ask two questions: what is preventing the organization from delivering new value? And what prevents the customers or the users from actually benefiting from that innovation? Even if we've come straight in here for an organization, a lot of the time what we see is: we just need to build new stuff, build new stuff and features. And we said, well, actually, here are some numbers that we've been taking on the different versions that your customers are on, and you are spending $10,000 a month on your support team to support people who are all on older versions of your products.

So what do we need to do now? And this was where we chose, from a technical perspective, to start making things a little bit more seamless instead of just throwing in and adding more features. That's what the CIO wanted. He was saying, "Let's just build and add more stuff, add more stuff." And we said, wait a second. You're actually spending a lot of money here, if you want to improve your customer satisfaction and your employee satisfaction. That's the thing. Another interesting thing about ability to innovate that we've been thinking about is the relationship between innovation, ability to innovate, and employees and humans and their ability to ask questions and to be curious. And if your employees aren't there, and they hold a lot of knowledge, then that's going to hinder your ability to innovate. So that's something that I've been thinking about, and it's obviously closely related to your time to market, because if you can't innovate, that's going to potentially affect your progress toward your goals.

Shane Hastie: And again, for the technical team leader, how do I influence this? How do I help?

Questions technical leaders should be asking [16:17]

Patricia Kong: So there are different things that I would start looking at. I would start saying, you know what, if I wanted to measure one thing, maybe two things, and I'm really thinking about my ability to innovate and how that helps or increases our time to market, I would start looking at things like: what is our time to learn and what is our time to pivot? So I would be working with a team to really, basically, answer those questions. What is preventing us from delivering new value? Do we have several branches? Are we scaled and we can't integrate? If we can't integrate, what is that cost? Right.

So now, when we start to bring numbers into ability to innovate and time to market, when we look at something like scale and balance and all that, you will quickly find, I think in my experience, a lot of costs that are being incurred because of all this overhead that we have. And that's going to be something that relates to your ability to innovate. When people talk about value stream mapping or those things, that's a great place to look. What I would encourage those technical leaders to do is think of yourselves as technical leaders, but also from a business perspective. So always drive that conversation back to value, or else I think there will always be the stance that we're just technical leaders.
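
As one concrete starting point, here is a small sketch that estimates "time to learn" and "time to pivot" from dates a team probably already records: when an idea was raised, when the change reached users, when feedback arrived, and when the team responded to it. The field names, sample dates, and the exact way the two measures are cut are illustrative assumptions, one reasonable reading rather than prescribed definitions.

```python
from datetime import date

# Hypothetical records for two changes, with dates a team might already track.
changes = [
    {"idea": date(2022, 1, 10), "released": date(2022, 2, 7),
     "feedback": date(2022, 2, 14), "responded": date(2022, 3, 1)},
    {"idea": date(2022, 3, 1), "released": date(2022, 3, 15),
     "feedback": date(2022, 3, 22), "responded": date(2022, 4, 20)},
]

for c in changes:
    time_to_learn = (c["feedback"] - c["idea"]).days       # idea -> first real feedback
    time_to_pivot = (c["responded"] - c["feedback"]).days  # feedback -> adapted release in users' hands
    print(f"time to learn: {time_to_learn} days, time to pivot: {time_to_pivot} days")
```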

Shane Hastie: That's a bit of a mindset shift, a culture shift. How do we help people along that journey?

Patricia Kong: I think the gurus of that are small children, who created the five whys, right? The management technique where they go: why, why, why? And honestly, that's what's gotten me onto my path right now, really being very curious and asking some questions that are very focused, not only to gather information and facts, but, when you are talking to the right people that have influence, and to the people who have influence on what could be the best solution, to ask them questions that get them very curious. As a leader, those are the things that we should be doing; that servant leadership is really pulling that out. And I found that to be necessary not only at the management level, but also at the team level. So I was recently looking at some research. And what they found, from this curiosity perspective, was that executives and management think that curiosity is pervasive in organizations.

There was a study done by two gentlemen, I'm missing their names right now, but they did a study of over 16,000 people. And over 80% of executives and management thought that curiosity existed in the organization. Super great. You get incentivized for these things. But that number was much, much lower for the actual individual contributors, the people on the team; they did not feel that it was okay to be curious. And the number one reason why was that they did not think that if they asked a question they would get an honest answer; it's not that it was unsafe or risky. And so there are two things that are interesting there, especially from a leader perspective, from an agile perspective: how do we encourage that so that we can actually innovate, and not lock the great ideas, the innovation, the power up top, which additionally gets locked into annual budgeting, or once every two years, or maybe we're doing really well now at six months.

So there are those things at a larger level that I would look at. But in terms of what could I do today: I would actually really start to have this conversation of, who are my customers? What do they really want? How can that increase my time to market and ability to innovate? We have a gentleman in Canada who is working with a bank where they were convinced that they needed to develop this whole scaling initiative, because they needed their mobile app to mirror their website for the banking services. And when they actually went back and asked, "You know what, what do people actually want from a mobile experience?" they found that the people who use the mobile app only trust it for two things: look at their balance and transfer balances, maybe pay some other people.

And so they said, "Wow, we don't need to mirror everything on the website. Now we can have a really simple app. We can do this much faster to time to market." Current value customers were happier and they didn't need to have five teams. They were able to do this with two teams. And so when you start to lead with that stuff too, and you have numbers, now everybody can have some evidence to actually be curious about, and let's make some adaptations off of that. And I think obviously it's from at Scrum.org, we have the Scrum framework to drive that focus of how do we incrementally do these things? And to actually see that something's working because people will ask me, "Patricia, how do you know if EBM's working?" I said, "One way I know it's not working is nothing changes." So that's what we would look at.

Shane Hastie: Let's explore a concept from the white paper that you've published. You've got this thing called the experiment loop and the small steps. What's the experiment loop, to start with?

Introducing the experiment loop [20:55]

Patricia Kong: The experiment loop is exactly that. You have a hypothesis, you're going to try some stuff, and you're going to measure it. And you're going to find out if it works. And if it doesn't, you make a decision; if it does, you make a decision. And that's really just the Kanban model, right? It's just: let's run an experiment. And we find that interesting, because a lot of people, I think, still don't think of, for instance, the product backlog items in their product backlog as an experiment, as a hypothesis. They're not thinking about them in those terms, or whether there is a way to have that conversation. The other reason this is really important: for us, we'll say, "Okay, you want to be somewhere, you have the next goal that you're trying to reach. What is the experiment that you're going to try in order to reach that goal, in pursuit of a larger goal?" That's important because it gives us some time to eventually say, is that goal still valid?

And I think that's really something to think about right now, as we come out of and go back into the pandemic. And there are a lot of different other factors that we're looking at now, right? So people talk about the future of work. How do we think about all these other dynamics of what's being demanded, not only from the customer, but from employees? There are a lot of goals and a lot of things that we are working on that are not valid anymore. And so what is really important to me is to think about this experiment loop, right? We're going to try something. We think about what the outcome is. We're going to figure out if it worked or not, and really think about that in a way that allows conversations, so that we don't only go, hey, success for us as a team. If a leader is thinking about how they can help their team improve, it's not that success is just, hey, work's completed.

Things were done, or it was just activity and output, really easy things to measure, like you were kind of hinting at before. If success is "we're done," we're trying to push that to the level of: we're done when we understand what value is delivered, and we include that notion of feedback in there. So that's really what I want people to start to get, that muscle of cycling through. It's essentially empiricism, isn't it? That's all it is; we're really, really, really looking at that. But in the times that we've been seeing organizations work on this, especially when you talk about making structural changes, scaling, it's a game changer, because we are starting to make decisions based off of information and evidence rather than the loudest person in the room.
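
For readers who want something concrete to try, here is a minimal sketch of the kind of record the experiment loop implies: a hypothesis, a measure, a target, and an explicit decision once the result is in. The structure, names, and thresholds are illustrative assumptions, not the EBM guide's wording.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Experiment:
    hypothesis: str            # what we believe will happen and why
    measure: str               # the value measure we will inspect
    target: float              # what "it worked" looks like
    result: Optional[float] = None

    def decide(self) -> str:
        """Turn the measured result into an explicit decision, not just 'work completed'."""
        if self.result is None:
            return "still running: gather evidence before deciding"
        if self.result >= self.target:
            return "worked: keep the change and pick the next experiment"
        return "did not work: adapt the approach, or revisit whether the goal is still valid"

# Hypothetical example: a backlog item framed as a hypothesis rather than a task.
exp = Experiment(
    hypothesis="Simplifying the mobile sign-in flow will raise weekly active users",
    measure="weekly active users on the current app version",
    target=12_000,
    result=9_500,
)
print(exp.decide())
```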

Shane Hastie: Again, that requires a culture shift in many organizations: getting the honest answer, the fear of failure. Experimentation almost by definition says sometimes we're going to get it wrong. How do we make it okay to get it wrong?

Culture that enables experimentation [23:43]

Patricia Kong: Well, we keep those experiments small. It's the same thing, right? And this requires a mindset shift; it really requires a lot of things. It requires a power shift. It requires a recognition of how power exists in our systems and in a company. But I think if it is an experiment, where we're going to say we're going to try something, and we're going to show information that it worked, and we're going to start to realize why we're doing something and what's important, and that there are outcomes there, it'll be a better place of working. The culture shift is interesting, because people say, "Oh, you have to change the culture first." And we hear a lot of that. And there may be some people, myself included, who feel like, you know what, there's no better way to change a culture than just to get something done.

So this is again about what the teams can actually do. And let's get transparent and use the retrospectives, and use those ways to figure out how we work together better so that we can be more effective in delivering value. There are so many other things: for coaches, it's so valuable to see things, examine, and help teams improve. And it might be those little shifts first, right? So is there empiricism? Is there curiosity embedded in the meetings, for us in the Scrum events that are there? Are they talking about outcomes? Are they talking about experiments, or is everybody silent and just saying, "This is a status update. This is the work that's been done"? At best that's lipstick on a pig, isn't it? So there is that, but I would say: can we see what successes already exist from the teams that are already doing this type of work?

And that's the reason I think capturing evidence matters: saying, "Hey, how's our time to market? How's our ability to innovate?" And if that's the world you live in and that's all you can do, because maybe you live in another country and you're responsible for this box of code that's going to go into a car in Germany that you'll never see, for some German person. If that's all you can do, that's great. If you understand why you're doing it, that might motivate you better.

But if we can have this whole concept of value and what we're doing, I think that evidence will be really handy when, for instance, the agile transformation is still trying to exist in the organization, but the CEO has been swapped out. You now have to sell agile again. Or do you? You can just take the evidence and say, this is what's happened in this way that we work; this is what we've learned. I think that's important. And that culture shift, like you were talking about, marrying the notion of psychological safety and curiosity with accountability, that creates a learning organization. And I think that's the future of where people want to work.

Shane Hastie: This concept of unrealized value, potential value. How do we even make that visible?

Making unrealized or potential value visible [26:24]

Patricia Kong: So that one is usually pretty hard for people to grasp. And there's something that we ask people to consider called the dissatisfaction gap. And what I see from a lot of people who have a traditional mindset is that they look internally at what their satisfaction gap is. Where am I unsatisfied? Where are we dissatisfied with our organization? This is saying: where is that satisfaction gap for an end user or a customer using your product or service? What is their current experience? What is their desired experience? And is it worth it, for me, to close that gap in between? So that can look like something like: yes, there's a new feature, yes, there's a new something. And what I've experienced a lot is actually quality issues, because a quality issue means that I am unhappy with the experience that I'm having. Now, it's not just quality; it's whether it's worth it to close that gap.

There was a company in Germany... really quick story. They created a COVID app really fast so that they could track people, if they'd been to a restaurant, all that. It was great, right? We're helping the system. Except all their data got exposed, right? So they did it really fast, it was really great, the government invested millions of dollars in it, and then all the data was exposed. And they said, is it worth it for us to close that? That's an example, for me, of unrealized value. Should we just maybe randomize the data? What are small things that we can do? Is there value in that? Or will people just keep using it because they don't care about their data anymore, it's COVID? Those are the kinds of things I would think about.

Shane Hastie: If people want to find out more and keep in touch with you, where do they find more and where do they find you?

Patricia Kong: I'm reachable through the internet, just like you are now. So LinkedIn is a great way to connect with me, Patricia Kong, K-O-N-G. When people write a note, I'm really happy to have chats and talk about how to get started. For evidence-based management, there's a guide, just like the Scrum Guide and the Nexus Guide, that's available on the Scrum.org website. And this has been eight years coming. We recently released a one-day workshop that's available virtually, in our leadership curriculum, called PAL EBM, for Professional Agile Leadership.

There's an assessment, like we have those assessment certifications, if people want to try their hand at that, but there's also an open assessment. And the reason I bring up the assessment is not so much for the certification, but because we have a really great suggested resources page that we've put together for people to think about this notion of business agility. How do you invest for business agility? What does it mean if you were to think about this with OKRs, and how do we make those better with evidence-based management? How do we think about outcomes? So there's a lot of stuff out there on the Scrum.org website.

Shane Hastie: Patricia, thank you very much for taking the time to talk to us.

Patricia Kong: Thank you for having me, Shane.
