
Failing Fast: the Impact of Bias When Speeding up Application Security


Summary

Laura Bell explores how bias impacts the security of a development lifecycle and examines 3 common biases that lead to big issues in this space.

Bio

Laura Bell has almost a decade of experience in software development and information security, specializing in bringing security practices and culture into organizations. Laura is co-author of Agile Application Security from O’Reilly Media, a member of the board for non-profit Hackers Helping Hackers and a program committee member for the O’Reilly Security Conference in New York.

About the conference

QCon Plus is a virtual conference for senior software engineers and architects that covers the trends, best practices, and solutions leveraged by the world's most innovative software organizations.

Transcript

Bell: My name is Laura Bell. We're going to be talking about failing fast, and not in the cool way. We all want to fail fast and learn things, but these are some of the ways we can fail fast by overlooking our own biases when we're trying to speed up application security.

The Desire to Go Faster Is Human

The desire to go faster is really natural. We're all quite lazy as people. Humans are a lazy species. We want to go fast. We want it to be easy. Application security is no different. We want to make sure that when we're securing our applications, we're not getting in the way of that progress, of that speed. Some of the ways that we behave, and some of our innate biases, do get in the way. They create weaknesses and vulnerabilities where there shouldn't be any. We're going to talk about three of those. Specifically, we're going to talk about seniority bias, tool bias, and recency bias, and how each of these can manifest inside your teams and your practices to create vulnerabilities and problems that could lead to security issues later on. We're going to round out by overcoming that bias, and learning how to go even faster.

Agile, DevOps, and DevSecOps

I'm going to start with a little bit of a caveat. I was the co-author of the O'Reilly Media book, "Agile Application Security." Whether you do agile, DevOps, DevSecOps, or something in between, I really don't mind. This talk isn't about particular labels. Security isn't about particular labels or dogmas or styles. Security is security. We want to protect things, whatever terminology we're using for our way of working. Whatever label you use, just park it for now. It doesn't matter. Let's just assume we all want to go fast and we all want to get more secure as we go.

Seniority Bias

The first bias we're going to talk about is seniority bias. That's the idea that those of us who are more senior, a little bit older, who have been doing it longer, are somehow better at securing our applications than everyone else. This is the type of bias that manifests with sentences like this occurring in your team: "We can only go faster if we hire experienced or senior engineers." This is the type of language you'll see in high-growth companies who think everyone needs to be a 10x engineer or a ninja. I hate those phrases, so I'm sharing my own bias there.

Symptoms of Seniority Bias

Let's do a little bit of a self-assessment. The symptoms of seniority bias are as follows. Does your team only advertise roles or recruit for senior-level engineers? Is it difficult for people to come into the engineering space in your organization, so there's a lack of juniors, or a lack of people transitioning between technical specializations? Are named individuals responsible for peer review, pull requests, or code changes? In security, we also call this key person risk: the idea that there's one person who is central to processes, and if they take a day off, everything stops. Look at your world, are you that key person? Are you that named individual, the only person who knows about a particular system or a particular style of application, so everything goes through you? If that's you, we've got a problem with seniority bias.

We also have assumptions around common sense. I've been guilty of this myself, you may have too. Saying, "It's common sense: if you're a senior engineer, you know how to do security and how to solve complex development challenges." The reality is, we don't. We all come from different backgrounds. Those backgrounds give us different sets of skills and experience. Some of that will include security knowledge, but for some, it definitely doesn't. Just because you've been doing it a long time, it doesn't mean you've learned the same things as everyone else on the way.
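That "named individual" symptom often lives somewhere very concrete: your repository's review routing. Here's a minimal sketch using GitHub-style CODEOWNERS rules (the user and team names are hypothetical), contrasting a key-person setup with one that spreads review across a team:

    # Anti-pattern: one named senior owns every review.
    # If @taylor takes a day off, every pull request stalls.
    *   @taylor

    # Broader: route reviews to a shared team, so review load
    # and system knowledge are spread across its members.
    *   @example-org/app-reviewers

The same idea applies to whatever review tooling you use: if removing one name from the configuration stops the pipeline, you've encoded key person risk into it.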

Impact of Seniority Bias

How do we start to overcome the impact of seniority bias? We start by understanding what that impact is: what could go wrong, and how it can affect our security. The first impact is burnout of key people. If people can't take breaks or leave, or they're responsible for every single request in your system, they're going to burn out. Burnout is a big security issue; we don't make good decisions when we're burnt out. There's also a high impact if team members leave the organization. We don't want this. If Sal or Taylor leave, we want to make sure that somebody else can pick up what they're doing really easily. It doesn't mean heavy documentation, but it does mean processes that share the knowledge equally across the team.

We can also see a stagnation of approaches, because those of us who've been around a bit longer are used to doing things one way or another, and it might be difficult for new voices or new approaches to be heard. Some of us are really lucky: we're here at a conference, expanding our range of skills and approaches. Not all seniors have that luxury. We might find ourselves stuck in the same patterns we've been using for years. Equally, knowledge sharing and cross-skilling across our teams will be really poor, because everything is going to the same person. If they're busy, they won't have time to share, and nobody else gets that experience naturally as part of their role. That impact is really high. Seniority bias means we're not getting new security approaches, and if that one person breaks under the strain, or takes leave, we have a massive impact on the team. Don't let the security of your application depend on whether one person is ok and doing well in their job today.

Tool Bias

Tool bias is our next bias. We'll go through each bias first, one after the other, and round out at the end with how we can overcome all of them. The quotes you might hear if you've got a tool bias are things like this: "We can only go faster if we buy this tool or system." We're at a conference. Many conferences have a tool bias in them. There are tools that have been talked about in these talks: "You should do this. You should try that." "This big company did this thing, and they used this tool, so they were able to do it faster." We are bombarded in every aspect of our engineering life with marketing material that tells us tools will solve our problems. As engineers, we also know that that's rarely actually the case. Tools are a great asset, and they can help us go faster. They also bring a real set of biases with them.

Symptoms of Tool Bias

Let's do our self-assessment. I want you to play along. Here are the symptoms of tool bias. You have a tool bias if you're spending thousands of dollars on tools and systems to integrate into your development lifecycle. Not every tool needs to cost you a lot of money. There are a great many amazing, free, open source tools out there. Not everyone needs to be spending that much money. Or, if they are spending it, there may be other things they could be spending on instead of tools. Do you have tools purchased but not properly implemented into your build pipeline? Maybe they were put in and then removed because they were causing you pain. Or maybe you got them and put them in a learning mode, but never got them fully installed. That's a tool bias. You've spent the time and focus because the tool will solve the problem, but you've not actually solved the problem. You've got halfway there and stopped. If there's no plan for maintaining, tuning, or configuring tools post-purchase, also known as salesperson-driven development, then you've got a tool bias. Your tool has not made you more secure. Your tool has given you the feeling of security, without the actual action.
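As a sketch of what "purchased but never properly implemented" can look like in practice, here's a hypothetical scan step in a GitHub Actions-style pipeline, where acme-scan is a placeholder for whatever scanner you've bought. Left in the first form, the tool runs forever in learning mode and can never block anything:

    # A hypothetical scan step left permanently in "learning mode":
    # continue-on-error means findings are reported but can never fail the build.
    - name: Security scan (learning mode)
      run: acme-scan --report findings.json   # acme-scan is a placeholder CLI
      continue-on-error: true

    # Finishing the job: once the tool is tuned, let serious findings
    # actually block the merge.
    - name: Security scan (enforced)
      run: acme-scan --fail-on high

The point isn't the specific syntax; it's that the second step is the part that takes ongoing tuning, and it's the part that tends never to happen.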

Impact of Tool Bias

What's the impact of this? Tools purchased a long time before they're required in the pipeline cost time and money, and they're a distraction. If you're not ready to run a full CI/CD pipeline, buying a vulnerability scanner or a software analysis tool that's going to sit in that pipeline is too early. You're going to have tens of thousands of dollars' worth of software that you're not using. That money could be used on other things, like ensuring the quality of your CI/CD pipeline, or test automation, or upskilling everyone to make sure that new style of working, that new development pattern, is well embedded.

Tools might be chosen without analysis of the impact on workflow. I'm a security person; we're guilty of this. We will say, "Development team, here is a tool, you should go use it." Often, we haven't considered the workflow properly when we suggest those tools. Those tool decisions need to be made between the software team and the security team to get this right. We have to analyze that impact as we go. Tools and processes also get offboarded out of development teams, creating a fractured approach. We see this a lot in larger organizations lucky enough to have big, fully resourced security teams, where tools can't go into the pipeline because they're going to break it or slow it down. What happens instead is they take the tool out and say, "The security team will run this tool and bring the results back in." This creates a really fractured approach, where not only do the tool outputs come back in at the wrong time in the flow, but the impact of that workload isn't really felt by the team it's been outsourced to.
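One way to avoid that fracture, sketched here assuming your scanner can emit SARIF output (acme-scan is still a placeholder name), is to keep the scan inside the team's own pipeline and publish the findings where developers already work, rather than shipping the run off to a separate security team:

    # Run the scan inside the team's own pipeline and surface the results
    # in the repository's code scanning view, in the developers' flow.
    - name: Security scan
      run: acme-scan --format sarif --output results.sarif   # placeholder scanner

    - name: Upload findings
      uses: github/codeql-action/upload-sarif@v3
      with:
        sarif_file: results.sarif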

I like to call this the gym membership effect. I've definitely been guilty of this myself. This is an organization believing it's solved its issue at purchase. If you, like me, have ever bought a gym membership because you wanted to get healthy, you know the feeling: you've bought your gym membership, you feel great, you're going to go. Buying the gym membership doesn't get you healthy; going to the gym does. A tool bias is the same deal. Buying the tool is not the improvement. Using the tool, refining your usage, and making it work across the whole team: that's where the impact is.

Recency Bias

The final bias we're going to talk about is recency bias. We're at a conference, so there's a slight irony in this one. In fact, people like myself who speak at conferences are guilty of propagating it, but we're going to talk about what it is and how it affects your security. Recency bias is the idea that we can only go faster and be more secure if we do the latest thing we read about: "I went to this conference talk and they said I should do a thing, so we should do that right now." We switch very rapidly between approaches. We bring in new tools and frameworks. We say, "The latest front-end framework is this, so we should use that instead." If we're always switching to the most recent thing we've heard, it has a massive impact on what we're able to achieve with security.

Symptoms of Recency Bias

Let's self-assess. Have you got a recency bias? Do you have lots of started but incomplete security initiatives? You've got a security backlog, but not a lot is getting completed on it. Are initiatives losing momentum after the initial kickoff, or failing to achieve measurable outcomes? That's because you're switching to new things all the time, the latest shiny project. If you've got that, then you want to be careful. Frequent focus shifting, and difficulty understanding the overall security approach, often points to or contributes to recency bias. If you've got that magpie approach, as we call it down here, chasing shiny things, then you've got a recency bias. In security, that's really dangerous. It will create a massive backlog of security initiatives that need to be finished, spend a lot of money, and won't actually achieve the application security outcomes that we need.

Impacts of Recency Bias

What are the impacts, then, of this recency bias? Time and money wasted on many started projects. We all feel good starting a project. It feels great: it's shiny and new and exciting. If we don't finish it, though, it's not worth the effort. We need to actually go on and complete these things. Starting gives the perception of working hard, but it doesn't achieve measurable aims. Depending on how performance review works in your organization, I've seen this before; we've all seen this before. If I start these new things, everyone looks at the start and gives me a gold star. Then I don't really have to finish, because everyone's forgotten. It doesn't work for security. If we don't finish, we don't stay secure.

It undervalues the complexity of some initiatives. If you're taking everything on and starting everything very quickly, with a fast focus-switching approach, you may miss the fact that some of these projects or initiatives are really quite hard. They need time and focus for you to get through them. It often leads to teams being overstretched as they try to take on too many things. Small security teams, as I've found in most parts of the world, are already overstretched. Even smaller application security teams don't have a lot of resources. They don't have a lot of energy for new initiatives. Pick very carefully, and don't switch constantly. Otherwise, they're going to have a real headache from context switching and loss of focus.

Overcoming Bias

How do we overcome this bias? We all have biases; we have many of them. A bias is a description, not a justification, though. Just because we have a bias, it doesn't mean that it's acceptable. I know that I have many biases in the way I approach the world. I know that not a lot of those biases are healthy, and not all of them make me more effective at my job. Your job is to spot which biases you have and how they're affecting the effectiveness of your application security projects.

Challenging Bias

How do we challenge them? Firstly, we challenge them with discipline. Overcoming innate behaviors doesn't happen by itself. It takes self-awareness and discipline. You need to look at yourself, answer our self-assessment questions, and go, "Yes, I have a problem. I'm going to do something about it." It takes focus. Hiring broadly, taking it slowly with tools, and doing less might feel like the hard way round. Nobody wants to stand up at a performance review and say, "I did less this year, but I did it really well." That feels like a really unnatural way to do things. I want you to do that. I want you to buy fewer tools. I want you to hire more broadly. I want you to do fewer projects this year. Make that your aim for the year. Do fewer projects, but do them at a higher level. Make them phenomenal. I want you to do this with consistency. It's really easy to fall back into old habits when we're pushed for time or stressed. That's when we need focus and discipline the most.

When we're stressed, we fall back on whatever behavior is easiest for us, the one where we don't have to think: the muscle memory. If your muscle memory is to favor a senior member of your team, or to just say, "Sorry, I'll just do it this time. Don't worry about it. It'll be quicker if I do it," that's the inconsistency. That's the discipline being overridden by stress. Every time, even when you're stressed and busy, I want you to think about this: am I using the broad skills in my team? Am I favoring seniors over juniors? Am I taking it too fast or too slow? Am I favoring a tool and thinking I've got benefits where I don't? Am I doing too much at once? Do less. Buy fewer tools. Use all of the skills and experience in your team, senior or not.

Takeaway

Fast-paced security isn't easy, and there's no quick fix. There are many talks about how to put security into your pipeline. There are very few people talking about how we get in our own way when it comes to securing our applications.

What I want you to take from this is that, aside from the tools and aside from those approaches, there are little things each of us can look at in the way we behave that could be impacting our application security. By keeping our own biases in check, and calling out the bias we see around us, we might be able to improve the situation.

If you have any questions, please join me in the chat, or please reach out to me either at hello@safestack.io, or @lady_nerd on Twitter.

 


 

Recorded at:

Feb 11, 2021
