Failing Fast: the Impact of Bias When Speeding up Application Security

Key Takeaways

  • When dealing with application/network security, people can make decisions intended to solve issues quickly, but those shortcuts can cost them in the long run.
  • There are three biases at play:
    • Seniority bias: a preference for senior or experienced engineers
    • Tool bias: a belief that buying tools will make us secure
    • Recency bias: a tendency to chase the latest technologies and initiatives
  • This article provides advice on how to spot whether one of these biases is at work and how to deal with it.

We can fail fast by overlooking our own bias when we're trying to speed up application security.

We want to go fast. We want it to be easy. Application security is no different: we want to make sure that when we're securing our applications, we're not getting in the way of that progress, of that speed. Some of the ways that we behave, and some of our innate biases, do get in the way. They create weaknesses and vulnerabilities where there shouldn't be any.

We're going to cover seniority bias, tool bias, and recency bias: how each of these can manifest inside a team, and the practices they create that can lead to vulnerabilities and security issues. I covered these in more detail in this recent talk I gave at QCon.

Seniority Bias

The first bias we're going to cover is seniority bias: the idea that those of us who are more senior, a little bit older, and have been doing this longer are somehow better at securing our applications than everyone else. This is the type of bias that manifests in sentences like this in your team: "We can only go faster if we hire experienced or senior engineers." It's the type of language you will see in high-growth companies that think everyone needs to be a 10x engineer, or a ninja. I hate those phrases, so I'm sharing my own bias here.

Symptoms of Seniority Bias

The symptoms of seniority bias are as follows:

  • Does your team only advertise roles for, or recruit, senior-level engineers?
  • Is it difficult for people to come into the engineering space in your organization, so that there's a lack of juniors or of people transitioning between technical specializations?

We also make assumptions around common sense. I've been guilty of this myself; you may have been too. We say it's common sense: if you're a senior engineer, of course you know how to do security and how to solve complex development challenges. The reality is, we don't all know those things. We all come from different backgrounds, and those backgrounds give us different sets of skills and experience. For some of us that will include security knowledge, but for others it definitely doesn't. Just because you've been doing this a long time doesn't mean you've learned the same things as everyone else along the way.

Impact of Seniority Bias

The first impact is burnout of key people. If people can't take breaks or leave, or they're responsible for every single request in your system, they're going to burn out. Burnout is a big security issue: we don't make good decisions when we're burnt out. There's also a high impact if team members leave the organization. We don't want this. If Sal or Taylor leaves, we want somebody else to be able to pick up what they were doing, really easily. That doesn't mean heavy documentation, but it does mean processes that share knowledge equally across the team.

We can also see a stagnation of approaches, because those of us who've been around a little bit longer are used to doing things in a particular way. It might be difficult for new voices or new approaches to be heard, and we might find ourselves stuck in the same patterns we've followed for years. Equally, knowledge sharing and cross-skilling across our teams will be really poor, because everything is going to the same person. If they're busy, they won't have time to share, and nobody else is getting that experience naturally as part of their role. That impact is really high. Seniority bias means we're not getting new security approaches, and if one person breaks under the strain, or takes leave, there can be a massive impact on the team. Don't let the security of your application depend on whether one person is OK and doing well in their job today.

Tool Bias

Tool bias is our next bias. The quotes you might hear if you've got a tool bias are things like, "We can only go faster if we buy this tool or system." In every aspect of our engineering life we are bombarded with marketing material that tells us tools will solve our problems. As engineers, we also know that's rarely the case. Tools are a great asset and can help us go faster, but they also come with a real set of problems of their own.

Symptoms of Tool Bias

You have a tool bias if you're spending thousands of dollars on tools and systems to integrate into your development lifecycle. Not every tool needs to cost you a lot of money; there are plenty of amazing free, open-source tools out there.

Do you have tools that were purchased but never properly implemented in your build pipeline? Maybe they were put in and then removed because they were causing you pain. Or maybe you put them in a learning mode but never got them fully rolled out. That's a tool bias: you've spent the time and focus believing the tool would solve the problem, but the problem hasn't actually been solved. You got halfway there and stopped. If there's no plan for maintaining, tuning, or configuring tools after purchase, also known as salesperson-driven development, then you've got a tool bias. Your tool has not made you more secure; it has given you the feeling of security without the actual action.
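To make that concrete, here is a minimal sketch of what "fully implemented" can look like, as opposed to a tool left in learning mode: a small CI step that reads a scanner's report and fails the build when findings cross an agreed threshold. The report file name, its JSON shape, and the severity labels are assumptions for illustration, not the output of any particular product.

    # gate_build.py - a sketch of letting scan results actually gate the pipeline
    import json
    import sys

    # Agree the blocking threshold with the team first, then enforce it consistently.
    BLOCKING_SEVERITIES = {"critical", "high"}

    def gate_build(report_path):
        """Return 1 (fail the job) if the report contains blocking findings, else 0."""
        with open(report_path) as handle:
            findings = json.load(handle)  # assumed shape: [{"id": "...", "severity": "..."}]
        blocking = [f for f in findings if f.get("severity", "").lower() in BLOCKING_SEVERITIES]
        for finding in blocking:
            print(f"BLOCKING: {finding.get('id', 'unknown')} ({finding['severity']})")
        return 1 if blocking else 0  # non-zero exit makes the CI step, and the pipeline, fail

    if __name__ == "__main__":
        report = sys.argv[1] if len(sys.argv) > 1 else "scan-report.json"
        sys.exit(gate_build(report))

The detail matters less than the design point: until the tool's output can stop a build, it is advisory at best, and the feeling of security it gives is not backed by action.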

Impact of Tool Bias

What's the impact of this? Tools purchased long before they're required in the pipeline cost time and money, and they're a distraction. If you're not ready to run a full CI/CD pipeline, buying a vulnerability scanner or a software analysis tool that will sit in that pipeline is premature. You'll have tens of thousands of dollars' worth of software that you're not using. That money could be spent on other things: improving the quality of your CI/CD pipeline, test automation, or upskilling everyone so that the new style of working, this new development pattern, is well embedded.

Recency Bias

The final bias we're going to address is recency bias: the idea that we can only go faster and be more secure if we do the latest thing we read about. "I went to this conference talk and they said I should do a thing, so we should do that right now." We switch very rapidly between approaches, bringing in new tools and frameworks. If we're always switching to the most recent thing we've heard, it has a massive impact on what we're able to achieve with security.

Symptoms of Recency Bias

Have you got a recency bias? Look for lots of started but incomplete security initiatives. You've got a security backlog, but not much on it is getting completed. Initiatives lose momentum after the initial kickoff, or fail to achieve measurable outcomes, because you keep switching to new things, the latest shiny project. If you've got that, you want to be careful. Frequent focus shifting and difficulty understanding the overall security approach also go hand in hand with recency bias.

Impact of Recency Bias

What is the impact of this recency bias? Time and money are wasted on many started projects. We all feel good starting a project; it's shiny and new and exciting. But if we don't finish it, the effort isn't worth it. It gives the perception of working hard without achieving measurable aims. We need to actually follow through and complete these things.

It also undervalues the complexity of some initiatives. If you're taking everything on and starting everything very quickly, with a fast focus-switching approach, you may miss the fact that some of these projects or initiatives are really quite hard. They need time and focus to get through. It often leads to teams being overstretched as they try to take on too many things.

How do we overcome this? We all have a bias; in fact, we have many biases. That's a description, not a justification. Just because we have biases doesn't mean acting on them is acceptable. Your job is to spot which biases you have and how they're affecting the effectiveness of your application security projects.

Challenging Biases

How do we challenge them? Firstly, with discipline. Overcoming innate behaviors doesn't happen by itself; it takes self-awareness and discipline. You need to look at yourself, answer the self-assessment questions above, and say, "Yes, I have a problem. I'm going to do something about it." It takes focus. Hiring broadly, taking it slowly with tools, and doing less might feel like the hard way around. Nobody wants to stand up at a performance review and say, "I did less this year, but I did it really well." That feels like an unnatural way to do things. We need to buy fewer tools. We need to hire more broadly. We need to do fewer projects this year. Make that your aim: do fewer projects, but do them at a higher level. Make them phenomenal. And do this with consistency. It's really easy to fall back into old habits when we're pushed for time or stressed, and that's exactly when we need focus and discipline the most.

When we are stressed, we fall back on whatever behavior is easiest for us, the things we don't have to think about, the muscle memory. So ask yourself: am I using the broad skills in my team? Am I favoring seniors over juniors? Am I taking it too fast or too slow? Am I favoring a tool and assuming benefits I don't actually have? Am I doing too much at once? Do less. Buy fewer tools. Use all of the skills and experience in your team, whether people are senior or not.

Takeaway

Fast-paced security isn't easy, and there's no quick fix. There are many talks about how to put security into your pipeline, but very few people talk about how we get in our own way when it comes to securing our applications. What we can take from this is that, aside from the tools and those approaches, there are small things each of us can look at in the way we behave that could be affecting our application security. By keeping our own biases in check and calling out the biases we see around us, we might be able to improve the situation.

About the Author

Laura Bell has almost a decade of experience in software development and information security, specializing in bringing security practices and culture into organizations. Laura is co-author of Agile Application Security from O’Reilly Media, a member of the board for non-profit Hackers Helping Hackers and a program committee member for the O’Reilly Security Conference in New York.


 
