Security Culture: Why You Need One and How to Create It

Summary

Masha Sedova talks about how to measure an organization's current security culture and how to define where to go. She looks into techniques and case studies of how to begin to shape an organization's security culture to become more resilient and enable people-powered security.

Bio

Masha Sedova is co-founder of Elevate Security, delivering the first people-centric security platform that leverages behavioral science to transform employees into security superhumans. Before Elevate, she was a security executive at Salesforce, where she built and led the security engagement team focused on improving the security mindset of employees, partners, and customers.

About the conference

Software is changing the world. QCon empowers software development by facilitating the spread of knowledge and innovation in the developer community. A practitioner-driven conference, QCon is designed for technical team leads, architects, engineering directors, and project managers who influence innovation in their teams.

Transcript

Sedova: To begin, I'd like to talk just a little bit about my background. I have a background in cybersecurity. After school, I initially studied forensics, and then worked in the defense contracting world as a cyber analyst. I'm originally from Russia, and I had the chance to work on the Russian cyber threat, which was both fascinating and terrifying. I did that for several years before getting the chance to start a program at Salesforce, in 2012, called the security engagement program. Initially, it started as a cyber-threat program, then gradually evolved to cover the company's entire security focus on what's also known as the human element.

That meant getting employees at Salesforce to want to do security, not have to, and getting behaviors like phishing, pharming, or malware incidents reduced. Then, focusing on secure development practices, or finding and fixing bugs. Then, [inaudible 00:01:14] Salesforce employees [inaudible 00:01:14] customer [inaudible 00:01:17] around security-feature adoption like 2FA or [inaudible 00:01:20] descriptions.

What I found when I stepped into this role is that, when I started this work, they said, "Masha [Sedova], it's your responsibility to get people to make fewer security mistakes." I said, "Great, what tools do I have at my disposal?" I was given some animated PowerPoints and some newsletters. This is what we've got. [inaudible 00:01:48].

I set out on a journey [inaudible 00:01:56]. What I [inaudible 00:01:59] discovered [inaudible 00:02:01] behavioral science. Behavioral science studies how [inaudible 00:02:04] we make decisions. I have [inaudible 00:02:07] since then been stitching the two fields together. My [inaudible 00:02:11] security [inaudible 00:02:17] focused on delivering a platform [inaudible 00:02:20]. [inaudible 00:02:22] security [inaudible 00:02:24] existing datasets. [inaudible 00:02:27] personalized reputations for every employee based exactly on what they need to know. That's [inaudible 00:02:38] a little bit of how I got here today.

Customer Trust Is Built on Security

If we just take a step back and ask, "Why do we even care about the conversation of security?" - security is one of the foundations of trust. No matter what companies we work for, we have some type of customer, someone that keeps [inaudible 00:02:59] that we serve. Customers need trust in order to make this transaction functional, so an effective and successful company has a level of trust established with its customers.

This is an interesting report that Salesforce put out a couple of months ago. They found that customers who trust a company are willing to buy more products, use them more frequently, and spend more money. Basically, trust equals better business, no matter what kind of industry we're in. At the end of the day, when we're talking about building security in our companies, we're talking about building trust with our customers. Even if we look at ourselves and our personal spending habits: how many of us would choose to give our credit-card data to a company that's regularly getting hacked, or has made poor architectural choices, where we don't trust it with our personal information? We don't. Or most of the time we don't.

This is the foundation of why we're even having this conversation. When we think about building security in our organizations, that may mean different things to each and every one of you. It could mean better architectural choices, better products, better threat modeling, better processes, better reporting. At the end of the day, it's the foundation of how we make security decisions in our organization.

As you can see, that's not the reality for many organizations in our society. It depends on what day you open up your newspaper: everything from personal mistakes around misconfigurations to failures to patch. Even the Verizon Data Breach Report, which is a compilation of the known hacks in a calendar year - why they happened, to whom they happened, and who the perpetrators were - said something like 52% of breaches involved an incident of hacking, so poorly-executed or poorly-designed code. We clearly see that this is not something we've figured out yet as an industry.

What do the companies that don't make the headlines have in common? The companies that are most trusted by customers and regularly do the right thing, the ones who don't make the headlines, have a culture of positive security that underlies them all. Before I get into defining what in the world I mean by positive security culture, I want to take a huge step back. Let's just talk about culture so we're all on the same page.

What Is Culture?

Culture is something that is incredibly difficult to define. Whenever you leave one culture and enter another, you know you've switched. It's like the water that we swim in all the time - you can't really see it, but you know when you're no longer in the same ocean. It shows up in the way we talk, the way we interact, the clothes we wear, and the norms. It's defined as, "The way we do things around here."

I really like this iceberg model, because culture has so many facets and components to it. There are the things that we observe on an ongoing basis, the things we do. These are the behavioral artifacts. Then, underneath them, the reasons we do these things are based on our beliefs, our mindsets, our values, and our assumptions.

I'm going to take a concrete case here. DuPont is a chemical manufacturing firm, and they, above all, value safety. The way that you can observe their safety culture is, every time you get into a car in their parking lot, you have to buckle up. If you get caught driving in their parking lot without a seat belt, they will fine you an absurd amount, even if you're below the speed limit. No one in a car will start the car until everyone's buckled up. The reason they have this is because, in their past, they have had terrible experiences where safety was violated; it cost people's lives, maybe many decades ago. It has become a core value that has been passed on through the company: "Ok, we prioritize people's lives. Accidents do happen, seat belts save lives." So the things that we observe are these behavioral artifacts of people buckling up.

If we take that into the world of security: if we work for an organization where people are repeatedly clicking on malicious links, checking in code that hasn't been tested, or ignoring security policies that get put out, it's because there is a set of beliefs, values, and assumptions underneath: "Security is not important, doesn't apply to me, will likely not happen here. If it does happen, it's not my job, it's somebody else's job." There are some underlying root-cause beliefs here that are preventing people from doing positive security behaviors, because they don't actually think it's a problem.

One of the things that is really interesting, and that I've observed in my career, is the element of experiences. For those of you who've ever been at a company that experienced a breach: after the breach, did you find that you valued security in your own work more? Raise your hand if that's the case. 80% of you. I've seen that myself in working with executives - companies whose leaders have had to survive a security breach before get it way sooner than companies who say, "That's not actually a thing, that's not going to happen."

Further, short of actually having a company go through a breach to really prioritize security, things like simulated attacks - red-team attacks, especially really well-simulated ones - tend to get people really scared. The reason why we even talk about culture in the first place is summarized most eloquently by this quote by Peter Drucker: "Culture eats strategy for breakfast." We may have the best-laid plans around how we're going to do our code reviews and how effective our ship process is going to be, but what actually happens is entirely determined by the culture that is set in our organization.

Security Culture Is A Subset of Enterprise Culture

The security culture that exists in our organizations is a subset of the enterprise culture that we belong in.

I want to give you an example of how I once tried to create a security culture that was a mismatch with the enterprise culture, and why there was complete organ rejection, so to speak. When I first started at Salesforce, I began building an insider-threat program. It was designed to look at people who had access to sensitive information and were potentially traveling to high-risk countries. As you can imagine, it had a certain level of investigations associated with it. When I went to ask for permission from the developer org - I said, "Developers tend to have the greatest amount of access. Can we work together on this?" - they were almost hostile to me. They said, "What our employees do in their off hours is completely off limits and you do not have the right to look into this." Which is really interesting, because my boss, the head of security at Salesforce at the time, was, "You should start this program, this is a great idea, this is a threat."

What ended up happening is that the culture of Salesforce at the time, and still today, has a term they use for themselves: Ohana, Aloha. They wore Hawaiian T-shirts every Friday; it was a very family-friendly kind of environment. The concept of investigations was so foreign that the program was pretty much outright rejected by the organization as a whole. There's a mismatch when what the security team wants to do doesn't align with the enterprise culture as a whole.

Positive vs Negative Security Culture

What I want to talk about now is going back to the concept of what positive security culture looks like, from the perspective of successful companies. Positive security culture - does anyone have any thoughts? What are the attributes? Give me one or two words that come to mind when you think about a company that has a positive security culture. Just shout it out.

Participant 1: No blaming.

Sedova: No blaming, yes. Ok to make mistakes.

Participant 2: Reporting phishing emails.

Sedova: Reporting - just being proactive, yes. How about a negative security culture? What are some things that you've seen happen in negative security cultures?

Participant 3: It's not my problem.

Participant 4: Shoot the messenger.

Participant 5: It will never happen to us.

Participant 6: Follow the links.

Sedova: It sounds like you guys have some experience with negative security cultures. Let's take a look at why that happens. How many of you are familiar with the "you only get to pick two out of three" idea here? The one on the left is a little bit more familiar - you can have money and time, you can have money and children, but you cannot have all three. Anyone who's a parent probably finds that extra funny. When we think about it from a purchasing perspective, same thing: price, quality, or service. You can't have all three at the same time. You have to pick two. What's happening here? We have to choose our priorities, and often in an organization, it boils down to, "You can't have all three things." You can't have speed, growth, and security at the same time. What are you going to choose as a company?

I want to tell you the story of Clara. Clara is a developer in an organization named Acme. Clara has to ship code. She has about 10 days' worth of work to do that involves writing her code, writing all the test cases, and making sure that it is checked for security quality. The problem is, Clara only has 7 days on the calendar. What is she going to do? Is she going to ship her code on time but poorly tested? Or is she going to ship her code late but fully tested and buttoned up from a security perspective? What she's going to do is entirely dependent on the culture that she resides in. How does she know what culture she resides in?

Participant 7: What's rewarded?

Sedova: More importantly, what she's going to get punished for. It's, "If I ship late, am I going to get fired? Or if I ship insecure code, am I going to get fired?" This gets repeated to Clara 1,000 times a day in really small micro-incidents across the organization. She sees it as it happens to her fellow employees, she hears it on the all-hands call, she sees it in the newsletters that come out from the organization - what's rewarded and reinforced in her organization. If things like deadlines, costs, and her bonus are tied to what she does, and security is a choice that she has to make on top of that, she will always forego security, because no one has ample time. What we see here is this decision that Clara has to make in that moment. What happens? Let's say she works in an organization that has a negative security culture. I define "negative" here as one that does not prioritize security in a positive light. It's a nice-to-have, not a must-have. Security is not punished when it doesn't happen, nor rewarded when it does.

She makes this decision and she ships this code, and nothing happens. Phew, she got away with it. The thing is, Clara is starting at the very bottom of this triangle. When she starts, nothing happens, life's ok. But what she's done is introduce a small amount of security risk to her organization, just a little bit of security debt. Now there's an unpatched server out there somewhere. Or there's a piece of code out there that's vulnerable to cross-site scripting. That's ok - no one's found it yet. She makes the next decision, and the next decision. Over time, it collectively keeps accumulating. Obviously, Clara is not the only person in her organization; this happens across every developer every day.

What slowly begins to happen is that the security debt of the company aggregates, to the point when a breach happens, when they become a headline, when one of these vulnerabilities gets found. All of a sudden, we say, "We should have focused on patching." What you should have focused on, actually, is this tension: every time Clara makes a decision, what is she being reinforced for? This is a concept called "security drift." It's not that the one time you make a decision, you're screwed; it is the aggregate of the thousand decisions that get made in our organizations every day.

The Competing Security Cultures Framework

When I say, "Security culture," and say, "We have a positive security culture," what I think in my mind is security culture and what you think in your mind is security culture might actually be two very different things. The reason that is, is because all of our companies actually prioritize very different things, as far as it relates to accomplishing security goals. There are some basics which involve that we have to patch and we have to not get hit by phishing attacks but the underlying reason why that happens are actually different, depending on the organizations we work in.

This is a really awesome tool called the Competing Security Cultures Framework. It's a bit of a mouthful, but there is a free survey, an assessment tool, that's available; when I share out this deck, I'll also share the link to it. It is a set of 10 questions that you can use to assess your own culture. What I want to show you in this mapping is that, on the X-axis, there is internal focus versus external focus - who are you beholden to? - and on the other axis, whether you have tight control or loose control over your organization. In the upper left-hand corner, we have process cultures. Often, these are companies or organizations that are very process-oriented; government falls into this a lot. Their cardinal directive is that they really want to enforce policy. I succeed when, "Here's the policy that I laid out and you follow it." "I said no phishing - how many times did you violate that phishing policy?"

Then, as we move over to the right-hand side, we have compliance cultures. Organizations that are big on compliance culture tend to be in heavily regulated industries, like pharmaceuticals or insurance. Their goal is passing audits, because people are all up in their business all the time, so they have to meet those checkmarks, and that's really important to them.

Moving to the bottom right, we have autonomy cultures. These are companies that are often smaller and more agile, partially because they're less resourced and partially because they have less overhead. Their cardinal directive is, "Just get stuff done. Just get results." It doesn't really matter how, or how the policy gets managed; it's a "Just patch the thing, ok?" kind of thing.

Then we move to trust cultures, and a lot of tech companies tend to fall in the trust-culture quadrant. The goal here is to empower people. Taking a slightly deeper look into what this looks like: there is no wrong way to lean in this quadrant map, but you can see that every culture has slightly different things that it values and prioritizes. If you ask, "What's more right? What is a good culture and what is a bad culture?" I'll say: we all need to pass audits, and it'd be great if we could enforce policy; getting results sounds great, as does empowering people. I would like a little bit of each, please - and that's totally ok. It's important, just from this exercise, to acknowledge where you are and whether that is actually where you want to be.

Results of SCDS

What happens at the end of this assessment, this 10-question survey, is that you often get a diagram like this showing the distribution of scores. You can see that, for this example, the company ABLE is heavily in the compliance quadrant and very light in the trust quadrant. If I were to go into this organization, what I would likely find is that people do not particularly feel empowered to make decisions; they feel like security is there to enforce policy and audits. They do not feel like they are part of the story of detecting attacks. That is what I would define as leaning towards a negative security culture from an end-user perspective: "Someone is telling me what to do, and if I don't do it, I'm going to get in trouble" kind of thing.

The bottom two, trust and autonomy, tend to lean a little bit more into the human element and helping people feel like they're part of the security story.
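If you want to picture how a 10-question survey could turn into that quadrant diagram, here is a minimal sketch under stated assumptions: the question-to-quadrant mapping, the 1-5 agreement scale, and the averaging are all hypothetical illustrations, not the actual survey's methodology.

```python
# Hypothetical illustration of scoring a culture survey into the four
# quadrants (process, compliance, autonomy, trust). The real assessment
# has its own questions and scoring; this only shows the shape of the
# output: a distribution of scores like the one described for ABLE.
from statistics import mean

# Assumed mapping of each question (answered on a 1-5 agreement scale)
# to the quadrant it probes.
QUESTION_QUADRANT = {
    "q1": "process", "q2": "process", "q3": "process",
    "q4": "compliance", "q5": "compliance",
    "q6": "autonomy", "q7": "autonomy",
    "q8": "trust", "q9": "trust", "q10": "trust",
}

def quadrant_scores(answers):
    """Average the 1-5 answers per quadrant to get a culture profile."""
    buckets = {}
    for question, value in answers.items():
        buckets.setdefault(QUESTION_QUADRANT[question], []).append(value)
    return {quadrant: round(mean(values), 2) for quadrant, values in buckets.items()}

# Invented answers for a company like "ABLE": heavy compliance, light trust.
able = {"q1": 3, "q2": 3, "q3": 3, "q4": 5, "q5": 5,
        "q6": 3, "q7": 2, "q8": 1, "q9": 2, "q10": 1}
print(quadrant_scores(able))
# -> {'process': 3, 'compliance': 5, 'autonomy': 2.5, 'trust': 1.33}
```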

Root Cause Analysis

That's a little bit about cultures and positive cultures. Now I want to look at how we drive change. Let's say I'm heavily into compliance culture. Or I have realized, "I know that I have a negative security culture" - people are saying, "That's not my job. I don't want to engage with security." What are the tools that we have at our disposal to actually start influencing that change?

How many of you have ever heard of the five whys? Can somebody tell me what the five whys are?

Participant 8: It's a way of diving deeper and deeper into a problem until you get to the bottom. You keep on iterating, "Why did this happen? Why did you do that? What made you do that?" until you get to the bottom of it.

Sedova: Right. I think you actually used every word on this slide. My favorite example of how this has been used in a business setting was what GM did with their OnStar program. What they found was a lot of people coming in and saying, "My alternator isn't functioning." They asked, "Why?" "It was broken." "Why?" "It was beyond its useful life." "Why?" "I haven't really been maintaining it." "Why?" "Because I had no idea that was supposed to be done." If they had just started fixing the first problem, "My alternator isn't functioning," they would be selling alternators. What they realized they actually needed to do was implement a service strategy that got people to start maintaining their cars. What they ultimately launched was a service that notified people of regular maintenance schedules via an app, and it ended up being a 2-billion-dollar business for them, which is a significant change from just selling more alternators.

Investigate Root Cause

In my experience of having done root-cause analysis for security problem sets - for secure development, but also general security best practices, from phishing, to reporting, to malware - I have found that the root causes usually tend to be one of these five key answers. One, "I didn't even realize security was part of my job," like, "Isn't there a security team somewhere that gets paid 9 to 5 to do it? Why is it my job?" The second is, "I didn't know what to do about it. Yes, I know it's a problem, but what am I supposed to do about this?" Then, "I didn't have the resources or support. Yes, I understand that we're shipping insecure code, and I know how to write secure code, but I don't actually have time in my day, and my boss does not let me make the time to actually do that behavior." The last one is, "I just didn't want to." This is the part that breaks my heart. There is an absolute subset of people who are, "You're not going to make me do security, that's just not my thing."

The first thing is, even if any of the bottom four are true: if you can solve a behavior with technology, you should absolutely do it, because human beings are sometimes messy and very dynamic, and getting them to care about something on an ongoing basis is tricky. If something is happening and it can have a technology solution, first and foremost, please solve it with technology.

One of my favorite use cases here is driving secure password practices: please don't share passwords across sites, for example, or please have long, non-reused passwords across everything you use, and never give them up.

Some of these things are actually impossible for us to do, and we're going to dive into that in a little bit. If you can use something like a password manager, you should just use a password manager; that is a way better solution than trying to get people to care about password practices.

The other categories have different approaches. "I didn't realize it was part of my job" - this is where awareness campaigns come in; this is where you absolutely should do communications. "I didn't know what to do about it" - skills and training. I actually see this approach being heavily overused. People often think that secure coding isn't happening because people don't have the appropriate skills, when in fact they don't have the management support or resources, or maybe don't understand it's part of their role - they think it's somebody else's. Then, "I didn't want to." For a subset of employees, I've seen that it's possible to motivate change through gamification, which we're going to get into in a little bit.

Behavior Change

Now, diving into behavior change. Let's look at the component of "I didn't want to do security," or "My organization is not making it worth my time to do it." What are the tools that we have at our disposal to get people to want to do security? This can be applied at an executive level, to drive alignment for managers, or at an individual-contributor level on a daily basis. At a high level, the way that any of us does something differently tomorrow than we do today is that there are three components that change our habits, and all three have to exist at the exact same time in order for us to do something differently. We need to be motivated to do that thing. We need to have the ability to do it. And we need to have a trigger, something that reminds us to do it.

Let's say I want to go running - I want to run a marathon in a year. First of all, I need a motivation. Am I doing it to lose weight? Am I trying to be healthier? Am I trying to compete? Did I lose a bet? Whatever it is, I need to have a motivation to actually want to start running. Then, I need to have the ability: I need to have running shoes and two working legs. No matter how much I care, if I have a broken leg, I'm not running that marathon. The last is the trigger, something that reminds me, whether it's a calendar invite, or a routine part of my day where, after my lunch break, I go do this, or a friend who rings my doorbell and says, "It's time to get off the couch, you've got to go for a run with me." If those three things do not exist at the same time, I'm not doing that behavior.

The way these all interplay is this awesome relationship where motivation and ability are inversely related to each other. The harder something is, the more motivation is required for us to do it. The easier something is, the less we have to care about it.

Back to the conversation about technology. If technology makes something super easy, I don't actually have to care about it; my organization doesn't have to value it or prioritize it if it is super trivial and built into the flow. If something is really hard, I really have to care about it in order to do it. This trigger line is what happens when I get reminded about that thing, whether it's time to do the code check or a reminder that it's time to threat model your feature, for example. If the right combination of motivation and ability does not exist in that moment, then that behavior will fail.

I find this really fascinating; I've actually applied this to a whole bunch of other things in my life, including practicing musical instruments, which is really hard. This is true for all of our habits and all of our behaviors. If you want to read more about this, it's based on the work of a professor out of Stanford University named Dr. BJ Fogg.
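As a rough sketch of that interplay - with an invented numeric scale and threshold, since the talk (and Fogg's model) describe it qualitatively - the "action line" idea looks something like this:

```python
# Minimal sketch of the behavior model described above: a behavior
# happens at a trigger only if motivation and ability together clear
# the "action line". The scales and threshold here are illustrative
# assumptions, not part of the published model.

def behavior_occurs(motivation, ability, trigger, action_line=1.0):
    """Motivation and ability trade off inversely: a hard task (low
    ability) demands high motivation; an easy task needs almost none."""
    if not trigger:
        return False  # no reminder, no behavior, no matter how motivated
    return motivation * ability >= action_line

# A hard behavior (remember unique passwords everywhere) fails for a
# mildly motivated person...
print(behavior_occurs(motivation=0.4, ability=0.5, trigger=True))  # False
# ...while an easy one (click a built-in "report phish" button) succeeds.
print(behavior_occurs(motivation=0.4, ability=4.0, trigger=True))  # True
```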

I'm just going to spend a little bit of time - we hit upon this a little bit already - on making a behavior easy to do, specifically security behaviors. One of the behaviors I want my organization to do is have a secure password for every site. The hard way is getting everybody to remember them; that's actually really cognitively challenging. The easy way: "Please just use a password manager." Then I don't have to get people to care about it or be motivated about it.

Reporting suspicious activity - we talked about that as being a symbol or an indication of positive security culture. The hard way to do this is making people remember: "How do I forward this, and to whom?" "What's the right email address?" "Do I include the header or not?" "Do they actually want the attachment?" A thousand iterations. The easy way is installing a "report" button in your email, so you can hit a button and it forwards it.

Tailgating is another one, if physical security is part of your security-culture priorities. The hard way to address it is to actually start driving social accountability in your organization: make sure that everyone behind you has a badge, and stop them and say, "Do you work here?" That's a really uncomfortable experience for most employees. An easy way is to install physical devices - something called a man-trap, I learned in my career. If you can do that, it just forces the behavior naturally.

When I was at Salesforce, we actually didn't own the buildings that we worked in, we just leased them, so we didn't have the physical ability to install technical devices like this. Because of that, I didn't have the choice of the easy solution; I had to lean into the hard component of it and say, "Ok, I have to get people to actually buy in to stopping any tailgaters behind them. How in the world do I do that?"

This is where we move into the category where, when something is really hard to do, we have to get people to buy into it. This is when we go back to the story of Clara; this is where Clara makes the decision. She's asking, "The thing you're asking me to do - unless it's super trivial and doesn't take time - if it does take time, if it does take my resources, what makes it worthwhile for me to actually do that behavior?"

When I talk to primarily security audiences - and I feel like this is a really important slide - this might be an obvious thing for this audience, but it's important to realize that the majority of employees in a population will not appreciate security to the extent that the security team appreciates security. I know you're all laughing, but, for security people, that is a new concept. We strongly believe that everyone needs to love security about as much as we do, and that you'd have a secure mindset if you understood everything that we understand about security. The reason that our companies are successful is because salespeople are great at selling, marketing people are great at marketing, and security people are great at security - everyone should be great at their own expertise. It's unrealistic to expect someone in sales or in marketing to be great at security.

The biggest aha I've had in my career in the last couple of years was realizing that I can't get somebody to love security the way I love security. They will just never be motivated about the outcome of secure code the way that I am motivated about it. What I can do is tie it into something they already care about, because, as human beings, we're already wired with motivations that drive us. If we can tie security to the things that motivate us, we can get the desired outcomes that we're looking for from the security perspective.

What Motivates Us?

There are two buckets of things that motivate us as human beings: intrinsic rewards and extrinsic rewards. Extrinsic rewards are things external to us, like money, fear of punishment, or prizes. In his book "Drive," Daniel Pink describes something he calls "Motivation 2.0": for a long time, there was a belief that we're just lazy couch potatoes, and if we don't either get beaten with a stick or have a large monetary reward at the other end of it, we just lie there and do nothing. Maybe that worked for the industrial era, and it got us this far, but that's not really how we're motivated these days, especially as knowledge workers. This moves us into a concept he calls "Motivation 3.0," which is tapping into our intrinsic motivation: doing something for the love of it, because it gives us a sense of purpose and mastery, and we have feedback and autonomy - which, as a side note, are some of the reasons why we love to game. Great games all tie into those motivations, because they tie into something deeper in us, not necessarily because we're paid to play.

Extrinsic motivations tend to be less impactful and shorter-lived than intrinsic motivations: doing something because I care about it, because I believe in it, because the outcome is important to me.

Here are some examples of intrinsic motivators that I've seen and applied in my career. Status is something like leaderboards, or letting people show that they are the top performer in a specific category. When we talk about, let's say, secure development practices, the thing we really want to drive - this is just an example, one I've seen in my career - is making sure that all features that require security approval, like really critical features, go through threat modeling. You can have status for streaks: organizations that have perfect threat-modeling records, for example. All of a sudden, as an employee who's being asked to do this, I know what happens when I don't do it - I get punished. When I do do it, I get a certain element of recognition - in this case, being on a leaderboard.

We can move into competition. Competition shows up in things like bug bounties or capture-the-flags, which take a behavior we would really like - finding and removing bugs in our code - and make it more fun. Some people just love the game of it; some people like to win, but the joy of the game itself is great.

The next is altruism. The story that I'd really like to share here is a study that the Red Cross did. The Red Cross really wanted to drive up the number of donors giving blood. What they first started doing was saying, "We'll pay you $50 if you come give us blood." What they found is that it barely worked. They were like, "Maybe we're not paying them enough money." Then they said, "Ok, $100. $100 has got to be it." It pretty much didn't move the needle at all. They were, "We can't really give people that much more money, we're going to be losing money on blood drives. What if we just took the money out of the equation altogether and leaned into altruism?" They ran this campaign that said, "If you'd really like to lend a hand, lend an arm." Their numbers skyrocketed. What they found was that people really wanted to feel good about themselves, and donating blood was a way to feel like they're making an impact. Giving money in exchange for that actually cheapened the experience for them.

In my personal experience, I've found that when paying people to do things like bug bounties, unless you're paying a significant amount of money - unless there's a million dollars on the table - tying into any one of these other motivators is actually way more impactful.

The next thing is access. In my career, I've seen access show up as lock-picking classes that only have 10 seats: only people who have reached a specific level of achievement, for doing something that, again, is a positive behavior in our organization, get access. The airline industry has figured out access as a motivator to a tee. How many of us have routed our flights through airports so we can board a little bit earlier, on the red carpet versus the blue one, or have specific access to a lounge? It's a level of privilege and status, but they've really figured out access as a motivator.

The last one is achievement. Achievement can show up as everything from a certificate, to a kudos email from management, to an all-hands shout-out by an executive. Think about this as driving the behaviors we really want to see in our organization, whether it is passing our audits, or meeting our policies, or getting our results [inaudible 00:39:58] patching. You can show who's been achieving this from an executive perspective.

The Power of Social Proof

Another tool that I've used in my career is one called social proof. Social proof is used on us all the time, from the reason we buy things on Amazon - "A five-star review, it must be great and I should probably buy that" - to celebrity social proof: "Someone I know and admire, maybe not Kourtney Kardashian, but someone who's a role model, is clearly enjoying using this lipstick, so I must enjoy it too." There's also the psychology of "10 other people are looking at the same room as you on this hotel site." Same concept: "This must be desirable."

I've seen this used really interestingly and effectively in security, in a research paper from the Georgia Institute of Technology by Sauvik Das, who partnered with Facebook. He found that, with a control statement that said, "You can keep your accounts safe, please install 2FA," a subset of people turned on 2FA. When he said, "Did you know that 108 of your friends have also turned on 2FA?" it was 1.36 times more effective at getting people to adopt that security feature.

What we've done, in the work that we do today, is take a specific security behavior - like compromise rates from phishing, or malware-infection rates, or secure-coding practices - and compare every person's individual performance to their team, to their company, and to a global benchmark if we have it. If they need to improve, we tell them, "Did you know you're actually 1.2 times more likely to be compromised than people in your department?" We have found this to be incredibly effective at shifting behavior. Some people like to compete; some people just don't like to lose.
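The arithmetic behind that message is simple enough to sketch. Everything here - the names, rates, and wording - is a hypothetical illustration, not the actual platform's logic:

```python
# Hypothetical sketch of a social-comparison nudge: phrase an individual's
# phishing-compromise rate as a relative likelihood against their team's.

def comparison_message(name, personal_rate, team_rate):
    """Return a nudge comparing a person's compromise rate to the team's."""
    if team_rate <= 0 or personal_rate <= team_rate:
        return f"{name}, you're doing better than your department. Keep it up!"
    ratio = personal_rate / team_rate  # e.g. 0.06 / 0.05 = 1.2x
    return (f"{name}, did you know you're {ratio:.1f} times more likely "
            f"to be compromised than people in your department?")

# A 6% personal compromise rate vs. a 5% team rate -> "1.2 times more likely"
print(comparison_message("Clara", personal_rate=0.06, team_rate=0.05))
```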

We also use things like celebrity social proof. This was a partnership we had with Autodesk, where we were trying to drive password-manager adoption. The CEO of Autodesk uses a password manager, and we were able to call that out and say, "Did you know that your CEO uses a password manager?" Using communications like this, we saw a 26% increase in adoption of a password manager in 48 hours across their global population.

We've also applied gamification to our communications for driving security behaviors. You can see, on the left-hand side, there's an element of status - "You're just at this level, wouldn't it be great if you were up to Indestructible?" - and then we give people clear recommendations on how they can improve their status. Again, we use elements of badges, which are achievements. Some people don't really care about badges, and I think, by themselves, badges can fall flat. But badges work as a form of currency that can tie into things like access: five badges get you access to a lock-picking class, or - I've seen this at some government agencies that struggle with parking - access to the VIP parking spot. Or cupcakes for your team, something that you potentially can't just buy on your own. Achievement as a currency.

Then there's competition: comparisons to people who are like you, showing who's doing just a little bit better than you and whom you're doing a little bit better than, to drive incentives and behaviors in this space.

Takeaways

Wrapping it up, what I would like to leave you with is a set of tools and the understanding that, first and foremost, each of us has a security culture. Every one of our employees leans into making secure decisions, or backs away, based on the things that get rewarded and punished in our cultures. It would be wise, if this is a conversation that is of interest to you, to first understand why people aren't doing it. Ask, "You didn't ship code on time. Why?" all the way down to the place where you realize people don't care, or they don't have the tools, or they don't have management alignment or the resources, or they're just not recognized for doing so. Hopefully, through this presentation, you will have seen a variety of tools you can now add to your tool belt to start improving the culture, and the ongoing nudges that you have for each and every one of your employees, so that none of us will ever become a headline. Hopefully.

 


Recorded at: Jan 02, 2020
