
Cloud DevSecOps in Practice: People, Processes and Tools


Summary

The panelists discuss how to get the right security, DevOps, and cloud engineering stakeholders together to build a realistic DevSecOps strategy.

Bio

Pierre Vincent is Head of SRE @weareglofox. Stefania Chaplin is CEO @DevStefOps. Barak Schoster is Chief Architect @Bridgecrew. Omkar Hiremath is Cybersecurity Team Lead @Gradient Cyber. Renato Losio is Principal Cloud Architect @funambol.

About the conference

InfoQ Live is a virtual event designed for you, the modern software practitioner. Take part in facilitated sessions with world-class practitioners. Hear from software leaders at our optional InfoQ Roundtables.

Transcript

Losio: We are going to discuss cloud DevOps, and above all cloud DevSecOps: what it means in terms of people, processes, and tools.

Just a very quick introduction on the topic and what we mean by DevSecOps in practice. With new cloud-native methodologies such as infrastructure as code, GitOps, and whatever else, we keep talking about shifting left; it's probably the new buzzword now, and we love it. We'll explore how to get the right security, DevOps, and cloud engineering stakeholders together, and what the developers and the other people in the engineering team have to do to build a realistic DevSecOps strategy. The panel will discuss the nitty-gritty of implementing cloud security automation.

The Journey to Cloud DevSecOps, and Background Info

My name is Renato Losio. I'm an InfoQ editor, and I'm a cloud architect, mainly working on AWS stuff. We are joined by four panelists, all security experts in DevSecOps. We'll start with a very quick introduction of the four panelists: who you are, and what has been your journey to Cloud DevSecOps.

Chaplin: My name is Stefania Chaplin. I used to be a developer. I know a lot about DevSecOps. I've worked for a couple of cybersecurity vendors, such as Sonatype and Secure Code Warrior, and I'm moving to GitLab shortly. I come at it from a very practitioner point of view. I've got AWS cloud training, but I'm always security first.

Schoster: My name is Barak Schoster. My journey actually started as a data scientist. When I tried to operate and take data science models into production, I discovered a lot of stability issues. I found my way into DevOps and cloud architecture. From there, I founded Bridgecrew, which is a DevSecOps company.

Vincent: My name is Pierre Vincent. I'm head of SRE at a company called Glofox. We're not a security company. We're a SaaS platform in the fitness industry. I started as a developer, with about a decade on development teams, and all of that. One day I started going over to the other side, realizing that it's actually way more interesting to look at how things really run in production, and into the DevOps world, which really doesn't exist without the security aspect. This is where my professional life is now.

Hiremath: My name is Omkar Hiremath. I'm working as a cybersecurity team lead at Gradient Cyber. I mostly work in the security part of DevSecOps and sometimes in the dev part. I focus on research and detection of malicious and suspicious activity using TTPs and AI. If I had to describe my part in the DevSecOps story, I believe I'm just done with setting up the plot, as it would be in a movie, and now I'm moving forward.

What Is DevSecOps and What Is Not?

Losio: I would like to really start with the basics. What is DevSecOps and what is not? What is SecDevOps?

Chaplin: The attitude has changed a bit in the last few years. We used to think, you've got your development team, and then they'll create a deployable, and then you'll do a security test. Then it's over to operations. Historically, people saw, security is just the little scan in the middle, maybe as part of your Jenkins CI build. What it actually is, is integrating security within every step. For example, are we doing threat modeling, are we doing SAST? Are we doing logging and monitoring? Normally, we think about web applications. The exact same is true for the cloud, for example, when you're creating Terraform scripts, are you scanning those? Are you checking what they are? It's very important to just have validation at each step, because things change, and there's a lot of attack vectors, because everything's getting a lot more complex as we move to the cloud.
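
To make that concrete, here is a minimal sketch of the kind of pre-merge Terraform scan Stefania is describing, assuming the open source Checkov CLI is installed and the Terraform code lives under an infra/ directory (both are illustrative assumptions, not details from the panel):

```python
# A minimal sketch of a pre-merge infrastructure-as-code scan.
# Assumes the Checkov CLI is installed and Terraform lives under infra/.
import subprocess
import sys

def scan_terraform(directory: str = "infra/") -> bool:
    """Run Checkov against a Terraform directory; return True if no findings."""
    result = subprocess.run(
        ["checkov", "-d", directory],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    # Checkov exits non-zero when it finds failed checks.
    return result.returncode == 0

if __name__ == "__main__":
    # Fail the pipeline step when misconfigurations are found,
    # so issues surface before the change is merged or applied.
    sys.exit(0 if scan_terraform() else 1)
```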

Schoster: I think DevSecOps is a cultural movement. Shifting left is actually a term that was coined by QA, when QA engineers started to automate testing of applications and data centers. In security, we had a similar process, automating all of the stuff that Stefania mentioned before, because doing it earlier actually solves some productivity issues, and also some tension between the different teams: the security team, which in some cases acts like a domain-expert QA, the development team, which is responsible for the application, and the Ops team, which is obviously responsible for keeping it up and running in production.

Getting Started with Security

Losio: I actually have a follow-up on that. Until a few years ago, we were not even using the term, we were just talking about DevOps, and security was something that, let's hope for the best, more or less.

Vincent: I'm pretty sure it's still like that.

Losio: Let's say that I'm a developer, and I'm coming from that DevOps background, and I've been playing maybe for a few years on the cloud, and everything looks great. I haven't really paid attention to security, not because I'm a bad guy, but because I've been focused on other tasks. I've been playing with maybe elasticity, or other parts of the entire cloud story, but not the security part. Where should I start? Pierre, you mentioned you're coming from a developer background, do you have any advice for a developer who is curious about the topic? Or do we just say that we should start to worry about it, so let's start worrying now?

Vincent: If you're starting now, it's worth looking at how we progressed through that phase in the other domains. Barak, you were talking about testing; this is where, if you look back 10, 15 years ago, testing was still fringe. Now you can't really conceive of something that has no unit tests. It's embedded as a practice. DevOps on its own is getting there now. It's standard. You go to a company, and if you don't have CI/CD, if you don't have pipelines, if you don't have developer experience, that type of stuff, that's starting to be a smell. Security I think is still lagging. That's why we're trying to do DevSecOps. As a developer, it's just starting to look at, what are the practices now that are being done? What are the types of scanning that you can have? What is the information that you're missing when you're developing? The tests give you some information about the quality of your code. How do you get that information without the level of friction that we were talking about, like testing at the very end, or just hoping for that yearly pen test to tell you everything? Which is usually just a compliance checklist type of thing.

It's looking at that static analysis. What are the tools? What's making it easy for you to make the decision early in the process? What is making it easy for you to have the information? Is it a tool? Is it just talking to the right people? Is it involving some of the people early on in the process? If you're looking at the security problems, I think developers usually have a fairly keen sense of what's problematic in their systems, security-wise. It's a fallacy that developers don't care about security. It's usually that they're not incentivized to care about it. If they're given the time, they can go and figure out what's interesting to look at, get some information, and try out some of the tools. You're a developer, so this has to come from you as well. It won't necessarily come from your CTO, who just doesn't have that priority.
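
As one concrete example of that kind of early, low-friction signal, here is a rough sketch of wiring a static analysis step into a pipeline. It assumes a Python codebase under src/ and the Bandit SAST tool; the panel does not name a specific tool here, so both are illustrative choices:

```python
# A minimal sketch of adding static analysis to a build pipeline.
# Assumes a Python codebase under src/ and the Bandit SAST tool installed.
import subprocess
import sys

def run_sast(path: str = "src/") -> int:
    """Run Bandit recursively and surface findings in the build output."""
    result = subprocess.run(["bandit", "-r", path])
    # Bandit exits non-zero when it reports findings, which gives the
    # early feedback during development rather than at the yearly pen test.
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_sast())
```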

In the DevOps and the SRE world, where we're trying to improve developer experience, the best way we have to get adoption for a new practice or a new tool, for example, is if it comes from the developers. If it comes from me saying, here, I'm adding some new scans in your pipeline, they'll be like, "Yes, great." If a developer or a team leader comes in and says, "There is this awesome stuff. I was working with that before," that's selling it from the inside. If you're a developer and you want to care about security, do this. Do it from the inside. You'll have a way higher chance that this is going to stick. It makes my job way easier, because I don't have to do this.

Losio: You're basically saying it's really a culture change as well. The way you mentioned the yearly penetration test, that has basically always been the approach: ok, I have my product, here is the development team, the old-style approach. It's probably been, I care about security as a company, but I give it to someone else. Yearly, or whatever else, someone else is certifying my product or my deployment or whatever else. I'm not saying that's not valuable; there's great value there. Then it usually comes back in terms of bugs, Jira, whatever you're using, and the development team is going to act on that. It's not really taking the initiative on that. It's just, here is the list of what you have to do, and addressing it.

Collaboration between Security Experts and Dev

I don't know, what's your experience, for example, coming from a different point of view? Maybe Omkar, do you have any feedback on that? What's the relation between a pure developer and a security expert, and how should they work together in some way?

Hiremath: I think at the foundation level, every developer thinks about security, but not to the extent that they should. Whenever a developer is building something, they have these foundational things that they do implement in the code. That's not enough, because it's just basic. With the capabilities that are out there right now, with the C2 servers, with tools, and so many threat actors, it's bad. On a day-to-day basis, I see so many attempts to hack or to harm a system or a network. It's really bad. We have this saying that a defender has to be successful every time, but an attacker has to be successful just once. If you're a developer who hasn't taken security very seriously, then you're late to the party, but you're not too late, because you can still pick up security. My advice would be to start understanding what threats your application or your company would attract. For example, if it's a finance company, the threat could be a malicious fund transfer, or if you have critical data for the company, you could be attracting ransomware. When you understand the threats that you attract, you can then break it down to the components that are more susceptible to these threats, and then go from there: see where you stand and see how you can improve those things.

Schoster: Everybody mentioned that when you have a security issue, you have it in your backlog, and you need to solve it, or you need to encourage your peers to solve it. I realized for the first time that I needed to solve it, as a junior engineer a few years ago, when I received a PagerDuty alert in the middle of the night for a bug that I had pushed into an environment. It was not fun. I really like to sleep at night. I don't like to create reliability issues in production. My mentor at the time asked me, how could you have known about it before? This process of testing before, not only pen testing, but also unit testing, integration tests, and finding the right automation tools to check security very early on, helped me prevent further PagerDuty calls waking me up in the middle of the night.

Chaplin: I speak to developers quite a lot. It's all about understanding your audience. Developers are driven by different things than security, Ops, or cloud people. With developers, I just ask, do you like rework? Or, in Barak's example, do you like waking up in the middle of the night with PagerDuty notifications? I don't think anyone genuinely will say yes. Actually, I like to deploy, and then go to bed or finish early on a Friday. When it comes to developers, it's about positioning this. A lot of the time, it's not just the developers you need to target, it's also the managers, because how a developer is being incentivized is usually functions, features. Security is just a quality test. Like we were saying, testing used to be in the background.

I remember when I was at Uni, there was a Gantt chart, and security and testing were all the way over there. Then the project is over budget, it's past the deadline, and they both get ignored. Now testing is intrinsic. I'm hoping, and I think, security is starting to go that way. Fortunately or unfortunately, it is usually the developers that have to fix the problem. Testing is great, I love testing, and you definitely need it, but you can't test your way to the solution. Testing can help you find the problem; then you need to make sure that you incentivize developers and give them the time, and potentially the training as well, to be able to fix these security vulnerabilities before they go live.

Hiremath: I agree with Stefania that with tests and everything, you find out these bugs, but again, it only matters when you fix them. If you find a bug and it's still there, it makes no difference. A lot of times, developers or the team leads have a high-level idea of what has to be done. Again, that's not enough, because you have to get into the code. You have to get to the specifics. Training, and that security awareness of what can happen if you don't do this, and what the different ways are that something can be exploited, that's really important.

Incentivizing Separate Teams to Embrace Security

Losio: I was thinking, we tend to say that everyone should be at least partly responsible for security inside a team. In reality, in many companies, in many deployments, in many scenarios, whether for lack of knowledge or because of how teams are structured, there are still quite separate teams. The idea is, this team is responsible for security, and that's how it works. It's almost as if the other teams are not responsible for security; that's the security team's problem. It's the same problem as with QA: they do QA, so we don't write any tests. Sometimes there's a bit of a wall between different teams. You mentioned before the role of the engineering manager. I was wondering, if I manage a team where these are really separated, if I jump into a company where there's really no shared responsibility for security, what should the approach be? What should the next step be, a real action item? What should you do: move a developer to work in parallel with a security expert, or make security part of programming for everyone?

Chaplin: I think it comes back a little bit to what I was saying in terms of, do you like rework? You just have to put a positive spin on security, because the reality is, even if you're only doing your pen test once a year, which I hope everyone does more often than that, when you have these problems, it's like, do you like having a job? Do you like our company not being on the front page of the news, hacked, with all our source code on the internet? Especially with GDPR, now that you have to disclose, and you're seeing ransomware and all these other things, there's a lot more accountability. Everyone feels more like, actually, security might be a good idea. It's really about having that positive spin. Can you incentivize it: everyone who gets a build through with no red flashing vulnerabilities can expense a coffee. Something silly like that. Using psychology, but putting a positive spin on security. Having that team-building approach, so that at least then you can address the solution together. Because if you are working with a team and everyone's siloed, and no one cares about security, and it might be a culture of fear, and everyone points the finger, that's not productive. It's really about taking a step back, being more pragmatic, and trying to solve things together. Because we do have our different silos, and everyone cares about different things. Fundamentally, most people are trying to do good for the company. Taking a more pragmatic approach, and using incentivization if necessary, can be a good first step before you start to tackle the security at hand.

Vincent: There's a thing that I've been doing with a few of the teams over the years, and more recently: talking about the pen test, and just taking the pen test example to maybe educate people, train people, or get people more interested in that space. I'm not talking about external pen tests. I'm talking about game days internally. What we do every month in Glofox is we take the SRE team and a group of people from the development teams, and we pick an area of the product, and we just go and say, let's try and be the hackers today. Then we look at it from that perspective. This is so much more of a deep dive than a pen test, because you actually get people with insider knowledge already. You get some senior people that have been there for five, six years; they know where the skeletons are, they know where the legacy areas are, where all the bugs always pop up, and stuff.

What's cool from those exercises is you end up getting people having that roller coaster moment of, yes, we found something. You never really forget that moment, because it's like, ok, it would be possible for somebody to exploit this. We found this first, but that means we were able to find something. It looks like it's a pen test that's at the very right of the process, but it actually shifts the mentality very much left for everybody that was there that day, because now they're coming back. They're going to be thinking, this shouldn't have been that easy for somebody to do this. Security is not about making things perfect. It's about making things hard for the attacker. How could I have made that a little bit harder? Because when you do the pen test, it's like, that was nice and easy, because I could escalate privileges this way. It's been working out really well. Not only do you get to tackle some security bugs, people are quite engaged into fixing them, because they've been the ones raising them. They can go and fix it themselves. They come back with the mentality.

Losio: It's ownership basically. They own the problem.

Vincent: It's not somebody else logging the Jira for them.

Effective Practices for Dev Teams

Losio: What practices are most effective for dev teams? The question also suggests vulnerability scanning and inspection. I had the feeling from Pierre's answer as well that there's the topic of asking the team to do part of the work, giving them ownership of the problem. Any feedback on the most effective way for dev teams to try to address the problem?

Schoster: One thing is a cultural experience that I had. A few years ago, I met a security engineer who had just onboarded at the company. He started interviewing people, trying to get familiar with the application, the teams, and how the processes were working. He hadn't asked any questions about the security posture, but he asked one genius question that made it all very simple. He asked each engineer, are you proud of the product that you've built? What parts of it are you not proud of? Then it was like a confession session. Everybody confessed those things and told the security engineer where we had moved a little bit too fast and forgotten to go back and fix some of the open issues that we had. He discovered amazing things without doing a pen test, without waiting a few months to learn the system. In the first week, he had identified a lot of awesome stuff that we should have fixed before. It's really about being open, as the culture in the company, and being able to talk about what went well and what went wrong. That's the first thing that I would start with.

As for actually securing the application from a tooling perspective, there are a few layers. There is the infrastructure layer: the security of your Terraform, CloudFormation, Kubernetes, and cloud provider configuration. There is the application layer, where you start to secure your OS, your Docker image, and even the open source dependencies that you're using. There is the runtime part: a web application firewall, RASP, and other tools that help to secure your running application, and not only your dependencies and the code you're writing. That's how I would start.
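
A rough sketch of chaining checks across two of those layers, with Checkov standing in for the infrastructure layer and pip-audit for the open source dependencies; the tool choices and paths are illustrative assumptions rather than the panel's recommendations:

```python
# A sketch of running checks for two layers in one pipeline step.
# Assumes Checkov and pip-audit are installed, Terraform under infra/,
# and a requirements.txt listing the Python dependencies.
import subprocess
import sys

CHECKS = [
    ("infrastructure", ["checkov", "-d", "infra/"]),
    ("dependencies", ["pip-audit", "-r", "requirements.txt"]),
]

def run_layered_checks() -> bool:
    ok = True
    for layer, command in CHECKS:
        print(f"--- scanning {layer} layer ---")
        if subprocess.run(command).returncode != 0:
            ok = False  # keep going so every layer is reported in one run
    return ok

if __name__ == "__main__":
    sys.exit(0 if run_layered_checks() else 1)
```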

Kubernetes and DevSecOps

Losio: I think it's interesting, you mentioned Kubernetes. I was wondering, how much is Kubernetes part of the process, or part of the problem, or part of the solution?

Hiremath: When it comes to security, you have to be quick. It's not a thing that you do once a year or once a week. Even if you're doing it one day later, you put a lot of things at risk. It's a continuous process: you have to keep monitoring continuously, updating continuously. The Kubernetes architecture is very suitable for DevSecOps when it comes to automation and scaling. It is important to rely on scanners and inspections. There's also a lot of open source intelligence out there that doesn't take a lot of your resources to research. At my workplace, we have alerts set up, so whenever there's a new CVE that comes out, we get an alert on the Slack channel. It takes just a minute to see whether it's applicable to you or not, and then you move on. If it is applicable, then you've identified an issue very quickly, without waiting for a scan. There's a lot of room for automation. I think Kubernetes is a good tool to get this DevSecOps going.
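
A hedged sketch of that style of CVE-to-Slack alerting: poll the public NVD API for a keyword and post matching entries to a Slack incoming webhook. The keyword, the placeholder webhook URL, and the use of the requests library are assumptions for illustration, not a description of Gradient Cyber's setup:

```python
# A sketch of polling the NVD CVE feed and alerting a Slack channel.
# Assumes the requests library is installed; the webhook URL is a placeholder.
import requests

NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"
SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def fetch_recent_cves(keyword: str, limit: int = 5) -> list[dict]:
    """Query the NVD API 2.0 for CVEs matching a keyword."""
    response = requests.get(
        NVD_URL,
        params={"keywordSearch": keyword, "resultsPerPage": limit},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("vulnerabilities", [])

def notify_slack(entry: dict) -> None:
    """Post a one-line summary of a CVE to the incoming webhook."""
    cve_id = entry["cve"]["id"]
    descriptions = entry["cve"].get("descriptions", [])
    summary = descriptions[0]["value"] if descriptions else "no description"
    requests.post(SLACK_WEBHOOK, json={"text": f"{cve_id}: {summary}"}, timeout=10)

if __name__ == "__main__":
    for item in fetch_recent_cves("kubernetes"):
        notify_slack(item)
```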

Chaplin: I think with Kubernetes, and with security, in general, you are only as strong as your weakest link. With Kubernetes all of a sudden, that's a lot of links, because you've got to think about your actual containers. You can have bloated base images. You've got to think about permissions. You've got to think about these containers, how they're operating in pods. Where is Kubernetes? Is it in the cloud? Is your cloud secure? Kubernetes is awesome, but from a security standpoint, the cloud and Kubernetes just adds complexity. It's all about going back to basics. Who can do what? How can they do it? That's how I would like to think, doing a security audit. Especially with stuff like additional libraries, are you maintaining your base images? Is it as simple as possible? Are you scanning them as well, and are you scanning them regularly? I think it's really about just trying to do it as securely as possible, because it is increasing your number of links. If they're all strong, that's great, but it only takes one, and then you can be in trouble.
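
One way to keep base images scanned regularly is a small scheduled job along these lines, assuming the Trivy scanner is available; the image list is invented for illustration:

```python
# A minimal sketch of recurring base image scanning with Trivy.
# Assumes the trivy CLI is installed; image names are illustrative.
import subprocess
import sys

BASE_IMAGES = ["python:3.12-slim", "nginx:1.27-alpine"]  # assumed examples

def scan_images(images: list[str]) -> bool:
    ok = True
    for image in images:
        print(f"--- scanning {image} ---")
        result = subprocess.run(
            ["trivy", "image", "--severity", "HIGH,CRITICAL", "--exit-code", "1", image]
        )
        ok = ok and result.returncode == 0
    return ok

if __name__ == "__main__":
    # Run this on a schedule (e.g. nightly CI) so images are rescanned
    # even when the Dockerfiles themselves have not changed.
    sys.exit(0 if scan_images(BASE_IMAGES) else 1)
```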

Schoster: Kubernetes introduced a common way to tackle problems that we all have encountered before, like scale issues. We had those scale issues, even when the internet was invented. Kubernetes is just a common layer to add on scale and logging and networking policies. I think that both Kubernetes and the cloud providers, since it's either defined in a DSL, domain specific language, like YAML annotations or API calls, for the first time it allows us to automate those processes in a common way. Before Kubernetes, for example, if we wanted to configure networking between two applications, we had to talk with five different teams. Now everybody is working on the same manifest. It really allows us to have a common communication there as teams. Then on top of that, automation.

Security Tooling for Developers

Losio: We now almost take for granted that people are on a public cloud and that they are in some way or other using Kubernetes. If they're not using Kubernetes, they're probably using serverless, or at least we all pretend to be in one of the two camps, because otherwise you look too old. I'm a developer, and we are saying that developers should be empowered in taking care of security. The cultural shift is important. I'm a developer. I'm used to having my development stuff. I know the APIs of the cloud provider I'm using. I know how to manage or leverage tools in development. Is there any tool or anything I should get familiar with, any language apart from YAML? Any suggestion, bearing in mind that I'm a developer, not a security expert?

Hiremath: There are a lot of tools out there which would help a developer out, be it checking if your Kubernetes cluster is up to standard, checking if your tech stack is up to date, or checking if it follows a particular benchmark. There are tools like Kubesec, kube-bench, and TUF, The Update Framework. All of these tools will only help you to a certain extent, because if a tool is out there, you know it, and the attackers know it too. Unless they're just script kiddies, attackers will already be researching ways to get around these. These should be the first line of defense, but they shouldn't be the only line of defense. Every case is different. What you check, what benchmarks you set, and what frameworks you use should be custom to your particular use case. As a standard, you could also use the framework from NIST, which gives very good benchmarks and standards that you could start with. There's a lot out there. Again, you have to add those layers, but also have some custom things set up for yourself.
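
As an illustration, a thin wrapper around one of the tools Omkar names (kube-bench) might look something like this, assuming kube-bench is installed on a node with access to the cluster configuration; it simply surfaces the plain-text [FAIL] markers the tool prints rather than parsing its full report:

```python
# A rough sketch of surfacing kube-bench failures in automation.
# Assumes the kube-bench binary is installed where the cluster config is readable.
import subprocess
import sys

def run_kube_bench() -> int:
    result = subprocess.run(["kube-bench"], capture_output=True, text=True)
    # kube-bench marks each check with [PASS], [FAIL], [WARN] in its output.
    failures = [line for line in result.stdout.splitlines() if "[FAIL]" in line]
    for line in failures:
        print(line)
    print(f"{len(failures)} failed checks")
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(run_kube_bench())
```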

Reactive Programming

Losio: You have talked about reactive programming and security issues. Someone says: I get pen test results and don't know how to solve them.

Chaplin: I think the question means, what do you think of being reactive as a [inaudible 00:33:02]? When you think of what reactive work is, it's just unplanned work. Something goes wrong, someone else does something wrong, or something's happened, and we have to react. You're basically firefighting. When that happens, you will need to deal with the issue at hand. You also need to think about being proactive. For example, if you have a problem with Kubernetes, how are you going to educate yourself? What are you going to do to learn so that this doesn't happen again? To Barak's other example, speaking to your mentor: how do we make sure this doesn't happen again? There are some great resources. I think it's called Kubernetes Goat, like the ones there are for web applications. If you're a developer and you like hands-on experience, try and hack an intentionally vulnerable Kubernetes application. If you aren't, then for understanding your test results, OWASP is awesome, not only for understanding vulnerabilities, but also for the free tools out there. There's a wealth of stuff on the internet. If you are having problems solving it, I'm not telling you to go to Stack Overflow, because that's not technical advice, but there are definitely areas of the internet that can help you. Then focus on the more proactive side to level up your security, so that when these results come in you can react faster.

Kubernetes Security

Vincent: I just wanted to bounce back on the Kubernetes security stuff that Omkar was talking about. Maybe I'm just thinking slightly differently on this. I don't think that we should be in a place where developers really have to think about Kubernetes. Kubernetes is a platform for platforms. If we have a platform team that enables developers to deliver stuff easily, quickly, and sustainably, then we should actually build a platform that gets all of that stuff out of the way for them, and gets them to focus on application security, those layers that they understand the most. They have the context about the product. They have the context about the code. All of those automation tools are relevant, but they might be more relevant for a platform team or an SRE team, giving capabilities to the teams, and then letting them use those capabilities safely.

Let's say we don't want very permissive ways for developers to define their kubectl commands, or their deployment YAML files, and things like that. Instead, look at it from the point of view of: I'm a developer. What I don't want to do is write YAML to deploy my service to Kubernetes. What I want is: this is my commit, I want to deploy it to this environment with this configuration. That's the only information I should have to give, not a whole hand-written YAML file full of potential security problems, because then it's like, now I can attach whatever service account I want, or I can attach whatever volume I want. It's like, no, you can't, because you can't edit that file. We give you the capability to deploy, and then Kubernetes and all those tools become relevant potentially much lower down the stack, for somebody else to deal with.
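
A hypothetical sketch of that narrow deployment interface: developers hand over only a service name, a commit, an environment, and app-level configuration, while everything security-sensitive in the manifest stays fixed by the platform team. All names, registries, and defaults here are invented for illustration:

```python
# A hypothetical, platform-owned deployment renderer: callers cannot
# override the security-sensitive parts of the generated manifest.

def render_deployment(service: str, commit: str, environment: str, env_vars: dict) -> dict:
    """Build a deployment spec from only the inputs developers are allowed to set."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": service, "namespace": environment},
        "spec": {
            "template": {
                "spec": {
                    # Fixed by the platform team, not exposed to callers:
                    "serviceAccountName": f"{service}-restricted",
                    "automountServiceAccountToken": False,
                    "containers": [{
                        "name": service,
                        "image": f"registry.example.com/{service}:{commit}",
                        "securityContext": {"runAsNonRoot": True,
                                            "allowPrivilegeEscalation": False},
                        "env": [{"name": k, "value": v} for k, v in env_vars.items()],
                    }],
                }
            }
        },
    }

# Example: the only knobs a developer touches.
manifest = render_deployment("booking-api", "a1b2c3d", "staging", {"LOG_LEVEL": "info"})
print(manifest["spec"]["template"]["spec"]["containers"][0]["image"])
```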

I suppose there might be some stuff that comes over the next few years where maybe this is going to be done by Google or by Amazon, and we start to have that powerful abstraction where all the complexity is actually handled by somebody else. Because that's the idea of moving up the stack as much as we can. I've used Kubernetes before. I'm using ECS now, ECS and Fargate. It's actually refreshing that there are a lot fewer things you can think of, because then there are fewer things that can go wrong. It's freeing for development teams that they don't have any extra choices or extra things to think about. You can't get it wrong if you can't change it. It's not a concern that goes away, but if you can remove that from the checklist of the developers, then they don't have to deal with it.

Losio: You're basically shifting into a paradigm like the shared responsibility model of the cloud provider. You're basically saying that part of it is hopefully, in the future, moving to their responsibility more than to your responsibility as a developer.

Vincent: Yes. That's it. If you look at AWS security, there are two sides: there's stuff AWS deals with, and there's stuff that you have to deal with. The more we can shift towards what AWS deals with, or Google, or whatever, the happier we are. Wherever that line is, it means you still need to be very responsible about your IAM policies, your whatever. Make it as simple as possible.
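
In that spirit, even a very simple static check can keep IAM policies honest before they are applied; this sketch just flags wildcard actions or resources in a policy document, and the sample policy is invented for illustration:

```python
# A small static check for over-broad IAM policy statements.
# Pure Python; the sample policy below is an invented example.

def find_wildcards(policy: dict) -> list[str]:
    findings = []
    for statement in policy.get("Statement", []):
        actions = statement.get("Action", [])
        resources = statement.get("Resource", [])
        actions = [actions] if isinstance(actions, str) else actions
        resources = [resources] if isinstance(resources, str) else resources
        if statement.get("Effect") == "Allow":
            if any(a == "*" or a.endswith(":*") for a in actions):
                findings.append(f"over-broad action in: {actions}")
            if "*" in resources:
                findings.append(f"wildcard resource in: {resources}")
    return findings

sample_policy = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow", "Action": "s3:*", "Resource": "*"}],
}
for finding in find_wildcards(sample_policy):
    print(finding)
```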

DevSecOps in the Future

Losio: What do you expect in five years, if we're still going to even talk about the entire topic of DevSecOps or whatever? In which direction are we moving?

Hiremath: I think people have started to recognize the importance of DevSecOps, and also SecDevOps. Again, it depends on what your use case is. It depends on what's more feasible for you. Security is something that is going to be there forever. You cannot compromise on that. I think in maybe 5 years, 10 years, DevSecOps is still going to be there, maybe a different version of it, an advanced version with more integration, with more teams, or maybe under a different term. The base pillars of DevSecOps are still going to be there. All three parts of DevSecOps are crucial, and we cannot compromise on that.

Where Security Adoption Is

Losio: I find it even now hard to know where we are, in the sense that I'm pretty sure with security, it's one of the topics that is hard, because if I go to 10 different companies, and I ask, is everyone responsible for security in your company? I'm pretty sure that everyone is going to say yes, even if reality is probably very different. Where are we now? In terms of adoption, where are we at the moment?

Schoster: We are in early mass adoption, because digital transformation is a process that a lot of the bigger enterprises are still going through. I think that a lot of the companies going through this process of digital transformation are already getting themselves familiar with educational pieces like Kubernetes Goat, and there are even more specialized ones like TerraGoat, CloudFormation Goat, and CdkGoat; it depends on the infrastructure that you're using. People are educating themselves. They're starting to use open source tools like kube-bench, OPA, and Checkov. I think for the mature companies, one of the models that I've seen is actually going through GitOps as the methodology, and service catalogs, which is what Pierre mentioned. SRE teams are building common services to be used by different application teams. Everybody knows what the company database should look like. Everybody knows what the company web application server should look like. SRE teams are actually creating service catalogs with predefined best practices within them. Each application developer does not need to care about logging, monitoring, and even security and high availability stuff, because those are already solved. I think that Spotify has an awesome project on service catalogs that I would recommend the more mature teams check out.

Resources

Losio: What resources should the developer check out to get up to speed with these issues? Any book, blog recommendation?

Chaplin: There's so much out there. It depends who you are. If you're a developer, and I've worked with a lot of them and used to be one, hands-on experience is what I know. Like Barak was saying, all the different goats: why not try and hack one? OWASP has some great resources; there's [inaudible 00:42:24] as well. If you prefer reading, if you want book or blog recommendations, one of my favorites, and it's a bit old school, but if you want to talk about DevOps, is The Phoenix Project, just because it's really good. I actually read it for the first time during lockdown. I thought, I'm just going to do some light reading. Because it's written from the perspective of an Ops practitioner, it's a really gripping read. I read it in two days. I was like, "This is amazing." Even just that gives you an understanding.

Back to the earlier question of where we are with DevOps. In terms of using the words, I think it's getting fairly mainstream. In terms of people actually doing DevOps, I think we're definitely earlier on, because we're still fairly siloed. Processes can be slow and bureaucratic. I think we've got a way to go. In terms of a good read, pick up The Phoenix Project, check out stuff on OWASP. If you'd like to have a play around, go after some goats.

Vincent: Maybe in the same vein, but sometimes it's just about broadening your thinking on the security side. I love the pen testing aspect, and the game of pen testing, and in certain ways offensive security type stuff. It's nerdy, but it's just so much fun. Even if you don't go for paid services, you can check out Hack The Box, those types of services that are so much fun. Literally, you're given VMs, given IPs, and you need to crack them. Even if sometimes they're not real-life scenarios, they do open up your thinking, and get you into that mindset of: when things are easy, we make it easy for hackers. When you're pen testing, you're delighted when things are easy. It's like, how do we add extra barriers so that we get people to give up? From my journey in the security space, offensive security is what really opened my mind, a lot more than any books or any list of Jiras or any reports.

Hiremath: I'm not a book person. I don't like reading books to understand something. I'm a person who wants to see stuff on my screen. Initially, I did try reading books, but I didn't enjoy that as much. Like Pierre mentioned, there are a lot of applications out there. If you're at the beginning stage, you could set up some VMs on a system, build stuff, and then look at it from both sides: the way you're building it, and then trying to break it as well. That mindset is important. That openness to the idea that the thing I've built can also be broken this way makes a developer more mature, because they see a different perspective of what can happen with the application. That's what I'd recommend.

Schoster: I actually like to consume content via video. I would recommend that you check out re:Invent videos, re:Inforce, and the CNCF KubeCon videos; there is some real hidden gold out there about best practices, both on DevOps and security, and also fwd:cloudsec, which is a specialized one on AWS. There is very good stuff out there. Also, a big shout-out to all of the other stuff that people mentioned, like The Phoenix Project and pen testing. Those are amazing resources that I'm a fan of too.

 


 

Recorded at:

Mar 21, 2022
