
Improving Security Practices in the Cloud Age: Q&A With Christopher Gerg

Key Takeaways

  • Leverage some of the service offerings at the cloud provider, and recognize that the security fundamentals—network segregation, patches and updates, monitoring and alerting, authentication and authorization, encryption, anti-malware—stay the same. 
  • Good security needs to be built in from the start and not bolted on after the fact.
  • Developers need to think like hackers and do rigorous input validation along with proactive security analysis.
  • Don't fall in love with advanced security tools if you haven't mastered the fundamentals like patch management.

Developers and IT leaders say that security is a top priority. Survey after survey shows that it’s easy to say, and hard to do. The 2019 DevSecOps Community Report says that nearly half of respondents can’t find time to embed security practices into their software development lifecycle. GitLab’s 2019 Global Developer Report highlights that 49% of security professionals can’t get developers to prioritize vulnerability remediation. Fresh research from the Enterprise Strategy Group points out that 83% of respondents are worried about misuse of privileged accounts, and 35% believe multiple security controls are leading to increased IT complexity.

To shine a light on effective security practices in the cloud age, InfoQ spoke with Christopher Gerg, the CISO at Gillware, a data recovery and digital forensics company.

InfoQ: As organizations extend their architecture to public cloud, what should they keep doing, start doing, and stop doing?

Gerg: I encourage the move to the cloud – it CAN be a significant cost savings for some use cases.  There are some things that you must keep in mind, though:

Remember that at the end of the day it is someone else’s computers in someone else’s data center running on someone else’s network infrastructure. Compliance and data confidentiality can be a challenge. For example, if you are a HIPAA-relevant organization you need to encrypt not only the data when it is stored (at rest) but also when it is transmitted between servers in your cloud environment. Beyond that, you should take the same approach you do with your servers and services – authenticate users in the most robust way possible and authorize them to access the services and data necessary to do their business, but no more (the principle of "least privilege"). While the cloud providers do a good job of ensuring that things are available and handling the details of the hardware and physical parts of the server environment, securing the data and services is exactly the same whether it is in "the cloud" or a data center.
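
The "least privilege" principle Gerg describes can be sketched as a deny-by-default authorization check. The roles and action names below are illustrative, not taken from any particular cloud provider's IAM:

```python
# Minimal sketch of least-privilege authorization: grants are explicit,
# and anything not granted is denied. Role and action names are hypothetical.
ROLE_PERMISSIONS = {
    "analyst": {"reports:read"},
    "admin": {"reports:read", "reports:write", "users:manage"},
}

def is_authorized(role: str, action: str) -> bool:
    """Allow an action only if it is explicitly granted -- deny by default."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_authorized("analyst", "reports:read"))   # granted
print(is_authorized("analyst", "users:manage"))   # denied: not granted
```

An unknown role falls through to an empty grant set, so new accounts start with no access rather than inheriting a default.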

You pay for only what you use. This should help you decide which workloads to migrate first.  If you have a large data analysis effort that runs for several days, then spin up the resources you need to do that work and then have the discipline to shut those resources down when they are not necessary. This is where using the cloud to host your servers and services starts making very good sense. Consider using containerization and robust orchestration mechanisms like Kubernetes to spin up capacity based on demand (and turn it back down to lower levels based on lower demand).
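
The spin-up/spin-down discipline above is roughly the calculation Kubernetes' Horizontal Pod Autoscaler performs: desired replicas = ceil(current replicas × current load ÷ target load). A simplified sketch, with illustrative metric values:

```python
import math

def desired_replicas(current_replicas: int, current_metric: float,
                     target_metric: float, max_replicas: int = 10) -> int:
    """Scale toward the target utilization, bounded between 1 and max_replicas."""
    desired = math.ceil(current_replicas * current_metric / target_metric)
    return max(1, min(desired, max_replicas))

print(desired_replicas(4, 90.0, 60.0))  # 6: scale up under load
print(desired_replicas(4, 15.0, 60.0))  # 1: scale down when idle
```

The cost point follows directly: when demand drops, the replica count (and the bill) drops with it, provided you actually let the orchestrator scale down.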

Leverage some of the service offerings at the cloud provider – don’t just forklift your virtual machines up to the cloud provider. Continuing the discussion of containerized services and orchestration, you don’t need to run full servers to run your business. It does require some rethinking, but "serverless" or containerized services allow you to move the layer of abstraction further up the stack – making it easier to use fewer resources and thus pay less for the same (or probably much more) capability.

The security rules don’t change. Network segregation, patches and updates, monitoring and alerting, authentication and authorization, encryption, anti-malware, etc., are all still relevant. Moving to the cloud does make a robust analysis of how the cloud provider is protecting your services and data a necessity. DO NOT JUST ASSUME THEY’RE TAKING CARE OF IT.

InfoQ: You mention that cloud users should take advantage of what's offered, but remember that the "security rules don't change" from what's always been relevant. Do you see companies experience challenges figuring out who is responsible for what amongst these services? Some cloud services patch infrastructure, some don't. Some have wider attack surfaces than others. How do you recommend that companies reconcile the distinctions?

Gerg:  Absolutely – I didn’t say it was necessarily easy. :-) Ultimately you need to understand what is your responsibility and what is the responsibility of the service provider (or what configuration changes you need to make with the service involved).  It might be that the service is not an option because it can’t support your compliance obligations or address the risks to your satisfaction.  You can’t just "throw it over the fence" and assume it’s taken care of. You need a good, visceral understanding of how the service works.  A good example is the hosted Kubernetes services provided by the different cloud providers.  The solution from AWS is different from the one provided by Google Cloud in many ways that have a potential impact on your compliance obligations in regard to encryption of the data between "servers."

InfoQ: How do you recommend that companies embed or leverage InfoSec skills on a DevOps-style product team?

Gerg:  Early and often. :-)  An excellent resource to help DevOps organizations integrate information security is The DevOps Handbook by Gene Kim et al.  The bottom line is that good security needs to be built in from the start and not bolted on after the fact.  Everyone needs to be thinking about the information security aspects of what they’re doing – having an information security expert take part in the requirements-definition phase of development planning, information security checks as part of peer code review, security checklists as part of release processes, etc.  What can help this is the use of automated tools – static code analysis, integrated code analysis tools, automated QA testing that integrates security checks, etc.
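
The automated checks Gerg mentions can be as simple as a pattern scan wired into the build. A toy sketch of the idea, assuming hypothetical patterns (a real static analyzer covers far more, with proper parsing rather than regexes):

```python
import re

# Hypothetical risky patterns -- illustrative only, not a real rule set.
RISKY_PATTERNS = {
    "hardcoded password": re.compile(r"password\s*=\s*['\"]"),
    "use of eval": re.compile(r"\beval\s*\("),
}

def scan_source(source: str) -> list:
    """Return the names of risky patterns found in a source string."""
    return [name for name, pattern in RISKY_PATTERNS.items()
            if pattern.search(source)]

print(scan_source('password = "hunter2"\nresult = eval(user_input)'))
```

Failing the pipeline when the returned list is non-empty is what turns a checklist item into an enforced gate.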

InfoQ: What's one thing that software developers need to get better at with regards to security?

Gerg: Firstly, do not trust anything the "user" tells you – input validation is huge.  Also – think like a hacker. Proactive troubleshooting is very helpful – "how can this break?", "if I make this change, what happens down the line?"  Being aware of at least the OWASP Top Ten (and other appsec resources) is a good start.  Make application security training a priority for DevOps personnel.
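
The "do not trust the user" point usually cashes out as allow-list validation: define the shape of acceptable input and reject everything else, rather than trying to enumerate bad input. A minimal sketch, with a hypothetical username format:

```python
import re

# Allow-list, not deny-list: only lowercase names, 3-32 chars, are accepted.
USERNAME_RE = re.compile(r"[a-z][a-z0-9_]{2,31}")

def validate_username(raw: str) -> str:
    """Reject anything that does not match the expected shape."""
    if not USERNAME_RE.fullmatch(raw):
        raise ValueError("invalid username")
    return raw

print(validate_username("alice_01"))            # accepted
# validate_username("Robert'); DROP TABLE--")   # raises ValueError
```

Validation like this complements, but does not replace, parameterized queries and output encoding further down the stack.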

InfoQ: Are there tools or services that you see that help developers do the right thing when it comes to security practices? Do services like API gateways offer default protections that keep downstream services safe even if a developer doesn't apply all the per-service protections?

Gerg: There are some really good (and largely automated) tools that can help developers – static code analysis tools, application scanning tools (like AppScan), and automated QA testing tools – and even web application firewalls can help mitigate some of the appsec risks.

InfoQ: What's an area of cybersecurity that enterprises need to get much better at? How should they start?

Gerg: Avoid the "MBM degree – Management by Magazine".  Being distracted by sexy tools can make you forget the fundamentals.  I have seen many organizations that have some incredibly capable (and expensive) information security tools in place but don’t have a mechanism to keep their workstations and servers up to date with patches and updates. Taking a risk-based approach to your information security spending (both time and money) is the best approach. Engaging an expert third party to help with the process is advised.
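
The patch-management fundamental Gerg points to boils down to knowing which installed versions fall below a required minimum. A minimal sketch, assuming simple dotted version strings (real inventories must handle messier version schemes):

```python
def parse_version(v: str) -> tuple:
    """Turn '2.4.10' into (2, 4, 10) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def needs_patch(installed: str, minimum: str) -> bool:
    """True if the installed version is older than the required minimum."""
    return parse_version(installed) < parse_version(minimum)

print(needs_patch("2.4.1", "2.4.10"))  # True: 2.4.1 predates 2.4.10
print(needs_patch("3.0.0", "2.4.10"))  # False: already past the minimum
```

Note the tuple comparison: naive string comparison would wrongly rank "2.4.10" below "2.4.9", which is exactly the kind of quiet error that leaves machines unpatched.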

InfoQ: We talked about where enterprises need to get better at security, but are there areas where you think enterprises have over-invested, or prematurely invested before handling fundamentals?

Gerg: I think that organizations get enamored with sexy tools that promise to be the security solution to solve all your needs. A good example is the various SIEM, SOAR, and USM tools out there – it’s not to say they don’t do what they say they will, but dropping many thousands of dollars on a SIEM/SOAR/USM solution doesn’t make sense if you have unpatched workstations or your admin’s password is "password123".

InfoQ: How do you see public cloud helping enterprises be more secure? Or less?

Gerg: Information security is concerned with the CIA triad – Confidentiality, Integrity, and Availability.  Regardless of where the servers/services/data are located, confidentiality and integrity are largely still the responsibility of the organization to make sure that controls are in place. The strongest way that cloud providers help is with availability. Their uptime is astonishing and their ability to replicate across geographies to provide redundancy is excellent.  They do have mechanisms that can help with confidentiality and integrity, but it takes some planning and discipline to use them effectively. There is a strong tendency to just throw things up into "the cloud" and forget about securing things because there is not a box with blinking lights sitting in a room up the hall.

About the Author

Christopher Gerg is a cybersecurity professional with over 20 years of experience. He is the CISO and Vice President of Cyber Risk Management at Gillware. Gerg simplifies the complicated intricacies of information security, compliance, and cyber risk management by pragmatically assessing organizations and providing straightforward recommendations for improved security posture. Outside of the office, Gerg enjoys traveling with his family and cheering on the Chelsea Football Club.
