The Virtual Tug of War
Technology professionals have always fought an unrelenting war not dissimilar to the famous feud between the Hatfields and the McCoys – a continuous conflict with no winners. In the world of IT, it is a battle over security and performance, fought between security professionals and network administrators. To survive in any organization, these two factions have always had to bargain and maintain an uneasy truce.
The origins of this performance-vs.-security battle are clear and simple to understand. Networking folks’ prime goal is to move data from point A to point B as quickly as possible, with microseconds often making the difference between acceptable and unacceptable performance. Think of financial trades that must execute instantaneously or money is lost, or of wired operating rooms where a procedure is performed over a network and the doctor and patient may not be in the same room – sometimes not even in the same state or country. At the same time, the volume of video, voice, and data traversing even the most basic of networks grows practically daily. Thus, the need for speed is increasingly a requirement for survival.
On the flip side of the coin, security – while always an issue – has become an even more crucial element that must be baked into a network. In the past, attacks were conducted by weekend hackers piercing a network simply to gain notoriety in the hacking underworld. Today, hacking can be a career path for unscrupulous individuals – be they insiders or traditional outside-in hackers – looking to steal information and sell it on the black market for a big payout. In an ideal world, security personnel would therefore tear open every packet and inspect protocols, behaviors, and traffic flows to head off sophisticated attacks designed to evade detection. But each time a packet is opened and inspected, milliseconds are added to transmission times. Packets may be opened several times – by firewalls, intrusion prevention devices, VPNs, anti-virus software, and so on – creating an unacceptable amount of network delay that directly affects the business.
The basis of the age-old feud is clear – performance, as dictated by the networking camp, vs. a protected environment, as advocated by the security folks. The uneasy Solomonic decision facing CIOs the world over is whether to skimp on security to ensure flawless application delivery, or to accept a performance hit to avoid becoming a security statistic.
Walk into any technology trade show anywhere in the world today and you will be inundated with vendors claiming you can have it all; they’ve cracked the code, promising wire speeds, built-in security, and performance without the trade-off. Rather than debate whether these claims will stand up in your network, it is important to get one simple question answered: what is the security technology actually protecting?
You see, along with the age-old feud of performance vs. security, there is another element that has not changed over time: the way we protect our networks. When networks were first built, the idea was to protect what was inside by fortifying the walls around it – much as castles used large fortified walls, moats, and other countermeasures to protect what lay within.
Fast forward to today and it’s clear the “castle” security technologies no longer work for today’s open networks. With the advent of virtualization and cloud technologies, BYOD and distributed architectures, it is no longer possible to protect the perimeter because the perimeter no longer exists.
There is a better way to settle this age-old dispute of security versus performance. Perimeter-based security protects an attack vector, but not what the digital criminal is after. Protecting the data – the actual target of the attack – makes more sense and eliminates the uncomfortable trade-offs we have had to make in the past.
Protecting the target demands a change in mindset for how we approach security. Until now, the name of the game has been breach prevention, essentially ensuring the “good guys” stayed on the inside and the “bad guys” remained on the outside. Today, that mindset has proven largely impossible to sustain. With new stealthy and sophisticated attacks being launched daily, most organizations are beginning to realize it is no longer a question of whether the company will be breached; it is simply a matter of when. The mindset is moving from “breach prevention” to “breach acceptance.” This sounds counterintuitive to everything we have ever learned about security; however, embracing this concept can leave our security posture stronger than it has been in recent memory. This can be accomplished by deploying the following:
- Encryption – Stolen data is only valuable if the hacker can use it. When data is encrypted, it is only useful to people who hold the decryption key. When a breach occurs and data falls into the wrong hands, encryption renders it useless gibberish to any non-intended user. Today, encryption can be deployed “end-to-end,” and that is what needs to happen, especially in a virtual environment. Traditionally, anytime data was moved between datacenters, it was encrypted. Nowadays, workloads may be moved within the datacenter and even between virtual machines. In such cases, it is crucial that these workloads are encrypted because they face as much risk moving between machines as when moving across large expanses.
- Authentication – Data is power and often the most valuable asset a company can own. When data falls into the hands of the wrong individual or group, it can have major security ramifications. Consider the financial vertical, where the compliance rule of “Chinese Walls” is violated when a broker gets hold of a researcher’s data, or healthcare organizations where a billing clerk has access to entire health records. Requiring authorized personnel to authenticate with multi-factor proof – something they have, something they know, and something they are – ensures only authorized personnel have access to specific bits of information. This granular form of access control exposes data only to the right people and keeps critical data away from those who should not have it, thereby increasing security.
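To make the encryption point concrete, here is a minimal Python sketch of symmetric encryption using a one-time pad, built only from the standard library. The function names are invented for this illustration, and a real deployment would use an authenticated cipher such as AES-GCM from a vetted cryptography library – but even this toy scheme shows why stolen ciphertext is gibberish to anyone who does not hold the key.

```python
import secrets


def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt with a one-time pad: a random key as long as the message.

    The key must be truly random, kept secret, and never reused.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key


def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR with the same key recovers the original bytes."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))


record = b"patient: J. Doe, blood type O-"
ciphertext, key = otp_encrypt(record)
# Without `key`, `ciphertext` is indistinguishable from random bytes;
# with it, the record comes back intact.
assert otp_decrypt(ciphertext, key) == record
```

The same property is what end-to-end encryption buys at scale: a breached disk, wire, or virtual machine yields only ciphertext, so the attacker's prize is worthless without a separate compromise of the key.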
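For the multi-factor point, the “something they have” factor is commonly a one-time-code generator. Below is a minimal sketch of TOTP (RFC 6238, which builds on HOTP from RFC 4226) using only the Python standard library; a production system would use a maintained library and secure secret provisioning. The secret shown is the RFC’s published test key, not a real credential.

```python
import hashlib
import hmac
import struct
import time


def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per the RFC
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def totp(key: bytes, at_time=None, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238, SHA-1 variant)."""
    t = time.time() if at_time is None else at_time
    return hotp(key, int(t) // step, digits)


# RFC 6238 Appendix B test vector: SHA-1, 8 digits, T = 59 seconds.
print(totp(b"12345678901234567890", at_time=59, digits=8))  # → 94287082
```

Because the code changes every 30 seconds and is derived from a shared secret the user’s device holds, a stolen password alone is no longer enough to impersonate an authorized user.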
By deploying data protection technologies such as encryption and authentication, it is possible to secure what really matters without requiring constant inspection that can slow down a network and thus halt business.
Consider the continuous technological leapfrogging that occurs as security professionals implement perimeter-based network security and sophisticated hackers work to circumvent each new measure. It becomes clear network breaches are a matter of when, not if. Protecting the data is really the name of the game. Doing so ensures you have the security necessary to protect the business and the network performance required to operate at the speed of business.
That is something that both the network and security folks can sing Kumbaya over.
About the Author
Michael Rothschild works for the data security company, SafeNet. He holds advanced marketing degrees and certifications in IT security. He has been an EMT for 28 years, an ER nursing assistant for eight years and is an instructor for the American Heart Association.
Ben Linders May 28, 2015