
Uncovering the Encrypted Threats Flowing through your Data Centre 

Network visibility is a fundamental security objective in today’s environment. The ability to monitor suspicious traffic not only strengthens threat detection but also informs comprehensive security posture management, laying the groundwork for a robust Zero Trust strategy. 

The Dual Role of Encryption: Security and Threat 

It is no secret that data centres process an immense amount of traffic. As business data grows and organizations adopt increasingly hybrid cloud environments, monitoring all of this traffic for security risks becomes a real challenge. Malicious actors often hide in networks to exfiltrate sensitive data and maximize their impact, meaning data centre operators now need oversight of all East-West traffic for effective threat detection. Despite network visibility being a known security priority, Gigamon research found that less than half (48 percent) of organizations have insight into data moving laterally across their networks.

At the same time, security leaders have turned to encryption to hide information from prying eyes. Today, encrypted traffic makes up over 90 percent of all internet traffic, and the proportion of encrypted traffic within internal enterprise networks is growing fast. While encryption is a proven way to protect sensitive information, without robust decryption and analysis it also contributes to poor network visibility.

Cybercriminals are becoming increasingly adept at using our security methods against us, and encrypted traffic is a powerful tool in their playbook. The same confidentiality that protects data also hides malware, suspicious activity, and data exfiltration from our security teams. This doesn’t mean encryption is failing, but data centre security needs to step up to the challenge of decrypting traffic for better threat detection and response. 

The Reality of Encrypted Threats 

It might sound like a cliché, but the first step towards preventing encrypted threats is to acknowledge the risk they pose. It is impossible to defend against a threat you cannot see or measure within your own data centres.

Encryption is playing a larger role in attacks, and with great success. Over 90 percent of malware attacks now use encryption to evade detection, yet a Gigamon survey found that only 30 percent of IT and security professionals have visibility into encrypted traffic. Businesses already understand the danger of security blind spots – over half of respondents in the same survey named it their top concern – but practical challenges have so far held security teams back. The result is that this unanalyzed encrypted traffic feeds a threat landscape in which almost one in three successful breaches goes undetected.

In recent years, the rise of microservices and the growing adoption of containerized applications have exacerbated this trend by increasing internal traffic. Machine-to-machine and server-to-server traffic gives an attack the perfect means to proliferate and spread, so monitoring lateral movement has become far more important in the last few years.

Challenges with Traditional Decryption and the Promise of Precryption™ 
 
Unfortunately, traditional decryption is simply not a viable solution for many data centre operators. Decryption usually takes place at the perimeter: at the firewall, in a dedicated appliance, or on a load balancer. These methods require extensive configuration, key management, and vast amounts of compute power to break the encryption, inspect the contents, and then re-encrypt. When most traffic is encrypted, that overhead adds up quickly.
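To make that cost concrete, the Python sketch below is a minimal, single-exchange TLS inspection proxy of the kind a perimeter appliance performs: terminate the client's TLS session, inspect the plaintext, then re-encrypt towards the real destination. The certificate paths, upstream host, port, and inspection rule are placeholder assumptions for illustration, not any vendor's implementation.

```python
import socket
import ssl
import threading

# Placeholder certificate/key and upstream target, assumed for illustration only.
PROXY_CERT, PROXY_KEY = "proxy.crt", "proxy.key"
UPSTREAM_HOST, UPSTREAM_PORT = "app.internal.example", 8443

def inspect(payload: bytes) -> None:
    # Stand-in for the real inspection step (IDS, DLP, malware scanning).
    if b"PASSWORD" in payload.upper():
        print("suspicious content observed")

def handle(client_sock: socket.socket) -> None:
    # 1. Break: terminate the client's TLS session with the proxy's own certificate.
    server_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    server_ctx.load_cert_chain(PROXY_CERT, PROXY_KEY)
    with server_ctx.wrap_socket(client_sock, server_side=True) as client_tls:
        request = client_tls.recv(65535)
        inspect(request)  # 2. Inspect: the contents are now plaintext.

        # 3. Re-encrypt: open a fresh TLS session to the real destination.
        client_ctx = ssl.create_default_context()
        with socket.create_connection((UPSTREAM_HOST, UPSTREAM_PORT)) as raw:
            with client_ctx.wrap_socket(raw, server_hostname=UPSTREAM_HOST) as upstream_tls:
                upstream_tls.sendall(request)
                response = upstream_tls.recv(65535)
        inspect(response)
        client_tls.sendall(response)

def main() -> None:
    # Single request/response per connection, for brevity.
    listener = socket.create_server(("0.0.0.0", 9443))
    while True:
        conn, _addr = listener.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()
```

Even this toy version performs two full TLS handshakes and an extra encryption pass for every exchange, which is exactly the overhead that scales painfully once most traffic is encrypted.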

There are a few tricks to save resources in the decryption process. Businesses investing in decryption can take steps to minimize redundant traffic inspection and identify which network packets should be prioritized. Tactics such as application filtering, in which traffic signatures are used to distinguish between high- and low-risk applications, enable teams to apply risk-based decryption across their data centres. However, it is important to remember that these assessments are not foolproof.
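A minimal sketch of such a risk-based filter might look like the following. The port tiers and SNI suffixes are hypothetical stand-ins for "traffic signatures"; real application filtering draws on much richer fingerprints.

```python
# Hypothetical risk tiers; production application filtering would use richer
# signatures (TLS fingerprints, application IDs, threat intelligence) than
# destination ports and SNI suffixes alone.
HIGH_RISK_PORTS = {443, 8443}
LOW_RISK_SNI_SUFFIXES = (".updates.example.com", ".telemetry.example.net")

def should_decrypt(dst_port: int, sni: str | None) -> bool:
    """Decide whether a flow is worth the cost of full decryption."""
    if sni and sni.endswith(LOW_RISK_SNI_SUFFIXES):
        return False  # recognized low-risk application traffic: skip decryption
    return dst_port in HIGH_RISK_PORTS  # spend the decryption budget on the high-risk tier
```

Under this policy a flow to a known update mirror is forwarded untouched, while a flow to an unknown destination on port 443 is queued for decryption, which is why the article cautions that such assessments are not foolproof.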

Another similar, low-cost approach is to deploy data management strategies that reduce the traffic flowing across a data centre while retaining the visibility needed to assess risk. Data centre networks are structured with resiliency and availability in mind, but that design creates duplicate network packets across the network. Because decryption is costly, implementing deduplication ensures that each network packet is decrypted and analyzed only once.
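The principle fits in a few lines of Python. Hashing the full payload is a simplifying assumption for this sketch; production deduplication engines typically hash selected header fields plus payload and only look back over a short time window.

```python
import hashlib

def packet_key(payload: bytes) -> str:
    # A digest identifies duplicate copies produced by redundant network paths.
    return hashlib.sha256(payload).hexdigest()

def deduplicate(packets):
    """Yield each unique packet once, dropping duplicate copies."""
    seen = set()
    for pkt in packets:
        key = packet_key(pkt)
        if key in seen:
            continue  # duplicate copy: never reaches the costly decryption tool
        seen.add(key)
        yield pkt  # first copy: forward for decryption and analysis

# Example: three captured copies, two identical, yield two packets downstream.
captured = [b"GET /login", b"GET /login", b"POST /api/data"]
print(len(list(deduplicate(captured))))  # -> 2
```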

These approaches are a good first step to reduce the financial and compute drain of a more robust decryption effort, but the reality is that inspecting all traffic is the only way to stop encrypted threats. To meet this need, today’s fast-growing networks need a more efficient decryption solution that is low cost, low CPU, and simple. Precryption™ technology is the solution. 

How is Precryption different from conventional approaches? 

Technology is always evolving, and decryption is no different. It is now possible to rethink the challenge of encrypted traffic visibility, and Precryption resolves the issue by going directly to the source. 

Without a plausible, affordable decryption method, data centre operators and owners have largely accepted defeat when it comes to encrypted threats – or invested in heavyweight solutions that require large-scale changes to their networks and drain energy, space, and budget. But inside each network packet lies the key to a new approach. Precryption works by accessing the Linux kernel to gain plaintext traffic visibility before encryption is ever performed. In doing so, security teams use less compute power to analyze more traffic. By cutting the decryption and re-encryption stages out of the equation, it also allows security tools to detect threats quickly even as traffic volumes continue to grow.
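This article does not detail Precryption's internals, but the general principle (observing data at the point where an application hands it to the encryption library, before any ciphertext exists) can be illustrated with a small eBPF sketch. The example below uses the open-source bcc toolkit to attach a uprobe to OpenSSL's SSL_write and capture a sample of the still-plaintext buffer. It is a conceptual illustration of the technique, not Gigamon's implementation, and assumes a Linux host with bcc, a recent kernel, and libssl present.

```python
from bcc import BPF

# eBPF program: fires whenever a process calls SSL_write(), i.e. just before
# OpenSSL encrypts the buffer, so the data passed in is still plaintext.
PROG = r"""
#include <uapi/linux/ptrace.h>

int probe_ssl_write(struct pt_regs *ctx, void *ssl, const void *buf, int num) {
    char sample[64] = {0};
    // Copy a small sample of the caller's plaintext buffer for inspection.
    // (On kernels older than 5.5, bpf_probe_read would be used instead.)
    bpf_probe_read_user(&sample, sizeof(sample), buf);
    bpf_trace_printk("pre-encryption sample: %s\n", sample);
    return 0;
}
"""

b = BPF(text=PROG)
# "ssl" resolves to libssl; SSL_write is the call applications use to send data over TLS.
b.attach_uprobe(name="ssl", sym="SSL_write", fn_name="probe_ssl_write")
print("Tracing SSL_write... Ctrl-C to stop")
b.trace_print()  # stream plaintext samples captured before encryption, with no keys needed
```

Because the data is read before encryption, there are no keys to manage and no decrypt/re-encrypt cycle to pay for, which is the efficiency argument behind kernel-level visibility approaches.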

Until now, encrypted threats have been thriving in a perfect storm of low visibility, high traffic, and optimistic risk assessments. Now is the time to face up to this risk and pursue true traffic visibility. 
