Encrypting data traffic is a good thing for security, right? The answer is a qualified ‘yes’ – it’s true to a certain extent, but like many good things, how it is done can make all the difference. You want to avoid increasing security at the expense of your business.

For most data centres, SSL traffic is already estimated to average between 15% and 25% of total web traffic, with some vertical market segments seeing still higher volumes. Compliance regulations such as PCI-DSS and HIPAA require businesses to encrypt sensitive data (such as that travelling to banking, merchant or healthcare-related websites) in transit. Critical business applications like Microsoft Exchange, Salesforce.com and Dropbox are also heeding the privacy calls by enabling SSL, as are a range of hugely popular web destinations such as LinkedIn, Twitter, Facebook and others.

Encryption is good for protecting data, but it can equally shield would-be attackers. Increasingly sophisticated malware, along with subtle but vital indicators of potential attacks, is being hidden in SSL-encrypted traffic. Encrypting data streams makes security monitoring, application monitoring, security analytics and even resource planning more difficult for IT teams as they attempt to decrypt and inspect it all in search of anomalies.

What you can’t see…


So what may be stealthily lurking within SSL-encrypted data? There are two key risks that data centres need to be aware of: tangible threats, and more subtle threat indicators.

‘Threats’ generally means malware: malicious code that, like benign SSL traffic, is disguised by encryption. Threat indicators are signs that a malicious party is probing or scanning the network for vulnerabilities – evidence of potential hacks or network intrusion attempts.
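One common indicator of the scanning described above is a single source touching an unusually wide range of ports. The sketch below is a minimal, illustrative heuristic over hypothetical flow records; the IP addresses, record format and threshold are assumptions, not a real product's detection logic.

```python
from collections import defaultdict

# Hypothetical flow records: (source_ip, destination_port) pairs,
# drawn from traffic metadata or decrypted session logs.
flows = [
    ("10.0.0.5", 443), ("10.0.0.5", 443),          # normal client
    ("203.0.113.9", 22), ("203.0.113.9", 23),
    ("203.0.113.9", 80), ("203.0.113.9", 443),
    ("203.0.113.9", 3389), ("203.0.113.9", 8080),  # probing many ports
]

SCAN_THRESHOLD = 5  # distinct ports per source; tune for your network


def scan_suspects(flow_records, threshold=SCAN_THRESHOLD):
    """Flag sources contacting an unusually wide range of ports."""
    ports_by_src = defaultdict(set)
    for src, dport in flow_records:
        ports_by_src[src].add(dport)
    return {src for src, ports in ports_by_src.items()
            if len(ports) >= threshold}


print(scan_suspects(flows))  # prints {'203.0.113.9'}
```

Real deployments would feed this kind of heuristic from NetFlow, IPFIX or firewall logs rather than an in-memory list, and combine it with many other signals.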

Identifying and responding to both is a key part of an IT team’s mission, but growing SSL traffic makes these tasks more time-consuming and dependent on specialised analysis tools.

Performance matters

This, of course, is where a data centre’s next-generation firewalls and application monitoring tools come into play. Firewalls can decrypt and scan SSL data, inspecting it for malware and protecting against intrusion attempts according to security policies.

But these capabilities come with a caveat. As IT teams enable more functions on their firewalls, such as AV, anti-bot, IPS, URL filtering and application control, this suite of security tools consumes more and more computing time. The suite then risks becoming a bottleneck, acting as a brake on the network or requiring a complete security upgrade to deliver the needed performance.

Research by Enterprise Strategy Group (ESG) in 2015 found that 24% of businesses said their networking team was suspicious of technology that might disrupt critical traffic or damage performance. Yet the processing overheads inherent in using security gateways to decrypt SSL traffic, on top of their normal duties, will usually have a significant performance impact – particularly as both the business and data volumes grow. A 2013 test of seven next-generation firewalls by NSS Labs found that the devices suffered an average performance loss of 74% when handling traffic secured with basic 512-bit and 1024-bit keys.

State of inspection

So how can data centres eliminate the blind spots created by encryption and gain visibility of what might be lurking in that hidden traffic, without compromising overall network performance? First, they need full, unobscured access to all traffic across their environments. This is achieved with stateful SSL decryption, which extends security teams’ ability to look into encrypted traffic from both business and web applications and reveal hidden anomalies such as network reconnaissance attempts, intrusions or malware.

Stateful SSL decryption provides complete session information, helping IT teams understand each transaction as a whole and spot the potential start of an attack, in contrast to stateless decryption, which provides only the raw data packets.
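The difference matters because a malicious payload can straddle packet boundaries. The toy sketch below illustrates the principle with simulated packets and a made-up signature string; it is not how any real inspection engine is implemented, but it shows why per-session reassembly catches what per-packet inspection misses.

```python
# Simulated packets from two interleaved sessions. The marker "EVIL"
# is split across session-B's packets, so stateless (per-packet)
# matching misses it while stateful (per-session) reassembly finds it.
packets = [
    ("session-A", b"GET /index.html"),
    ("session-B", b"POST /upload EV"),
    ("session-A", b" HTTP/1.1"),
    ("session-B", b"IL-PAYLOAD"),
]

SIGNATURE = b"EVIL"  # illustrative signature, not a real IDS rule


def stateless_hits(pkts):
    """Inspect each packet in isolation."""
    return [sid for sid, payload in pkts if SIGNATURE in payload]


def stateful_hits(pkts):
    """Reassemble each session's byte stream, then inspect it whole."""
    streams = {}
    for sid, payload in pkts:
        streams[sid] = streams.get(sid, b"") + payload
    return [sid for sid, data in streams.items() if SIGNATURE in data]


print(stateless_hits(packets))  # [] -- signature split across packets
print(stateful_hits(packets))   # ['session-B']
```

Stateful decryption gives inspection tools exactly this kind of whole-session view, plus the handshake and timing context around it.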

Second, stateful SSL decryption should be done on a dedicated platform, such as a network packet broker, to offload the extra processing burden from firewalls, security gateways and application monitoring tools. Reducing their workload maximises the performance and capacity of these tools and of the applications they run, such as antivirus, sandboxes and IPS, enabling them to better identify and respond to targeted attacks. By inserting that network packet broker on a thoughtfully architected bypass foundation, network availability and reliability can be maximised. This unlocks the full potential of security and network architectures, while giving IT and security teams full visibility of all network traffic, both encrypted and unencrypted.
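The offloading idea above can be sketched as "decrypt once, inspect many times". The toy broker below is a hedged illustration only: the reversed-string "decryption" stands in for a real TLS termination step, and the tool names and checks are invented for the example.

```python
# Toy packet broker: decrypt each message exactly once, then fan the
# plaintext out to several inspection tools. In a real deployment the
# broker would terminate TLS; here a string reversal stands in for it.
def broker(ciphertexts, decrypt, tools):
    results = {name: [] for name in tools}
    for ct in ciphertexts:
        plaintext = decrypt(ct)  # single decryption per message
        for name, inspect in tools.items():
            results[name].append(inspect(plaintext))
    return results


# Stand-in "encryption"/"decryption": reverse the string.
decrypt = lambda ct: ct[::-1]
encrypted = [s[::-1] for s in ("hello", "attack!")]

# Illustrative tools: a signature check and a size monitor.
tools = {
    "ids": lambda p: "attack" in p,
    "monitor": lambda p: len(p),
}

results = broker(encrypted, decrypt, tools)
print(results)  # {'ids': [False, True], 'monitor': [5, 7]}
```

The design point is that each downstream tool receives plaintext without paying the decryption cost itself, which is what frees firewalls and monitoring tools to spend their cycles on inspection.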

SSL encryption plays a vital role in securing sensitive data against unlawful interception and hacking, enabling organisations to meet their compliance requirements. But it can also conceal security threats, and limit network teams’ ability to inspect, tune and optimise the performance of applications. It is vital that data centres protect their customers’ traffic. It is equally vital that they eliminate this encryption blind spot, and gain full visibility into what is truly happening in their networks and mission-critical applications. With the right architecture, it is possible to have both.

Jeff Harris is senior director, solutions marketing for security vendor Ixia.