Growth, digitisation and efficiency are the three primary objectives of organisations around the world, according to Gartner. ‘Cloudification’, accelerated by the pandemic and encouraged in particular by European governments’ drive to speed up the digital transition, helps address these challenges.

Cloud computing gives businesses the freedom to work, innovate, and collaborate anywhere, at any time, through agile, secure platforms, on clouds of their choice (public, private, and hybrid). By 2025, cloud-native platforms will host more than 95 percent of new digital initiatives – up from less than 40 percent in 2021, according to Gartner forecasts.

Organizations are currently turning to multi-cloud and hybrid cloud, and adopting related innovations such as containers and microservices to improve workload portability across clouds. New models, such as DCIaaS (Dedicated Cloud Infrastructure as a Service), are also being developed. According to IDC, this segment will be growing at more than 150 percent per year by 2025, making it possible to enjoy the benefits of public cloud both on-premises and in colocation environments.

In parallel with a brand-new French law intended to reduce the environmental footprint of digital technology, whose objective is to reconcile digital growth with ecological principles, all these factors combine to shape a new generation of data centers: more environmentally friendly, secure and flexible, and well suited to the post-Covid world.

1) Post-Covid data centers will be more flexible and on the edge

Regional data centers have continued their rapid growth, further legitimised by the health crisis and the decisive advantages of a more widely distributed IT infrastructure. This trend will accelerate in the coming years to support the post-Covid digital economy (teleworking, video streaming, the growth of e-commerce, IoT, 5G and so on).

While availability remains the top priority, these edge data centers are also more agile and energy-efficient, and they offer lower latency. Their highly modular design is a major asset in terms of energy performance.

In addition, because they are deployed in a distributed fashion, more and more of them will guarantee high availability through distributed resilience mechanisms. Their design is lighter (built to Uptime Institute Tier II standards), and they will be able to shift workloads according to the source of the energy being consumed, giving businesses rapid backup options while reducing their environmental impact.
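As a simple illustration of this kind of energy-aware scheduling, the sketch below places a batch workload on whichever site currently offers the cleanest energy mix and still has spare capacity. The site names and carbon-intensity figures are hypothetical; a real scheduler would pull live grid or on-site generation data.

```python
# Minimal sketch of carbon-aware workload placement across distributed edge
# sites. Site names and carbon-intensity figures (gCO2eq/kWh) are hypothetical;
# a real scheduler would query live grid or on-site generation data.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    carbon_intensity: float   # gCO2eq per kWh of the local energy mix
    free_capacity_kw: float   # headroom available for extra workloads

def pick_site(sites: list[Site], required_kw: float) -> Site:
    """Choose the eligible site with the cleanest energy at this moment."""
    eligible = [s for s in sites if s.free_capacity_kw >= required_kw]
    if not eligible:
        raise RuntimeError("no site has enough spare capacity")
    return min(eligible, key=lambda s: s.carbon_intensity)

sites = [
    Site("edge-paris", carbon_intensity=55.0, free_capacity_kw=120.0),
    Site("edge-frankfurt", carbon_intensity=380.0, free_capacity_kw=300.0),
    Site("edge-oslo", carbon_intensity=25.0, free_capacity_kw=40.0),
]

# edge-oslo is cleanest but lacks capacity, so the job lands on edge-paris.
print(pick_site(sites, required_kw=80.0).name)
```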

2) Will ensure better protection against attacks

According to estimates, the volume of data generated worldwide will exceed 180 zettabytes by 2025. That amounts to an average annual growth rate of nearly 40 percent over five years [1].

This staggering mass of data can be an invaluable source of critical information for organizations, supporting their strategic decision-making.

While the data center is the ‘digital safe’ of this new gold rush, security issues still need to be addressed through tighter controls over both logical and physical security, for example building access control based on drones and increasingly advanced facial recognition.

In addition, ‘zero trust’ architectures will be increasingly adopted, preventing any user or device from connecting to the network without explicit permission. For this model to work, organizations must micro-segment all their resources, enforce strict access controls, and inspect and log all network traffic, taking endpoints, workloads and data into account.
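As an illustration of the principle, the sketch below models a zero-trust decision point that evaluates every request against user identity, device posture and the micro-segment it targets, and logs each decision. The attribute names and policy rules are hypothetical and do not refer to any specific product.

```python
# Simplified model of a zero-trust decision point: every request is evaluated
# against identity, device posture and context, and the decision is logged.
# Attribute names and policy rules are hypothetical, for illustration only.
from dataclasses import dataclass
import logging

logging.basicConfig(level=logging.INFO)

@dataclass
class AccessRequest:
    user: str
    user_verified: bool        # e.g. MFA completed
    device_compliant: bool     # e.g. patched, disk encrypted
    source_segment: str        # micro-segment the request comes from
    target_segment: str        # micro-segment being accessed

# Explicit allow-list between micro-segments: anything not listed is denied.
ALLOWED_FLOWS = {("web-tier", "app-tier"), ("app-tier", "db-tier")}

def authorize(req: AccessRequest) -> bool:
    """Deny by default; allow only verified users on compliant devices
    following an explicitly permitted segment-to-segment flow."""
    allowed = (
        req.user_verified
        and req.device_compliant
        and (req.source_segment, req.target_segment) in ALLOWED_FLOWS
    )
    # Zero trust also means recording every decision for later inspection.
    logging.info("user=%s %s->%s allowed=%s",
                 req.user, req.source_segment, req.target_segment, allowed)
    return allowed

print(authorize(AccessRequest("alice", True, True, "web-tier", "app-tier")))   # True
print(authorize(AccessRequest("bob", True, False, "web-tier", "db-tier")))     # False
```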

Furthermore, to reinforce the security of data entrusted to these infrastructures, most businesses will opt for multi-cloud architectures spanning providers such as Microsoft Azure or Amazon Web Services. Businesses will also increasingly rely on advanced encryption technologies, such as homomorphic encryption, which allows computations to be performed directly on data without having to decrypt it.
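To make the idea of computing on encrypted data concrete, here is a minimal, deliberately insecure toy implementation of a Paillier-style additively homomorphic scheme (tiny fixed keys, no hardening). It is a sketch of the principle only, not the technology a business would actually deploy.

```python
# Toy Paillier-style additive homomorphic encryption (illustration only:
# tiny fixed keys, no hardening - never use in production). Python 3.8+.
from math import gcd
import random

def lcm(a, b):
    return a * b // gcd(a, b)

def keygen(p=10007, q=10009):           # small fixed primes for the demo
    n = p * q
    n2 = n * n
    lam = lcm(p - 1, q - 1)
    g = n + 1                           # standard simplification for g
    mu = pow(lam, -1, n)                # modular inverse of lambda mod n
    return (n, n2, g), (lam, mu, n, n2)

def encrypt(pub, m):
    n, n2, g = pub
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n, n2 = priv
    L = (pow(c, lam, n2) - 1) // n      # L(x) = (x - 1) / n
    return (L * mu) % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 1200), encrypt(pub, 345)
c_sum = (c1 * c2) % pub[1]              # multiplying ciphertexts adds plaintexts
print(decrypt(priv, c_sum))             # 1545, computed without decrypting c1 or c2
```

Production systems would use vetted libraries and far larger keys, but the mechanism is the same: arithmetic performed on ciphertexts carries through to the underlying plaintexts.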

3) Will respond to the challenges of high density

Fierce competition between Intel and AMD to deliver ever-better CPUs has given data centers a welcome boost in terms of innovation and computing power to meet intensive computational needs, particularly those of AI.

However, this high density raises a major issue: by hosting increasingly dense infrastructure in a confined space, data centers face the challenge of efficiently dissipating the heat produced. This is why, in the coming years, we should see traditional air cooling on low-density racks gradually deployed alongside liquid cooling on high-density racks.
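A back-of-the-envelope estimate based on the sensible-heat relation Q = ṁ · cp · ΔT shows why air struggles at high density. The rack power and temperature rises below are assumed purely for illustration.

```python
# Back-of-the-envelope comparison of air vs liquid cooling for one rack.
# Assumed values (rack power, temperature rises) are illustrative only.
RACK_POWER_W = 40_000          # a dense AI/HPC rack, assumed

# Air: cp ~1005 J/(kg.K), density ~1.2 kg/m^3, allowed temperature rise ~12 K
air_mass_flow = RACK_POWER_W / (1005 * 12)            # kg/s
air_volume_flow = air_mass_flow / 1.2 * 3600          # m^3/h

# Water: cp ~4186 J/(kg.K), density ~1000 kg/m^3, temperature rise ~10 K
water_mass_flow = RACK_POWER_W / (4186 * 10)          # kg/s
water_volume_flow = water_mass_flow / 1000 * 3600     # m^3/h

print(f"Air needed:   ~{air_volume_flow:,.0f} m^3/h")  # roughly 10,000 m^3/h
print(f"Water needed: ~{water_volume_flow:.1f} m^3/h") # roughly 3.4 m^3/h
```

Moving roughly 10,000 cubic metres of air per hour through a single rack is impractical, whereas a few cubic metres of water per hour is easily piped, which is what makes liquid cooling attractive on high-density racks.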

4) Will make greater use of digital tools

The Covid-19 pandemic, combined with a steady flow of innovations in software automation and artificial intelligence, has increased the pressure to accelerate the development of data centers while making their operations less dependent on human intervention. The adoption of increasingly streamlined construction models, such as building information modelling (BIM), will help speed up each stage of a project and tailor designs to demand, while factoring in security.

By making it possible to run computer simulations and tests upstream (electricity, access control, evacuation scenarios, heating costs and so on), the ‘digital twin’ concept will help avoid potential errors before physical construction begins.
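As a deliberately simplified sketch of this kind of upstream simulation (not a real BIM or digital-twin toolchain), the example below compares the estimated annual cooling energy of two hypothetical design options before anything is built; the load profile and coefficients of performance are assumptions.

```python
# Minimal sketch of a "digital twin"-style upstream simulation: estimate
# annual cooling energy for a hypothetical facility under two design options.
# All numbers (IT load profile, coefficients of performance) are assumptions.
import math

HOURS = 8760
IT_LOAD_KW = 1_000                       # average IT load, assumed

def annual_cooling_kwh(cop: float) -> float:
    """Electricity the cooling plant draws to remove the IT heat, given its COP."""
    total = 0.0
    for h in range(HOURS):
        # crude daily load curve: +/-20% swing around the average
        load = IT_LOAD_KW * (1 + 0.2 * math.sin(2 * math.pi * (h % 24) / 24))
        total += load / cop              # kWh of electricity for 1 h of cooling
    return total

baseline = annual_cooling_kwh(cop=3.0)   # conventional chilled-water design
improved = annual_cooling_kwh(cop=5.0)   # design with more free cooling
print(f"baseline: {baseline:,.0f} kWh/year, improved: {improved:,.0f} kWh/year")
print(f"estimated saving: {baseline - improved:,.0f} kWh/year")
```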

In addition, at a time when buildings need to improve their energy efficiency, this is a useful step towards optimising predictive maintenance. The remaining challenge is to ensure that the modelling tools used from design through to maintenance follow the same standards and can communicate with one another.

5) And will standardise the calculation of their environmental footprint

The data center industry is keen to ramp up the development and adoption of sustainable practices in order to limit its impact on climate change. This environmental responsibility is reinforced by pressure from market regulators, investors and customers, and it drives continuous innovation, as epitomised by liquid cooling, the use of renewable energy and even underwater data centers.

The sustainability of data centers has become a priority, and measuring it a constant requirement: facilities must be optimised from design through to operation and across the entire life cycle, involving all stakeholders in the ecosystem.

However, measuring environmental impact is not enough: everyone will need to use the same assessment criteria so that meaningful comparisons can be made between facilities and operators.

[1] Sources: IDC, Seagate, Statista