Physical security for data centers is table stakes these days; operators are well-versed in how to secure their facilities. But as we move to the Edge and away from the idea of having people constantly on-site, some of these traditional approaches start to become more difficult.

While many companies say they offer the Edge, they are often talking about the ‘regional Edge’: a traditional data center of reasonable size that is manned 24/7. Securing the regional Edge is easy; these are manned facilities that have the same security controls as your ‘traditional’ data center and benefit from on-site security personnel if there is an incident.

But some companies are looking to do Edge at the tower and other unmanned sites, sometimes called the telco Edge or far Edge. Securing the telco Edge is a more challenging proposition. These sites are smaller – offering fewer opportunities to install controls – and they are unmanned. The lack of on-site personnel means site monitoring becomes much more important, as do security controls that can hold up over a much longer response time.

Edge security: not a today problem?

The ‘lights-out’ unmanned data center has long been discussed, but there have been few real-world deployments, and none at any scale. Likewise, there aren’t huge numbers of telco or far Edge data centers around the world. But both are coming.

Several tower companies are launching telco Edge facilities, and as 5G and Internet of Things (IoT) & smart city deployments ramp up, the need for more telco/far Edge sites will increase.

– American Tower

American Tower currently operates a number of small Edge colocation sites at tower locations in Pittsburgh, Pennsylvania; Jacksonville, Florida; Atlanta, Georgia; Austin, Texas; and Denver and Boulder, Colorado. In July 2023 it filed to develop a new facility at a telecoms tower site in San Antonio, Texas.

American Tower has said it has identified more than 1,000 sites that could support 1MW Edge data center locations. 2023 also saw Qualcomm announce a strategic collaboration with the tower company, saying it would be installing a 2U Arm-based server at an American Tower Edge data center in Denver as part of a ‘new class of scalable computing resources at the near Edge’.

Another tower firm, SBA, recently launched an Edge data center at a tower site in Arlington, Texas. The company has said it has “between 40 and 50” sites in development. This year Vapor IO has deployed AWS Outposts hardware at a Cellnex tower site in Barcelona, Spain. Vapor said Cellnex has made a 'portfolio of small, carrier-neutral Edge data centers and tower ground space' available for hosting Vapor IO’s Kinetic Grid and AWS Outposts.

DigitalBridge-owned Vertical Bridge – a tower company with the same owners as DataBank – also offers micro data center construction services at its tower sites.

July 2023 saw US digital infrastructure firm Ubiquity acquire EdgePresence – a company that places data center pods at telecoms sites and previously counted DataBank amongst its investors.

The Edge becomes more sensitive

“In the States, they want Edge because of distance. I think in the UK, we're going to need Edge because of capacity,” says Stuart Priest, founder and managing director of SonicEdge, a UK-based provider of modular prefabricated data centers founded in 2020.

“Proper Edge sites are currently so few and far between that I can probably count them on my fingers and toes. But I think the landscape in 10 years is going to be completely different; we're predicted to have a million IoT devices a year coming online in the UK.”

He continues: “There's enough bandwidth at the moment for everything that's going on around the Edge. Until we start getting a lot more IoT devices on the network – smart homes, sensors on street lamps and traffic lights, smart CCTV, etc – I don't think there's going to be a need to use those sites.”

Telecoms sites are rarely attacked – and usually any incidents are by anti-5G conspiracy theorists or vandals rather than thieves – as there’s generally little of value to steal that can be easily sold. That means security at some of the more remote or less critical sites could be as simple as a fence, a camera, and a lock. However, the prospect of placing expensive compute hardware into these sites means their value – and the cost of damage or downtime – increases substantially.

“When we get to a point where compute storage services will be going at those sites, we'll have to ramp the security up. There's absolutely no point building a cheap pod and putting a Rolls Royce inside. There's just no point in doing that, and customers understand that. But the costs will significantly increase for those sites.”

– Secure I.T. Environments

Securing the Edge

Tower sites and their equipment sheds are commonplace and already carry security measures, and many of the security controls applicable to existing modular deployments are just as relevant for unmanned Edge facilities.

Anti-climb fences are standard and come in various shapes and sizes, often with barbed wire or a similar deterrent topper. Some sites may have underground pressure sensors to detect when someone is walking somewhere unexpected – for example, on the inside of a perimeter fence.

Panels and doors carry SR (security rating) certifications – ranging from SR-1, where someone could break through with something like a screwdriver, to SR-8, where the facility could withstand small arms fire and a nearby blast. Priest says customers often ask for modules rated SR-3 to SR-4 – meaning it would take someone with power tools up to 30 minutes to break through a wall or door, giving operators and/or law enforcement time to get on-site.

“Whether it's a telco site or an Edge site, if there's no one in the vicinity, then obviously the key thing is to stop people getting in and, if they do get in, stop them getting into the pod so they've got enough time to call the police. Most companies want at least a 30-minute window.”

At a deployment in Aston, a suburb of Birmingham in the UK, SonicEdge deployed a module surrounded by an eight-foot anti-climb fence, SR-4-rated walls, and infrared CCTV, with trembler units on the fences and doors to detect movement.

Priest says SonicEdge paints many of its customers’ modules a street furniture color – described as “a dull sort of green or grey” – to make them less interesting to prospective thieves. Nick Ewing, MD of UK data center module provider EfficiencyIT, notes his company will paint modules various colors to customer specification in order to better blend in with the surroundings.

“With the Edge you don't want it to stand out; these environments have got to blend into their surroundings,” he says.

Once past the perimeter fence, access to the data centers themselves will be protected by a combination of biometrics such as fingerprint scanning, key cards, PIN codes, and traditional key locks. Like doors and walls, locks can be rated depending on how difficult they are to compromise. In the UK, the Centre for the Protection of National Infrastructure (CPNI) has its own grading of lock types.

Once through the outer door, some data center designs will have a vestibule or anteroom with a secondary security door to further secure the data room – and extend the amount of time operators have to respond to intruders. And even then, racks inside the data halls can be locked down.

“We've got customers where we lock down the racks completely, and you've got to have a certain level of classification to even be allowed in a rack,” says Ewing.

Priest says in the three-plus years SonicEdge has been going, no two deployments have been the same.

“A lot of these companies, they want to use the same security across their estate. They don't want their engineers having to learn five or six different systems,” he notes.

Much of the company's discussion around module deployments involves asking customers what security systems they already use, so that engineers can hit the ground running without additional training or new systems to monitor.

“There isn’t a ‘one size fits all’ approach to these Edge buildouts,” says Elliott Turek, director of category management, Europe, at Schneider Electric.

“Many providers want repeatable solutions to take advantage of the economies of scale for CapEx purposes and predictable servicing on the OpEx side, but ultimately there are constraints when it comes to practical action. These sites can be just about anywhere – from busy metropolitan centers to a wooded forest area miles from the nearest highway.”

– Vapor IO

Automation & meshing key to Edge resiliency

Currently, with so few unmanned Edge deployments, there isn't the same expectation of uptime as at traditional data centers, and operational approaches aren't really set up for a massive distributed footprint.

“I think at the moment, because it hasn't taken off so much, the view is if it falls over, when we go and fix it, it will then be working,” says Priest. “The SLAs aren't, for example, four hours to get it back up and running, because they are so few and far between.”

However, in the future, if and when the Edge takes off and we do have a huge number of sites processing and transmitting large amounts of IoT data, there will be far more reliance on interoperability between sites to ensure resiliency in the face of security incidents or system failures.

“Some operators can afford N-level redundancy,” says Schneider’s Turek. “But, in the context of ‘lights out’ Edge sites, it’s a risk that outweighs any cost savings initially realized.” Priest adds: “There might be someone out there who's going to find £10 billion to build loads of Tier III Edge data centers all around the UK but I just can't see it myself.”

“The consensus at the moment, speaking to my peers, is in terms of SLAs the actual running of the data center will be probably a Tier I,” he adds. “Because there's going to be so many of them that if one falls over, it's just like a WiFi network. The next one will pick it up until it's brought back online.”

“But the security won't be Tier I, because the last thing the customers putting their information in those hubs want is for someone to get in there and take it. If we get to the point where there's £100,000 worth of compute sitting in remote data centers, then they will be like Fort Knox.”

Software and automation become increasingly important in large Edge deployments. While operators can currently manage the comings and goings and operations of a handful of remote sites manually with enough planning, this could quickly become overwhelming if the footprint grows. Likewise, ensuring backup and failover systems are automated will be critical to ensure availability of services.

“In a distributed environment, you are always working on the basis that one of your nodes is going to fail, and your network and infrastructure will be built in a way that you have that failover capability,” says Ewing.

Predictive analytics can take away some of the pain of component or system failure – potentially flagging incidents before they happen and giving operators enough notice to prepare for or prevent potential problems.

“The beauty with the software that we have available to us now is that you can almost predict a lot of these things happening,” says Ewing. “You've got cloud-based applications where you can get predictive analytics on whether a UPS is going to fail this year or you might want to change a battery out.”
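The kind of battery-health prediction Ewing describes can be approximated with nothing more than a trend line. A minimal sketch in Python – the readings and replacement threshold here are invented for illustration, not taken from any vendor's tool:

```python
from datetime import date, timedelta

# Hypothetical quarterly UPS battery capacity readings (% of rated capacity).
readings = [(date(2023, 1, 1), 98.0), (date(2023, 4, 1), 95.5),
            (date(2023, 7, 1), 93.2), (date(2023, 10, 1), 90.8)]

THRESHOLD = 80.0  # replace the battery before capacity degrades past this

def days_until_threshold(readings, threshold):
    """Least-squares linear fit of capacity vs. time, extrapolated forward.

    Returns days after the first reading when capacity is projected to hit
    the threshold, or None if no downward trend is detected.
    """
    t0 = readings[0][0]
    xs = [(d - t0).days for d, _ in readings]
    ys = [c for _, c in readings]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    if slope >= 0:
        return None  # battery is not degrading on this data
    return (threshold - intercept) / slope

days = days_until_threshold(readings, THRESHOLD)
if days is not None:
    # Projected replacement date, so a site visit can be scheduled in advance.
    print(readings[0][0] + timedelta(days=round(days)))
```

Real cloud monitoring platforms use richer models, but the operational payoff is the same: a maintenance visit planned weeks ahead rather than a truck roll after an outage.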

Ewing also notes that pre-programmed access with a temporary key card can give customers or third-party contractors access at specific dates and times; pre-programming this becomes all the more important when there are multiple sites to manage.
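At its core, that pre-programmed access reduces to a time-window check against the badge's credentials. A minimal sketch – the class, site identifiers, and field names are illustrative, not from any real access-control product:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TemporaryBadge:
    """A pre-programmed credential valid for one site in one time window."""
    holder: str
    site_id: str           # which Edge site the badge is valid for
    valid_from: datetime   # start of the pre-programmed window
    valid_until: datetime  # end of the window

    def permits(self, site_id: str, at: datetime) -> bool:
        """Grant entry only at the right site, inside the window."""
        return site_id == self.site_id and self.valid_from <= at <= self.valid_until

badge = TemporaryBadge(
    holder="contractor-042",
    site_id="site-arlington-01",
    valid_from=datetime(2023, 9, 14, 8, 0),
    valid_until=datetime(2023, 9, 14, 17, 0),
)

badge.permits("site-arlington-01", datetime(2023, 9, 14, 9, 30))  # True: in window
badge.permits("site-arlington-01", datetime(2023, 9, 15, 9, 30))  # False: day after
```

Because the window is encoded in the credential itself, the same badge definition can be pushed to every site an operator manages, which is what makes the approach scale across a large unmanned footprint.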

– EfficiencyIT | ModularDC

Does Edge security need more redundancy?

The data center industry loves redundancy; millions of dollars are spent making cooling, power, and fiber systems redundant to ensure uptime. This is also true at the Edge; Priest noted a UK telecoms customer was looking to deploy remote sites in the near future with on-site solar and 48 hours of fuel for backup generators to ensure uptime if the grid fails.

But while customers do care about security and will pay what is required to secure sites to the desired degree, both Priest and Ewing concede that little consideration is often given to adding ‘N+X’ redundancy to security systems.

“In terms of backup, we don't just have one camera on one site, we have multiple cameras. And then we also use another camera on a stalk, which overlooks the whole facility,” says Priest. “So there are a number of redundancies in there to be able to manage that so that it covers multiple angles.”

However, while the cameras may overlap to an extent, if the control systems for access control are hosted on-site in the data center itself, they often don’t have redundant counterparts.

“Sometimes these do become single points of failure,” says EfficiencyIT’s Ewing. “You generally have one head end rather than two for example.”

He notes that such systems are also generally designed to fail open: if power is lost to a biometric system, for example, it would generally be set to fail open on health and safety grounds. Instead of traditional redundancy, security is often more about layers – so-called defense in depth.

“In terms of access control, you've got biometric, you've got keypad, you've got some retina [scanning] these days, and there's also a fallback, which is a key system,” says Priest. “Depending on which way they go, there are always basically two ways of entering; you'll have, for example, a keypad and a lock, or a biometric and a lock.”

Ultimately, the resiliency of the facility will have to be dictated by the customers and the workloads in the facility.

“Is it life critical, is it business critical, is it mission critical?” says Ewing. “[If it's serving a] local hospital, I want them to be redundant. I want them to have fail-safes in place so that the work they're doing is supported and is uninterrupted.”