Today, with a veritable smorgasbord of applications and websites vying for our attention, it is little wonder we have short attention spans.

“If a user doesn’t find something to watch on a streaming service in around 30 to 60 seconds, they will turn it off and go and do something else,” says Frank Scalzo, general manager, network strategy and services at Iron Mountain Data Centers.

Streaming giant Netflix recently announced a loss of more than 200,000 subscribers in the first three months of 2022 and expects to shed another two million this quarter. And although there is speculation as to the cause, this need for speed cannot be discounted.

These demanding expectations aren’t unique to streaming services. According to LoadStorm and Econsultancy, one in four users will abandon a website if it takes more than four seconds to load. Sixty-four percent of shoppers dissatisfied with a web page would shop elsewhere next time, and a delay of as little as one second can reduce customer satisfaction by up to 16 percent.

It’s becoming increasingly important to reduce that latency and get people the services they want, faster. And if you don’t, someone else will.

What’s driving the Edge?

There are a number of markets that traditionally would have been around 20-30 milliseconds away from their connection point. Ten years ago, this might have been viable, but as users become increasingly reliant on online services, this is no longer the case.

Our everyday interaction with the internet is a world away from what it was a decade ago. Not only is traditional internet growth driving the need to get content closer to users via the Edge, but the proliferation of the Internet of Things (IoT) and its connected devices has further upped the ante, completely transforming the digital landscape.

IoT encompasses devices that were not previously connected to the internet but now are. From smart thermostats and other devices in our homes, to automated vehicles, to the SCADA systems used in smart manufacturing, IoT devices are everywhere, and they are generating tremendous amounts of data.

“If you look at what a Tesla car generates a day in terms of data, it’s well over a terabyte,” says Scalzo, providing some perspective.

This massive amount of data is not only critical to the functionality of the vehicle itself, but is also being used alongside other data sets to train the AI and machine learning models that will deliver self-driving technologies in the future.

For Tesla and the many other companies collecting vast swathes of critical data, it is incredibly important to be able to get that data from the IoT device in question back to somewhere local for analysis, figure out what’s important, triage it, and then send it back out to a major connection point.

The Edge ecosystem

In the ‘old world’, as Scalzo describes it, we used to think of the data center very much in terms of north-south connectivity: northbound being internet connectivity providing services to the public, and southbound being internal communications for management, data replication, API calls, and the like, carried over VPNs, dedicated circuits, or MPLS-based solutions.

But in today’s world, it’s much more complex than that. A typical transaction has a lot more moving parts: an application may be talking to AWS, for example, or to a partner system for payments.

A great illustration of this need for multi-directional connectivity lies within the gaming industry. Long gone are the days when you’d purchase a disc, physically put it into a console or PC, and play with very few added extras.

“With online gaming, latency matters, and it comes with incredible bandwidth needs,” says Scalzo. “Games aren’t just games, they’re also transaction platforms. They’re buying and selling things to their users, so they need to be able to process credit card payments.”

And it’s not just a case of reducing latency to improve the in-game experience. Gaming platforms also display ads, featuring incredibly complex ad tech environments.

“There’s a reason why if you search for something, 20 seconds later you’re seeing ads all over your social news feeds,” explains Scalzo. “That technology figuring out what ad to display to what buyer when, is incredibly complex, so reducing the latency on that is critical to make sure they’re getting the right ad in front of the right person.”

And competition is fierce. Often, it’s not even about the product, but a question of who pops up the ad recommendation first. “Sometimes it’ll be two or three companies competing on the back side and whoever has the first recommendation is going to get to show the ad,” says Scalzo.

So, it’s safe to say connectivity is veering well off the beaten track, with far more traffic now spanning east and west. No longer are we building self-sufficient internet services, but internet services with complex interdependencies on various partners and providers in order to function.

This interconnectivity is no longer a nice-to-have; it’s a necessity. And it’s where the real value of an Edge ecosystem lies: moving the points where we connect east and west out of the major connection hubs and closer to where consumers are.

And moving away from these major connection points doesn’t only make sense from an interconnectivity perspective, but from the standpoint of resiliency and reliability.

In any major data center market, the prospect of a regional power outage is the stuff of nightmares. And although uncommon, the failure of a major regional power grid is not beyond the realms of possibility.

“The more connectivity we can decentralize out of the major strategic internet connection points, the better off the internet is from a resiliency perspective,” says Scalzo. “The internet is incredibly resilient, and it’d be up and functional, there’d just be so much traffic pushed onto unusual pathways that they’d be congested beyond usability.”

Thus far, nobody has experienced such an event with the internet at the scale it is today, and that’s the way we want it to stay. One of the ways we can keep it that way is via a renewed focus on decentralized control.

If we can push that control and interconnection out to more markets, we become less dependent on those central control points and stand in better stead should disaster strike.

Automating the infrastructure

Having run networks for the last 25 years, Scalzo remembers a time (not so long ago) when the predominant way of configuring routers was to log in and manually interact with a command-line interface.

This human-driven method, he says, is as time-consuming as it is error-prone, and it creates a very niche labor market that’s difficult to fill. This is why a lot of infrastructure is turning to orchestration and automation, something the use of Application Programming Interfaces (APIs) can help to achieve.

“APIs are incredibly strategically important,” says Scalzo. “APIs enable us to have software sit in the background and manage the infrastructure. From software we get consistency, and we get speed. When humans are involved, you limit the hours you can make changes, you’re limited in how many you can make per day, and you have an error rate associated with it.”
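
To make the contrast concrete, below is a minimal, hypothetical sketch of what API-driven management can look like. The controller URL, token, and payload schema are illustrative assumptions, not any specific vendor’s API; the point is simply that a scripted change is consistent and repeatable in a way a manual CLI session is not.

```python
# Minimal sketch of API-driven network configuration.
# The controller endpoint, token, and payload schema below are hypothetical.
import requests

CONTROLLER = "https://nms.example.net/api/v1"   # hypothetical management controller
TOKEN = "REPLACE_WITH_API_TOKEN"                # in practice, pulled from a secrets store


def set_interface_description(device: str, interface: str, description: str) -> None:
    """Push one small, repeatable change instead of typing it into a CLI session."""
    resp = requests.patch(
        f"{CONTROLLER}/devices/{device}/interfaces/{interface}",
        json={"description": description},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()  # fail loudly so the orchestration layer can retry or alert


if __name__ == "__main__":
    # The same change applied across a fleet runs in seconds, with no per-device
    # typing and no transcription errors - the consistency and speed Scalzo describes.
    for device in ["edge-rtr-01", "edge-rtr-02", "edge-rtr-03"]:
        set_interface_description(device, "ge-0/0/1", "uplink to IX peer")
```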

In light of this, over the last couple of years there has been a significant shift towards software-defined networking, enabling operators to dynamically manage data flows and network capacity and, ultimately, adapt to user demand in real time.

This kind of elasticity becomes an incredibly powerful tool because you can pay for what you’re using when you need it. Automation also helps eliminate these incredibly siloed labor markets, which do nothing to help alleviate the skills shortage the data center industry is currently experiencing.

“Anything we can do to automate helps alleviate the fact that we need more people than we can hire right now. There’s definitely more demand than there is labor and I don’t see that easing up anytime soon,” says Scalzo. But rather than looking at automation from a job loss perspective, Scalzo says we should see it as an opportunity to scale efficiently within the constraints of what we have.

And it’s no secret this is an industry seriously lacking in diversity. The ability to decentralize the workforce also means operators can hire people from different markets, with different stories and different life perspectives.

“Iron Mountain has a very healthy and robust diversity, equality and inclusion (DEI) program,” says Scalzo. “We focus on bringing those that might traditionally be under-represented into jobs.”

This not only helps provide economic advantages to operators, as labor rates vary in different parts of the world, but helps prevent the baked-in bias associated with continually hiring from the same pool.

The Edge for life?

Ultimately, when we enable the Edge, we enable a lifestyle, and in some respects, life itself. Operating at the Edge satisfies our insatiable need for speed, affording us the applications and services we’ve grown accustomed to and even dependent on in our everyday lives.

The Edge not only helps us to progress in the present, but has the power to shape our future, whether that be via developments in AI, the workforce, or helping to secure the resiliency of the internet, a utility that is now deemed a human right.

As the world continues its digital transformation from paper to data center to Edge, companies like Iron Mountain are there not only as a vendor, but a trusted partner, helping businesses navigate their way through a continually evolving and often overwhelming digital landscape.