Input latency and visual lag can have an outsized negative impact on the gamer’s experience, especially for new cloud gaming platforms whose servers reside in centralized cloud data centers hundreds of miles or more from the end user.
Whereas centralized cloud service providers require companies to keep content in a single location, Edge computing distributes application processes at the edge of the network, as near to the user as possible. The Edge enables new cloud gaming platforms to eliminate the need for dedicated devices, such as a console or high-end personal computer, while helping solve the latency involved in transferring data from the cloud to the user and in rendering graphically intensive video.
Many online gaming experiences, especially games with multiplayer functionality, are highly interactive. Multiplayer gaming demands real-time response rates, which the cloud, despite its many advantages, cannot provide on its own because of the geographic location of cloud data centers. As a distributed computing topology, Edge computing ensures users across the globe can enjoy a high-quality gaming experience with minimal latency regardless of where they are playing.
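The geographic constraint is easy to quantify: a signal in optical fiber travels at roughly 200,000 km/s, so distance alone sets a hard floor on round-trip time before any routing, queuing, or processing delay is added. A minimal back-of-envelope sketch, where the distances and fiber speed are illustrative assumptions rather than measurements:

```python
# Back-of-envelope round-trip propagation delay over optical fiber.
# Assumes a signal speed of ~200,000 km/s (about two-thirds of c);
# real paths add routing detours, queuing, and processing delay on top.

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s, expressed per millisecond

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time for a given one-way distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

for label, km in [("Edge site, 50 km", 50),
                  ("Regional data center, 500 km", 500),
                  ("Distant cloud region, 2,000 km", 2000)]:
    print(f"{label}: >= {min_rtt_ms(km):.1f} ms round trip")
```

Even in this idealized model, a data center 2,000 km away costs at least 20 ms of round trip per input, while an Edge site 50 km away costs a fraction of a millisecond, which is the gap the Edge is meant to close.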
The combination of cloud and Edge computing also creates a more flexible platform that gives game developers and publishers the ability to scale. At the same time, Edge computing allows gamers to move seamlessly between locations and devices. Hence, a Red Dead Redemption shoot-out that began at home over breakfast on a 4K television can continue on an iPhone on the commute to work - provided the gamer is not driving.
East-West, North-South, Game On
In the past few years, augmented and virtual reality (AR/VR) devices have become increasingly popular ways to enhance the gaming experience, which in turn has placed added pressure on the cloud infrastructure supporting these games. Once again, latency and responsiveness are the major challenges for both game developers and the cloud gaming community, but major cloud providers are answering the call.
Google’s cloud gaming platform, Stadia, supports multiple game engines, including Unreal and Unity. Like Microsoft’s xCloud, Stadia is designed to create a more seamless relationship between game developers and end users by offering a host of tools to enhance the gaming community and experience, from content development to distribution. But without the right distribution strategy, the responsibility for which falls on the major cloud providers and the leading developers, the games carry a high risk of performing below user expectations. So while the future of gaming may be in the cloud, the centralized cloud data center on its own is not sufficient to bring these platforms to life.
Major cloud service providers with gaming platforms typically lease space in colocation facilities at the edge of the network. The most effective colocation providers in this business case have created a fertile network ecosystem that makes it easy to interconnect with carriers, content providers, mobile and wireless operators, and ISPs. These carrier hotels and Edge data centers operate as network traffic hubs that distribute data directly to where it needs to go. By colocating their IT infrastructure in proximity to network providers, cloud gaming companies can maintain the low latency and optimized performance that demanding gamers expect.
A secure, interconnected carrier hotel or Edge data center can meet the proximity, low-latency and rapid scalability requirements that cloud gaming customers demand for their data-intensive workloads.
More data is being generated at edge endpoints today than ever before, overwhelming current networks. For this reason, ultra-low latency, geographic proximity, and local access to multiple cloud options are essential for both established gaming providers and independent game developers to ensure uninterrupted play and a high-quality end-user experience.