India accounts for around 15 percent of Facebook’s global user base and is the company’s largest market. For a country of more than a billion people with around 400 million Facebook users, the demand for data center capacity from the Facebook family of applications alone is around 450 MW in 2021. Similarly, for Google, the estimated data center demand from its 225 million YouTube users in India is around 200 MW in 2021.

The total installed data center capacity in India in 2021 is estimated to be 450 MW, and Google and Facebook combined have less than 50 MW of data center capacity leased or built in India. The remaining roughly 600 MW of data center demand from Google and Facebook leaves the country’s shores, to be served from Singapore, the US and the Americas. Similar leakage estimates are 325 MW for Indonesia and 170 MW for the Philippines.
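As a rough sanity check, the leakage figure follows directly from the estimates above. The sketch below (Python, purely illustrative) uses only the figures already quoted; the per-user wattage it prints is simply what those estimates imply, not an independently sourced number.

```python
# Back-of-the-envelope check of the India "leakage" figure quoted above.
facebook_users_mn = 400        # Facebook users in India, millions
facebook_demand_mw = 450       # estimated Facebook data center demand, MW
youtube_users_mn = 225         # YouTube users in India, millions
youtube_demand_mw = 200        # estimated Google/YouTube demand, MW
capacity_onshore_mw = 50       # FB + Google capacity leased/built in India, MW (upper bound)

total_demand_mw = facebook_demand_mw + youtube_demand_mw        # 650 MW
offshore_leakage_mw = total_demand_mw - capacity_onshore_mw     # ~600 MW served from abroad

print(f"Implied W per Facebook user: {facebook_demand_mw / facebook_users_mn:.2f}")  # ~1.1 W
print(f"Implied W per YouTube user:  {youtube_demand_mw / youtube_users_mn:.2f}")    # ~0.9 W
print(f"Demand served offshore:      ~{offshore_leakage_mw} MW")
```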

This traffic migration, especially for offline workloads and storage, is likely to increase as new subsea cables come online.

New cables on their way

The BiFrost cable system, expected to be ready for service from 2024, will connect Singapore, Indonesia and the Philippines to the West Coast of the United States. Facebook is one of the anchor investors in BiFrost. Facebook also owns a pair in the six-pair PLCN cable system (as shown in Figure 1), the highest-capacity (144 Tbps) subsea cable to connect the Philippines directly to the United States; Google owns another pair in PLCN. The Apricot cable system is also planned for 2024: a Google-owned 190 Tbps subsea cable connecting Singapore, Indonesia and the Philippines to Taiwan (where Google has more than 300 MW of hyperscale capacity) and the United States. Finally, the Echo cable, also backed by Google, will connect Singapore and Indonesia directly to the United States.

Such a gold rush for subsea cables is informed by consumer behavior and, of course, economics.

Facebook offers a service that is ad-funded, with no payments by end users. In markets such as India, there are no homegrown platforms (such as WeChat or Line) to compete with Facebook. Consumers are price takers and have no benchmark for a better user experience in terms of latency and quality, and no other options. Facebook’s engineering team dimensions its network for a ‘consistent user experience’, meaning that if a video loads in, say, 30 milliseconds, Facebook will make every effort to ensure it consistently loads in 30 milliseconds rather than work on reducing it to 20 milliseconds. Facebook users accustomed to sites and images loading in a specific duration do not notice, let alone complain, as long as the service is free.

Facebook is also push-led, unlike Google. A YouTube user can search for a specific video; a Facebook user, however, is just swiping through short videos (Reels), Instagram Stories and images pushed to WhatsApp groups, with almost no search-led demand. Facebook decides what we consume and how we consume it. And so Facebook dimensions its network to be the least-cost network for this defined user experience.

Google can do the same, especially for free YouTube workloads. However, it is handicapped by search, and also by the transformational push to convert YouTube into a consumer-paid, TV-channel type of model. Google also has Google Cloud offerings for enterprises with SLA commitments, and so needs some on-shore capacity. It’s the same for Amazon, with its AWS cloud, e-commerce and Prime Video businesses, all of which are paid and necessitate on-shore capacity, especially in the markets where Amazon is strong. In India, AWS is building a captive 600 MW data center in Hyderabad, and Google has announced a cloud region in Delhi, likely to be around 100 MW.

Remote data centers are cheaper

The economics of serving data over owned subsea cables is almost always favorable compared with building or leasing data center capacity locally. The Facebook data centers in Prineville have racks stacked with disk storage that are completely air-cooled, with no mechanical chillers or air conditioning. The cost of building such warehouse-type data center facilities is no more than $3 million per MW, compared with $9 million per MW in Indonesia (and the Philippines) and $8 million per MW in India. For a demand of 600 MW to 800 MW from emerging Asia, Facebook saves $5 million to $6 million per MW, translating to roughly $3 billion to $5 billion in capex savings, plus additional billions in opex savings.
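A rough sketch of that arithmetic is below; the per-MW build costs and the demand range are the ones quoted above, and everything else is simple multiplication.

```python
# Capex-savings sketch using the per-MW build costs quoted above.
build_cost_per_mw = {          # $ million per MW of data center capacity
    "US (Prineville-style warehouse)": 3,
    "India": 8,
    "Indonesia / Philippines": 9,
}
demand_range_mw = (600, 800)   # emerging-Asia demand Facebook can serve remotely

for market, cost in build_cost_per_mw.items():
    if "US" in market:
        continue
    saving_per_mw = cost - build_cost_per_mw["US (Prineville-style warehouse)"]
    low = saving_per_mw * demand_range_mw[0] / 1000    # $ billion
    high = saving_per_mw * demand_range_mw[1] / 1000
    print(f"{market}: ${saving_per_mw}M/MW saved -> ${low:.1f}B to ${high:.1f}B capex")
```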

Facebook is also part of another cable (2Africa) that is 37,000 km long and will cost $1 billion; of this, Facebook’s share will be less than $200 million. BiFrost is 15,000 km long, Echo 16,000 km, PLCN 13,000 km and Apricot 12,000 km. If we combine all of these cables, their total length is less than 100,000 km, and assuming Facebook takes a 15 percent share of cable capacity on these routes, it will spend no more than $400 million to $500 million on them. The cost savings are crystal clear.
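A minimal sketch of that estimate follows, assuming the cost per km implied by the 2Africa figures ($1 billion for 37,000 km, roughly $27,000 per km) also applies to the other systems, which is a simplification since actual build costs vary by cable.

```python
# Cable-spend sketch: lengths are those quoted above; the per-km cost is
# inferred from 2Africa and applied to the other systems as a rough benchmark.
cable_lengths_km = {
    "2Africa": 37_000,
    "Echo": 16_000,
    "BiFrost": 15_000,
    "PLCN": 13_000,
    "Apricot": 12_000,
}
cost_per_km_usd = 1_000_000_000 / cable_lengths_km["2Africa"]   # ~$27k per km
facebook_share = 0.15                                           # assumed capacity share

total_km = sum(cable_lengths_km.values())                       # 93,000 km (< 100,000 km)
total_cost_usd = total_km * cost_per_km_usd                     # ~$2.5bn across all systems
facebook_spend_usd = total_cost_usd * facebook_share            # ~$380M

print(f"Combined length: {total_km:,} km")
print(f"Estimated Facebook share of spend: ${facebook_spend_usd / 1e6:.0f}M")
```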

And finally, there are operational considerations. The Philippines has a fault line in Luzon with high seismic risk, its energy costs are among the highest in Asia, and its access to renewable energy is limited. Indonesia sits on the Ring of Fire, and the availability of trained manpower is a concern. Malaysia is planning to regulate data center operators and make them liable to pay a revenue share. Vietnam is proposing cybersecurity laws that could allow raids on data centers and the seizure of sensitive materials. Global cloud providers, however, have much easier and cheaper options.

Policymakers, telco data centers and operators in emerging Asian markets view subsea cables as a positive trend. They believe that, with increasing connectivity, these markets will emerge as regional hubs like Singapore or Hong Kong. The current traffic model and installed data center capacities suggest otherwise: the subsea cables will drain the workloads away from emerging Asia to large farms in the cooler, greener and calmer climes of the temperate zones.
