Analyst company ABI Research has predicted that the Internet of Things (IoT) will generate 1.6 zettabytes of data (roughly 1.6 billion terabytes) by 2020 – a startling figure that will present both business opportunities and logistical challenges to data center owners and occupiers.

The IoT is in many ways an evolution of earlier machine-to-machine (M2M) networks. Where M2M linked a machine or device point-to-point to a remote computer backed by cloud storage, management and data collection processes, the IoT extends that model to a much broader range of devices, as well as software applications, industrial and agricultural control systems, and even people.

Many applications

Potential IoT applications and use cases span the automotive, consumer electronics, utility, manufacturing, travel and healthcare industries, all of which feature strongly in early deployments. IoT-connected cars are expected to extend well beyond mobile entertainment, location tracking and links to music, video, social media and other apps. They will also cover mechanical operations, including service updates, component monitoring and failure notifications, alongside inter-car communications and autonomous or semi-autonomous driving.

Electricity, gas and water suppliers are expected to invest as much in IoT-supported supply chain operations as they do in smart meters, which feed usage metrics back from business and residential premises. Smart cities, meanwhile, will look to further improve smart parking, connected waste, public transport and traffic management services through larger networks of sensors embedded in vehicles, bus stops, lamp posts and other street furniture.

Elsewhere, kitchen appliances providing real-time inventory data and automated ordering of shopping lists may transform food supply chains and retail food services, and perhaps even benefit the healthcare and insurance industries by collecting information on people’s dietary habits. These examples represent just a small cross-section of the applications the IoT will enable, with many more use cases waiting to be discovered once the technology is better established and understood.

Not all of the zettabytes of data generated by the billions of IoT devices expected to come online in the next five years will need to be stored, processed and analyzed, but a significant percentage of the total will find its way into a data center somewhere. And whether those facilities are owned and operated by the data owners themselves, or run by colocation companies and cloud service providers delivering infrastructure as a service (IaaS), database as a service or hosted analytics and business intelligence SaaS applications, the challenges presented by IoT convergence remain the same.

A service provider play

Research company IDC forecasts that by 2019 more than 90 per cent of all IoT data will be hosted on service provider platforms, as cloud computing reduces the complexity of supporting what the firm dubs ‘data blending’. IDC also predicts that 10 per cent of sites will see their networks overwhelmed by data generated by the IoT.

Maintaining sufficient capacity to store all that data will be one crucial element, as will storage management frameworks that decide how long information is kept, when and where it is archived, and how quickly it needs to be accessed by end users. And the requirement to regularly back up and replicate huge file volumes across multiple hosting facilities will be equally demanding.
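As a rough sketch of the kind of rule such a storage management framework might encode – the tier names and age thresholds below are purely illustrative assumptions, not drawn from any particular product – an age-based tiering policy could look like this:

```python
from datetime import timedelta

# Hypothetical tiering rules: thresholds and tier names are illustrative only.
TIERS = [
    (timedelta(days=30), "hot"),     # recent data: fast SSD, low-latency access
    (timedelta(days=365), "warm"),   # older data: capacity disk
]

def storage_tier(age: timedelta) -> str:
    """Return the storage tier for a record of the given age."""
    for threshold, tier in TIERS:
        if age < threshold:
            return tier
    return "cold"                    # archive: tape or cold object storage

print(storage_tier(timedelta(days=7)))    # hot
print(storage_tier(timedelta(days=100)))  # warm
print(storage_tier(timedelta(days=800)))  # cold
```

In practice such policies also have to account for regulatory retention periods and per-customer service-level commitments, not just data age.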

While some of the data generated by the IoT at collection point is likely to travel over cellular networks run by mobile operators, most of it will be transmitted on wireline or unlicensed wireless networks within homes, cars, offices and other private network environments, all of which will feed into telecommunications backbones that in turn transfer information en masse into data centers.

In-bound traffic is likely to increase sharply, and will force many data center owners to upgrade their wide area network (WAN) performance to 40 Gigabit Ethernet (40GbE) or even 100 Gigabit Ethernet (100GbE).
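A back-of-envelope calculation shows why link speed matters at these volumes. The figures here are illustrative assumptions (100 TB replicated off-site, 70 per cent usable link efficiency), not measurements from any real facility:

```python
def transfer_hours(terabytes: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Hours needed to move `terabytes` of data over a link of `link_gbps`,
    assuming the stated fraction of raw bandwidth is usable."""
    bits = terabytes * 8e12                        # decimal terabytes to bits
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600

# Replicating 100 TB of IoT data off-site at three link speeds
for gbps in (10, 40, 100):
    print(f"{gbps} GbE: {transfer_hours(100, gbps):.1f} h")
```

Under these assumptions, 100 TB takes roughly 32 hours over 10GbE but just over three hours at 100GbE – the gap that WAN upgrades are intended to close.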

It is a similar story for the local area network (LAN), which must meet input/output requirements and transfer huge volumes of information between servers and storage elements. Here, the bandwidth available in backbone, edge and top-of-rack switching may simply not be enough to prevent bottlenecks that severely hamper application and service performance.

Get a policy

Data management and security policies will be key. Much of the information generated by the IoT will come from end-user devices – wearables, in-car GPS trackers and personal healthcare-monitoring devices such as fitness trackers, home blood pressure monitors or even MRI machines in hospitals and GP surgeries, for example. Much of this will inevitably be subject to strict national and/or regional data protection regulation. This means that data center owners hosting that data will have to be more attentive to relevant legislation than ever, and build data protection and information security certification guarantees into any service-level agreements provided to customers storing data within their facilities.

Processing all that data to derive meaningful insight – whether by determining usage and performance trends for business intelligence purposes, or by building databases of critical information that, subject to data privacy laws and/or consumer opt-in consent, can be sold on to third parties – will also demand powerful ‘big data’ analytics capabilities. Only the latest generation of fast, energy-efficient servers, often grouped into high performance computing (HPC) clusters, is likely to provide this number-crunching capacity without creating significant power or performance problems for data centers already struggling with electricity availability and cost. Owners may also have to examine technologies such as InfiniBand and photonics to enable fast server and storage interconnects.

Super-scale data centers are already gearing up to address the commercial opportunity that the IoT presents, with Microsoft more advanced than most in integrating IoT platforms with the data center capabilities associated with its Azure cloud services.

Azure Stream Analytics is a service designed from the ground up to provide businesses with IoT data storage, processing, analytics, data visualization and business intelligence capabilities, and it would be no surprise if the software giant were to forge partnerships with the world’s telcos, mobile operators and network equipment providers to handle the connectivity piece too.

Here comes Google 

May 2015 saw Google outline its own approach to building a hardware and software ecosystem in support of the IoT. It includes an embedded operating system for devices beyond smartphones and tablets – industrial devices, sensors, appliances and lighting – complete with its own communication stack and device administration API, alongside a Google-developed chip that uses the 60GHz spectrum to support short-range interaction with wearable devices such as smart watches.

Few hardware and software companies – beyond the likes of IBM, HP and Oracle, which also happen to be among the biggest cloud service providers – have hosting facilities of their own within which they can integrate broader IoT service packages. This opens up considerable partnership opportunities for specialist data center hosting companies that can provide the infrastructure needed to store and process large volumes of data.

Early examples of the type of relationships being forged around the IoT come from Huawei and T-Systems, the systems integrator arm of German telco Deutsche Telekom, and from Citrix partnering with Amazon to deliver Project Octoblu, a combination of cloud-hosted software and hardware dedicated to the workflow automation of IoT applications and services.

Demand for data center capacity – whether leased colocation space and/or cloud service hosting – is likely to grow in parallel with IoT expansion. As long as data center owners can adapt their infrastructure to handle the strain without extending capex/opex too far, the IoT is a phenomenon to be welcomed, not feared.

This article first appeared in the DatacenterDynamics Magazine supplement The Need for Speed, produced in conjunction with Juniper Networks.
