
The internet of things will force enterprise data center operators to completely rethink the way they manage capacity across all layers of the IT stack, according to a recent report by the market research firm Gartner.


Gartner's definition of the internet of things – or IoT, as the company abbreviates it – is something that connects “remote assets” and pushes data between them and centralized management systems. Companies can integrate that data and those assets in their processes to improve utilization and productivity.


In the research firm's own words: “Those assets can then be integrated into new and existing organizational processes to provide information on status, location, functionality and so on. Real-time information enables more accurate understanding of status, and it enhances utilization and productivity through optimized usage and more accurate decision support.”


For data centers, the problem lies in managing security, servers, storage and the network, said Joe Skorupa, VP and distinguished analyst at Gartner. “Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT,” he said in a statement.


For data center networks, the internet of things will mean a great deal more incoming traffic. WAN links in today's data centers are sized for the “moderate” bandwidth requirements of human interaction with applications.


Data from multitudes of sensors will require far more bandwidth than current capacity provides.


This data will be generated by both enterprises and user devices. As devices continue to learn about users, they will generate even more data.
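A back-of-envelope calculation shows how quickly sensor traffic can outgrow a WAN link sized for human-driven applications. The figures below are illustrative assumptions, not numbers from the Gartner report:

```python
# Back-of-envelope estimate of inbound sensor traffic.
# All figures are illustrative assumptions, not Gartner data.
SENSORS = 500_000          # devices reporting to one data center
BYTES_PER_READING = 200    # small telemetry payload per reading
READINGS_PER_SEC = 1       # one reading per sensor per second

bits_per_sec = SENSORS * BYTES_PER_READING * READINGS_PER_SEC * 8
gbps = bits_per_sec / 1e9
print(f"Aggregate inbound traffic: {gbps:.2f} Gbps")
# 500,000 sensors at 200 bytes/s each is already ~0.8 Gbps of
# sustained inbound traffic, before protocol overhead or bursts.
```

Even these modest per-device rates add up to sustained gigabit-class inbound traffic, and chattier devices or higher reporting frequencies scale the total linearly.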


More data will, of course, mean more storage to provision in data centers. Beyond raw capacity, companies will have to focus on collecting and using the data generated by the internet of things cost-effectively.


Because of the volume of data and the number of network connections carrying it, there will be greater need for distributed data center management and appropriate systems management platforms.


“IoT threatens to generate massive amounts of input data from sources that are globally distributed. Transferring the entirety of that data to a single location for processing will not be technically and economically viable,” Skorupa said.


“The recent trend to centralize applications to reduce costs and increase security is incompatible with the IoT. Organizations will be forced to aggregate data in multiple distributed mini data centers where initial processing can occur. Relevant data will then be forwarded to a central site for additional processing.”
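The aggregate-then-forward pattern Skorupa describes might be sketched as follows. The function name and the policy of forwarding only a summary plus out-of-range readings are assumptions for illustration, not Gartner's prescription:

```python
# Sketch of edge aggregation: a mini data center summarizes raw
# sensor readings locally and forwards only relevant data upstream.
from statistics import mean

def process_at_edge(readings, low=10.0, high=90.0):
    """Return a compact summary plus out-of-range readings to forward."""
    anomalies = [r for r in readings if not (low <= r <= high)]
    summary = {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }
    # Only the summary and the anomalies travel to the central site,
    # a small fraction of the raw data volume.
    return summary, anomalies

raw = [42.0, 55.5, 91.2, 8.3, 60.1]
summary, to_forward = process_at_edge(raw)
print(summary)      # aggregate stats for central processing
print(to_forward)   # readings outside the assumed normal band
```

The central site then works on aggregates and exceptions rather than the full raw stream, which is what makes the multi-site model viable on constrained WAN links.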


This will be a challenge for data center operators, who will have to learn to manage multi-site infrastructure as a homogeneous environment but still be able to monitor and control individual locations.


Data backup will become problematic because of both network bandwidth and backup storage capacity. Backing up all raw data will probably be unaffordable altogether, according to Gartner.


This means companies will have to back up selectively, automating the decision of which data are valuable or necessary to keep. That automation will become a big data challenge in its own right.
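Automating that decision could look something like a scoring rule applied before backup. The scoring criteria and field names below are invented for illustration:

```python
# Sketch of selective backup: score each record and keep only those
# deemed valuable enough. Scoring rules are illustrative assumptions.
def backup_score(record):
    score = 0
    if record.get("anomaly"):            # unusual readings are worth keeping
        score += 2
    if record.get("regulated"):          # compliance data must be retained
        score += 3
    if record.get("age_days", 0) > 365:  # stale raw data is less valuable
        score -= 1
    return score

def select_for_backup(records, threshold=2):
    """Keep only records whose score meets the backup threshold."""
    return [r for r in records if backup_score(r) >= threshold]

records = [
    {"id": 1, "anomaly": True, "age_days": 10},
    {"id": 2, "anomaly": False, "age_days": 400},
    {"id": 3, "regulated": True, "age_days": 400},
]
kept = select_for_backup(records)
print([r["id"] for r in kept])  # [1, 3]
```

In practice the hard part is exactly what the article flags: defining and maintaining those value rules at scale, across data the organization has never had to classify before.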