In the 1927 film Metropolis, the people are ruled by industrialists who keep their tired subjects oppressed by making them perform long hours of tedious work. The film is filled with images of gloom – dark gothic buildings, large menacing industrial machines, worn clothing and long faces, exhausted from the daily grind. The film was meant to depict the worst outcome of the Industrial Revolution.
Fast forward to 2013 and the idea of another Industrial Revolution could not be more different – a focus on health, expanses of green and workloads performed with barely a lifted finger, thanks to software and innovative ways of incorporating it into infrastructure. Welcome to GE’s Industrial Internet, the revolution GE says is waiting in the wings to once again change our lives forever.
The Industrial Revolution that began in the 1760s brought about the building of machines, fleets and new physical networks. It took the Internet to establish communications as one of the most important tools for business, creating yet another revolution. The Industrial Internet combines these areas, providing a new converged approach that ties together machines and data in a way that requires networks, data centers and software every step of the way. Intelligence will help to ease the work we do, increase the lifespan of the infrastructure we use (and of those using it), provide new levels of safety and efficiency, and make us more responsive to business and customer needs.
“The world is on the threshold of a new era of innovation and change with the rise of the Industrial Internet,” GE says in a white paper on the topic. “It is taking place through the convergence of the global industrial system with the power of advanced computing, analytics, low-cost sensing and new levels of connectivity… The deeper meshing of the digital world with the world of machines holds the potential to bring about profound transformation to global industry and to many aspects of daily life, including the way many of us do our jobs.”
GE CTO Greg Simpson, who oversees GE’s data center operations, puts the idea into perspective. He says GE is only scratching the surface of what can be done with the Industrial Internet today, and the data center is fast becoming the first frontier in this pioneering effort. “We see a lot of opportunities to improve our customers’ ability to optimize their business and get use out of their data. We are really only at the dawn of the Industrial Internet. By leveraging this data we can effect huge change in the industry, and it doesn’t even take a huge change to have a huge impact.”
GE says there are now millions of machines – from simple electric motors to the highly advanced computed tomography scanners used in healthcare – along with tens of thousands of networks (think railways) and fleets (from power plants to aircraft) that hold the potential to change the way the industrial world operates. With sensors and the Internet, all of these can perform better, delivering not only greater asset reliability and optimized inspection and repair but more efficient services overall.
It estimates, for example, that a 1% fuel saving from today’s aircraft, achievable through the monitoring of aircraft parts, could deliver savings of US$30bn over 15 years for the aviation industry alone. For the oil and gas industry, a 1% reduction in capital expenditure could deliver $90bn in savings. And that assumes just a 1% improvement; GE says the figures grow as the Industrial Internet is adopted more widely.
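To see how those numbers scale, a quick back-of-envelope check helps; the annual fuel spend below is implied by GE’s figures rather than stated in its white paper:

```python
# Back-of-envelope check of GE's "power of 1%" aviation figure.
# The annual fuel-spend baseline is implied by the savings, not quoted by GE.

total_saving = 30e9   # $30bn over 15 years from a 1% fuel saving
years = 15

annual_saving = total_saving / years        # $2bn per year
implied_fuel_spend = annual_saving / 0.01   # 1% of spend = saving, so ~$200bn/yr

print(f"Implied annual aviation fuel spend: ${implied_fuel_spend / 1e9:.0f}bn")
# The relationship is linear: a 2% saving on the same baseline would double
# the 15-year figure to $60bn, which is the "figures can be increased" point.
```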
“We have only begun to tap this potential,” Simpson says. “There are so many new ways of using this data. It could be data off a turbine engine. This could be combined with weather data to give you completely new insight. GE has done a nice job over the years on the industrial side of things, and understands what requirements there are for the type of system we run commercially. We are trying to take that knowledge and deliver it over a public or private internet to our customers.”
GE has already released nine new industrial service technologies through a joint venture with Accenture, covering machine-to-machine, machine-to-people and machine-to-business operations through the use of optimized networks, optimized plants and facilities, optimized assets, and service quality and productivity. These span the energy, oil and gas, healthcare, rail, aviation and manufacturing industries. GE estimates these services could eliminate about $150bn in waste across these industries alone, and it plans to launch a further 20 service technologies this year. This means Simpson and his data center teams need to be prepared.
Industrial cloud
The Industrial Internet relies heavily on the Cloud, according to Simpson, and this has changed the way GE views its operations.
GE is currently consolidating its global data center footprint using a tiered approach designed with the Cloud in mind. It plans to have strategic data centers set up as pairs in North America and Europe that will also link to specific country data centers where required. The largest of these so far is 85,000 sq ft. It also plans to have a strategic data center in Asia that will pair with various country centers.
Simpson says GE cannot escape having more localized operations for regulatory or regional purposes, and in some cases it also requires compute operations on the plant floor for real-time data acquisition.
“The core strategic data center will be the largest one that will hold the greatest amount of our inventory. The second tier will be the country centers and these will provide support for regional operations, then there will be operations in manufacturing plants,” Simpson says.
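As a rough illustration of the hierarchy Simpson describes, the placement logic might look something like the sketch below; the tier names and routing rules are assumptions for illustration, not GE’s actual policy:

```python
# Illustrative sketch of the three-tier layout described above.
# Tier names and placement rules are assumptions, not GE policy.

from enum import Enum

class Tier(Enum):
    STRATEGIC = "strategic pair"   # largest; holds most of the inventory
    COUNTRY = "country center"     # supports regional/regulatory operations
    PLANT = "plant floor"          # real-time data acquisition on site

def place_workload(needs_realtime: bool, regulated_locally: bool) -> Tier:
    """Route a workload to the lowest tier that satisfies its constraints."""
    if needs_realtime:
        return Tier.PLANT          # latency-critical acquisition stays on site
    if regulated_locally:
        return Tier.COUNTRY        # data that must remain in-country
    return Tier.STRATEGIC          # everything else consolidates upward

print(place_workload(needs_realtime=False, regulated_locally=True).value)
# -> country center
```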
Some of these facilities include high-performance computing (HPC) systems; others house banks of processors that can be combined globally to provide a more distributed HPC capability. And while GE builds its own strategic data centers, some country facilities will be housed with external providers.
While most facilities will be built to a Tier III+ standard, different tier levels will be used within each data center, depending on need. Systems such as enterprise resource planning and customer-facing applications may be critical to the operations of GE or a customer, but an HPC cluster with thousands of nodes can tolerate lower specifications.
It is all about agility, which Simpson says is integral for the delivery of cloud-based services such as the Industrial Internet. “Big data is really changing the face of how we need to work and think about our data center strategy,” Simpson says. “We need to figure out where the Cloud makes sense. We have a highly elastic SAN [storage area network], we are scaling our private cloud and leveraging the public cloud where it makes sense from external providers for capacity. We really think about that as providing the compute capability for cloud, and the data centers are just the fundamental elements that support that.”
One of the key analytical tasks Simpson must carry out under this approach concerns provisioning inside the data center itself.
“You have to be constantly looking around the corner to understand where the future will be. Take resiliency, for example. The technology has changed dramatically. Traditionally, everything was in a physical box dedicated to a physical application and you had to have lots of provision for failover. In the virtual world you have a lot of tools to move your virtual workloads around and to get a degree of availability that is quite good, but you have to know what true availability is,” Simpson says.
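Knowing “what true availability is” comes down to simple probability: redundant workloads that fail independently combine multiplicatively. A minimal sketch, with the uptime figures assumed for illustration:

```python
# Availability arithmetic behind virtualized failover (figures assumed).

def parallel_availability(*uptimes: float) -> float:
    """Availability of redundant instances: all must fail for an outage."""
    downtime = 1.0
    for a in uptimes:
        downtime *= (1.0 - a)
    return 1.0 - downtime

single = 0.995                                # one VM at 99.5% uptime
pair = parallel_availability(single, single)  # two VMs: 99.9975%

hours_per_year = 24 * 365
print(f"One VM:  {(1 - single) * hours_per_year:.1f} hours down/year")  # ~43.8
print(f"Two VMs: {(1 - pair) * hours_per_year:.2f} hours down/year")    # ~0.22
# Caveat: the multiplication assumes independent failures. Workloads sharing
# a host, SAN or network fail together, which is why "true availability"
# takes analysis rather than arithmetic alone.
```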
The Cloud also allows Simpson’s team to achieve elasticity when big data pushes demand to its limits, to trial and change platforms, and then to roll new applications out instantly when required.
Being software aware
Agility in GE’s data centers is being designed right the way down to the technology used inside. Simpson says this is largely where the new developments for GE will lie as the Industrial Internet takes hold. Just like the data sets being collected by sensors from client and GE operations, much of this will rely on software, especially for storage and networking.
GE currently operates multiple tiers of storage, using tools from its storage vendors to spread data across everything from high-cost, high-performance storage down to the lowest-price drives, depending on each data set’s needs and the level of availability required. It also uses public cloud storage – Amazon’s Glacier – for data archiving and backup.
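GE does not describe its placement logic, but the idea of matching data sets to storage tiers can be sketched in a few lines. The thresholds below are invented for illustration and the Glacier vault name is hypothetical; the upload uses the standard boto3 Glacier call:

```python
# Illustrative storage-tier selection; thresholds and names are assumptions.

import boto3

def choose_tier(reads_per_day: float, availability_needed: float) -> str:
    """Map a data set's access pattern to a storage tier (example policy)."""
    if availability_needed >= 0.9999 or reads_per_day > 100:
        return "high-performance-san"   # hot, customer-facing data
    if reads_per_day > 1:
        return "low-cost-disk"          # warm data on the cheapest drives
    return "glacier-archive"            # cold data goes to the public cloud

# Archiving a cold data set to Amazon Glacier (vault name is hypothetical)
if choose_tier(reads_per_day=0.01, availability_needed=0.99) == "glacier-archive":
    glacier = boto3.client("glacier")
    with open("sensor-backup.tar.gz", "rb") as archive:
        glacier.upload_archive(
            vaultName="cold-archive",               # hypothetical vault
            archiveDescription="turbine sensor backup",
            body=archive,
        )
```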
“Storage is one of our biggest challenges from a big data perspective. The volumes we have to store are so large with the Industrial Internet. The problem is, if you want to invest in storage, a year from now you could be looking at a completely new technology that could be better. Storage is an asset that doesn’t depreciate so fast, but it does become obsolete. You have to know when to leverage a new technology.”
Simpson looks forward to having greater access to tools that allow for better manipulation of data sets for storage – tools that make decisions based on application characteristics.
GE has already been working on common application programming interfaces that can work across all types of cloud, automatically determining the type of fabric each application wants to work with. It is working with OpenStack as a component of this, using tools such as Eucalyptus, the private and hybrid cloud platform. The open-source platform is compatible with Amazon Web Services (AWS) and as such allows workloads to move between AWS and GE’s own data centers.
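GE’s own API layer is not public, but the open-source Apache Libcloud project illustrates the same idea: one client interface whose driver can point at AWS or at an AWS-compatible private cloud such as Eucalyptus. In this sketch the credentials and endpoint are placeholders:

```python
# One interface, two clouds: Apache Libcloud stands in here for the kind of
# common API layer the article describes. Credentials/endpoints are placeholders.

from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver

def connect(public: bool):
    """Return a driver for AWS or an AWS-compatible Eucalyptus private cloud."""
    if public:
        cls = get_driver(Provider.EC2)
        return cls("ACCESS_KEY", "SECRET_KEY", region="us-east-1")
    cls = get_driver(Provider.EUCALYPTUS)
    return cls("ACCESS_KEY", "SECRET_KEY",
               host="cloud.internal.example", port=8773)  # placeholder endpoint

# The calling code is identical either way, which is what lets a workload
# move between AWS and a private data center without being rewritten.
for driver in (connect(public=True), connect(public=False)):
    for node in driver.list_nodes():
        print(driver.type, node.name, node.state)
```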
Eventually, Simpson says, he sees storage becoming software-defined, just as he expects everything else in the data center to be.
“At the end of the day, what we ultimately want is the ability to provision capacity dynamically and automatically, without human intervention. This is what gives you real cloud capacity,” Simpson says.
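In practice, “provisioning capacity dynamically and automatically” amounts to a control loop. The sketch below is generic rather than GE’s system; the thresholds and the pool interface are hypothetical:

```python
# Generic autoscaling control loop; the thresholds and the pool interface
# are hypothetical illustrations, not GE's implementation.

import time

SCALE_UP, SCALE_DOWN = 0.80, 0.30   # utilization thresholds (assumed)

def control_loop(pool, get_utilization, interval_s=60):
    """Grow or shrink a compute pool based on measured utilization."""
    while True:
        load = get_utilization(pool)       # e.g. mean CPU across the pool
        if load > SCALE_UP:
            pool.add_node()                # provision capacity automatically
        elif load < SCALE_DOWN and len(pool) > 1:
            pool.remove_node()             # release capacity, no human step
        time.sleep(interval_s)
```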
The challenge is not likely to get any easier for Simpson and his team. Apart from grappling with the advance of the software-defined data center and finding ways to use it to make the Cloud more efficient, they will also have to keep up with developments from GE itself. It was only in November 2011 that GE announced it was opening a new software center in Northern California to develop code for the Industrial Internet. Now GE is moving ahead with service offerings in this space, and as momentum builds, the data sets Simpson deals with will most likely become more demanding.
Simpson is aware that he may have trouble finding the right skills for managing the Cloud and carrying out data analysis, a reminder that even automation requires human intervention. But this time, it will be software that carries out the hard work.
This article first appeared in FOCUS magazine, Issue 29.