Smart factories are a major talking point for technology vendors. Rarely do you find a concept that brings together all the buzzworthy trends of the past decade, including Edge computing, the Internet of Things, artificial intelligence (AI), and even 5G.

The Edge crowd likes to talk about the need to process data locally; the IoT community goes on about collaborative robotics and the need for networked sensors to monitor and measure; meanwhile, AI enthusiasts wax lyrical about predictive maintenance and using computer vision to conduct optical inspections of the goods.

Some experts have suggested that manufacturing facilities will need sizeable 'Edge' IT deployments to enable all this functionality - but is this a case of wishful thinking? Or will factories of the future simply send all their data to the cloud?

Robert Schmid, chief futurist at Deloitte Consulting, told DCD that many factories have squeezed all the efficiency they could from existing methodologies and management products: “We help give customers visibility across the factory floor, and then see where new efficiencies can be gained.”


Embracing the new

Deloitte has identified seven use cases for emerging technologies that can be seen as ingredients for a smart factory, which it showcased in its Virtual Duck Factory project – an interactive demonstration of real-time problem solving, based on augmented reality tech (see box).

It won’t surprise you to find out that several of them relied on AI. But will this new tech mean more demand for Edge computing? Coming up with the answer is hard, and here’s why: most AI workloads can be separated into two distinct stages, training and inference.

Training is the difficult part, requiring a sizeable quantity of correctly labeled data and tons of compute in order to create a machine learning model that can identify certain patterns without being explicitly programmed to do so.

For example, show a machine thousands of images of oranges, and it will work out rules to recognize an orange - even in an image it has never seen before. You don’t have to specify the shape or the color; the model will learn by observation, just like humans do.
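To make that concrete, here is a minimal sketch of the training stage in Python - the data, features and labels are purely illustrative, not anyone’s production pipeline. The point is that the model is handed labeled examples and derives its own decision rule; no definition of an orange is written anywhere in the code.

```python
# Hypothetical sketch of the training stage: learn 'orange / not orange'
# from labeled examples instead of hand-written rules.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training set: average (R, G, B) color of each image,
# labeled 1 = orange, 0 = not orange.
X_train = np.array([
    [0.95, 0.55, 0.10],  # orange
    [0.90, 0.50, 0.05],  # orange
    [0.98, 0.60, 0.15],  # orange
    [0.20, 0.70, 0.25],  # lime (not orange)
    [0.85, 0.10, 0.15],  # apple (not orange)
    [0.55, 0.30, 0.75],  # plum (not orange)
])
y_train = np.array([1, 1, 1, 0, 0, 0])

# Training: the model works out its own decision rule from the labels.
model = LogisticRegression().fit(X_train, y_train)

# The learned rule is just a handful of numbers - cheap to ship to a device.
print("weights:", model.coef_, "bias:", model.intercept_)
```

Real computer vision models are vastly larger, which is exactly why this step eats so much compute.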

Training demands powerful processors, whether they are CPUs, GPUs or any of the more recent types of silicon designed specifically for the purpose. Andrew Ng, one of the deep learning pioneers and former chief scientist at Baidu, revealed that training the company’s Chinese speech recognition model required just four terabytes of training data, but a whopping 20 exaflops of compute.

Inference, however, is easy: take the rules produced by training and put them on a simple platform. Give it new data and it makes a simple judgment - in our case, ‘orange/not orange.’ Inference workloads are far less demanding, and are perfectly suitable for execution on the under-powered hardware of a smartwatch, a smart camera, or a smart speaker.
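Here is the corresponding inference stage, again only a sketch with made-up numbers: the handful of weights learned during training can be exported to a constrained device and applied with a few lines of arithmetic, no machine learning framework required.

```python
# Hypothetical sketch of the inference stage on a low-power device:
# apply weights exported after training (values here are illustrative).
import math

WEIGHTS = [4.1, -0.8, -3.5]   # hypothetical learned coefficients
BIAS = -1.2                   # hypothetical learned bias

def is_orange(rgb):
    # Logistic regression inference: weighted sum, sigmoid, threshold.
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, rgb))
    probability = 1.0 / (1.0 + math.exp(-z))
    return probability > 0.5

# New images the model has never seen, reduced to their average color.
print(is_orange([0.93, 0.52, 0.08]))  # True  -> 'orange'
print(is_orange([0.25, 0.65, 0.20]))  # False -> 'not orange'
```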

It follows that equipment in smart factories will do a ton of inference, but it won’t really deal with training. That will most likely be carried out in the cloud, where resources are cheap and available on a pay-as-you-go basis.

“Often, you can contain what happens within the factory on the Edge computing device. You don’t have to necessarily connect to the cloud, and many manufacturers actually appreciate that. You will find that factory environments often don't even have the outward bandwidth to deal with the data volume, so they want to keep it within the factory,” Schmid said.

“But the moment you want to do significant machine learning [training], then you need the cloud. When you want to learn between factories, you do need the cloud for that. It’s just not cost-effective for people to buy the compute power that’s needed for the few times that they train models.”

The hardware question

In order to deliver on the promises of AI and IoT, smart factories will need a reliable network and some on-premises compute. The problem is that manufacturers aren’t standardizing on traditional IT equipment the way the telecommunications industry did.

Back in the day, telcos relied on proprietary, specialized and often expensive hardware. In the past five years or so, the industry has embraced the benefits of virtualizing those hardware functions and running them on commodity servers and cloud software. This created a welcome source of revenue for traditional data center vendors.

But it doesn’t look like the manufacturing industry is moving in the same direction; instead of familiar 19-inch pizza boxes, we’ve seen an explosion of ruggedized form-factors equipped with exotic connectivity options. It might be x86 equipment inside, but it comes in all shapes and sizes, and is very far removed from a traditional server closet.

“Many of the hardware manufacturers have realized that data center build-outs are not worth selling to the corporations anymore, many corporations are going to the cloud. For them, Edge computing is really the new frontier,” Schmid said.

He highlighted research by IDC suggesting that by 2022, more than 40 percent of organizations’ cloud deployments will involve an element of Edge computing. While cloud vendors are indeed cannibalizing on-premises data centers, they are unlikely to make a play for the actual edge of the network.

“When I think of Edge computing, and the way we talk about it, it’s not just smaller servers anymore; we have Edge computing in sensors, we have Edge computing in networking gateways; we are getting to the point where we have so much compute around us – and it’s all Edge!”

That familiar feeling

It turns out that when deploying a smart factory, you can run into the same issues that you’ve seen in a corporate data center.

“My team brings together both IT and OT people,” Schmid said. “For example, we have a factory up in Chicago that manufactures packaging materials. First, we have to have IT folks come in and install Edge computers on the factory floor. And they need to upgrade the network.

“Operational technology folks also have a big part in this, because we have a lot of primary sensors on the machines today, but we often need secondary sensors, new connectivity, and to connect in new ways that we haven’t seen before. Machine-to-machine has been around for a long time; all the OT guys are looking at us and smiling, because they have done this for a while.

“It’s not so much of a technical challenge, much more of a human challenge, having these teams work together for the first time across the IT/OT divide.”

In a nutshell, we’ve got the applications to make a factory smart, and deploying the infrastructure isn’t the hard part; what’s missing is a common infrastructure model to support smart factories - the way the telcos’ new infrastructure model enables innovative services in an age when traditional voice and messaging no longer make any money.

The duck factory proves just how useful such a common model could be. Smart factories are coming. They are just taking their time to get here.

“I stopped talking about IoT and artificial intelligence, and network edge, and sensors; I would much rather talk about smart factories. Because you need all these components for many of the different use cases. All of these are enabling technologies that allow us to get to the smart factory,” Schmid said.

For more Edge coverage, be sure to read our free supplement covering smart cities, healthcare, vehicle-to-vehicle communication and more.