If you wait at the London Gateway port, countless ships will pass you by, behemoths groaning under their weight as they cut through the water.
Once upon a time, those ships would have been just empty husks for transport, reliant on the wind and the people on board to keep the vessel charging forward. But as technology has developed, so has the complexity of the ships. While not yet ‘manless,’ the maritime industry is guided by the data it gathers on board and ashore, no longer reliant on human judgment alone.
As we move towards a globally digitized fleet, those ships will need a complex system of digital infrastructure to keep them connected to the shore and to each other, and to process the information on board.
Ships are now, if not mobile cities, certainly small mobile towns. The 2022 cruise ship, ‘Wonder of the Seas’ can host almost 7,000 people, all of whom expect Internet connectivity throughout. But the number of people demanding connectivity does not even begin to compare with the number of sensors gathering data, and those sensors are demanding to be heard.
Data is constantly being collected on board the ship. Sensors monitor the engine, fuel consumption, ship speed, temperature, and external data like weather patterns and currents.
According to Marine Digital, the modern ship generates over 20 gigabytes of data every day (though this is, of course, wildly variable depending on the size and purpose of the ship). The important takeaway is that this is not a simple undertaking, and there is no one-size-fits-all approach.
For ship management company Thome, there is less IT on-board than on-shore. “We treat all ships as a small office,” said Say Toon Foo, vice president of IT at Thome Group. “On most of our ships, we have at least one server.”
As a management company, Thome doesn’t own the ships it works with. Instead of attempting to process the data onboard, Thome processes the majority ashore, communicating with the crews via a very small aperture terminal (VSAT).
VSATs connect the ships by locking onto a geostationary satellite, and can offer download rates between 256Kbps and 90Mbps, with upload rates usually between 100bps and 512Kbps. This pales in comparison to 5G’s theoretical 20Gbps download and 10Gbps upload, but there are no 5G masts mid-ocean.
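To put those figures in perspective, a rough back-of-the-envelope calculation (using Marine Digital's 20GB-per-day estimate quoted below and the article's upper VSAT upload bound) shows why ships batch and summarize data rather than streaming it ashore:

```python
# Rough transfer-time estimate for a day's worth of ship sensor data
# over a VSAT link. The figures are the bounds quoted in this article;
# real-world throughput varies with weather, contention, and coverage.

DAILY_DATA_GB = 20  # Marine Digital's estimate for a modern ship

def transfer_hours(gigabytes: float, rate_bps: float) -> float:
    """Hours needed to move `gigabytes` at `rate_bps` bits per second."""
    bits = gigabytes * 8 * 1e9
    return bits / rate_bps / 3600

# Even the best-case 512Kbps upload takes several days to move one
# day's raw data, so most of it stays on the on-board server and only
# summaries or hourly updates go ashore.
print(f"{transfer_hours(DAILY_DATA_GB, 512_000):.0f} hours at 512Kbps")
```

The mismatch between data generated and uplink capacity is the structural reason the processing happens ashore, or not at all in real-time.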
“It's a good speed we have [with the VSAT], but not everything can run from the satellite, so we do need that server. But the VSAT means that if we do have a complication, we can share that with the staff on shore,” explained Toon Foo.
Good is, of course, relative. But, happily for Thome, the shipping management company doesn’t really need to process the data in real-time. Instead, the company relies mostly on daily or hourly data updates transmitted via the not entirely reliable VSAT, which are processed in its on-site server room, or in the majority of cases, sent to the cloud.
As an approach, sending most of the data to the shore to be processed seems to be the norm.
Columbia Shipmanagement uses a unique Performance and Optimization Control Room (POCR, or performance center) as part of its offering. The POCR enables clients to optimize navigational, operational, and commercial performance by analyzing data collected both on board the ship and ashore.
“The ships are directly in touch with the Performance Center,” said Pankaj Sharma, Columbia Group’s director of digital performance optimization. “We proactively check the routes before departure, looking for efficiency, safety, and security. Then, as the vessel is moving, the system is monitoring and creates alerts which our team is reacting to 24/7.”
With over 300 ships to manage, much of this is automated into a traffic light system (green means good, and alerts only light up when the color changes).
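A traffic-light scheme like the one Sharma describes can be sketched as a simple threshold mapping in which the team is only notified when a vessel's color changes. The metric names and threshold values below are illustrative assumptions, not Columbia's actual configuration:

```python
# Minimal sketch of a traffic-light alerting scheme: each metric maps
# to green/amber/red, and an alert fires only on a color change.
# Thresholds and metric names are hypothetical, for illustration only.

THRESHOLDS = {  # (amber, red) limits per metric - assumed values
    "fuel_tonnes_per_day": (55.0, 65.0),
    "route_deviation_nm": (5.0, 15.0),
}

def status(metric: str, value: float) -> str:
    """Map a reading to a traffic-light color."""
    amber, red = THRESHOLDS[metric]
    if value >= red:
        return "red"
    return "amber" if value >= amber else "green"

def check_vessel(readings: dict, previous: dict) -> list:
    """Return alerts only for metrics whose color changed since
    the last check; `previous` holds the per-metric state."""
    alerts = []
    for metric, value in readings.items():
        color = status(metric, value)
        if color != previous.get(metric, "green"):
            alerts.append((metric, color))
        previous[metric] = color
    return alerts
```

With 300-plus ships, suppressing unchanged-state alerts like this is what keeps a small 24/7 team from drowning in notifications.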
Some of this is then processed on-site, but the vast majority is cloud-driven. “Right now we are on Azure, but we have also used AWS and we have a private instance where we have our own cloud hosting space,” added Sharma.
Edge on board
Having computational power on board the ship is entirely possible, but it has challenges. Space on a ship is limited, there are weight restrictions, and IT engineers or specialists are rarely part of the crew.
Edge system vendor Scale Computing designed a server to get around these issues, one that has been used by Northern Marine shipping.
“Looking at Northern Marine, initially they worked with traditional 19-inch rack servers on board the ships - two HPE servers and a separate storage box,” said Johan Pellicaan, VP and managing director at Scale Computing.
“Just over a year ago, they started to use an Intel-based enterprise Edge compute system, the Scale Computing HE150. This is a nano-sized computer [under five square inches].”
Scale’s offerings are based around tightly integrated micro-computers. The HE150 and HE151 are based on Intel’s NUC (next unit of computing) barebone systems, running Scale’s HC3 cluster software. They use significantly less power than a traditional 19-inch server, and take a tiny fraction of the space.
Traditional servers “need about 12 cores and six to nine gigabytes of storage in a three-node cluster as an absolute minimum. In our case, we need a maximum of four gigs per server and less than a core.”
This means that the Scale software has a lower overhead: “In the same size memory, we can run many more virtual machines than others can,” claimed Pellicaan.
Edge is really defined by the kind of work done rather than the location, so it is fair to say the shipping industry is using the Edge - be it on board the ship, or on shore.
Automation - could we have unmanned ships?
In many industries, the next step for digitization is automation. In the maritime sector, this raises the prospect of unmanned ships - but Columbia’s Sharma explained that this would be complex to deliver, given the latency imposed by ship-to-shore communications.
“When we talk about control rooms, which would actively have an intervention on vessel operations, then latency is very important,” he said. “When you think of autonomous vehicles, the latency with 5G is good enough to do that. But with ships, the latency is much worse. We're talking about satellite communication. We're talking about a very slow Internet with lost connection and blind spots.”
The fact is that satellite connectivity is simply not fast enough to allow ships to take the step towards autonomous working and full automation.
“There is sufficient bandwidth for having data exchanged from sensors and from machinery, and eventually being sent to shore. But latency is a big issue and it's a barrier to moving into autonomous or semi-autonomous shipping.”
Much of this makes it seem like ships are at the end of the world, rather than at the Edge. But ships do not travel at dramatically fast speeds like other vehicles, so latency can be less of a problem than one might expect.
A relatively fast container ship might reach 20 knots (37km/h), compared to an airplane which can reach 575mph (925km/h), meaning that most of the time, hourly updates are sufficient. But not always: there are plenty of incidents where fast responses are essential, and even then things can still go wrong.
For instance, in a widely reported incident in 2021, a container ship blocked the Suez Canal for six days. It’s worth exploring the incident to ask whether having more compute on board (even if only one server) might have helped avoid the problem.
Could on-board IT have helped prevent the Suez Canal blockage?
In March 2021, the ‘Ever Given,’ a ship owned by Shoei Kisen Kaisha, leased to Evergreen Marine, and managed by Bernhard Schulte Shipmanagement, ran aground in the Suez Canal in Egypt, with its bow and stern wedged in opposite banks of the 200m-wide canal.
The blockage of the major trade route prevented 369 ships from passing, representing $9.6 billion in trade. The grounding was put down to strong winds (around 74km/h) pushing the 400 meter (1,300ft) ship off course, and the Egyptian authorities speculated that technical or human errors may have played a role, although this was denied by the companies involved.
Weather is not taken for granted in the maritime industry. “Weather-based data was the first machine learning project we did in the POCR,” said Sharma. While this research was not focused on an incident like the Suez Canal blockage, Columbia did explore the impact of wind on efficiency.
“Weather is a really important factor,” explained Sharma. “A badly planned weather voyage can increase the fuel consumption by 10 to 15 percent, while a well-planned voyage might save five percent.”
The company “did a project where we got high-frequency data from the vessel AIS position, and every 15 minutes we layered that with speed data, consumption data, and weather data. We then put this into a machine learning algorithm, and we got some exceptional results,” he said.
Instead of working at a resolution of 20 or 30 degrees, the company was able to operate at five degrees. “It became a heat map rather than a generic formula and we could then predict the speed loss very effectively,” he said.
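The binning idea behind such a heat map can be sketched simply: group the 15-minute samples into five-degree wind-angle bins and learn the average speed loss per bin, rather than fitting one coarse formula. The field names and sample values below are illustrative assumptions, not Columbia's data:

```python
# Sketch of five-degree binning for a speed-loss heat map.
# Each sample pairs a relative wind angle (degrees) with the observed
# speed loss (knots); values here are made up for illustration.

from collections import defaultdict

BIN_DEG = 5  # the resolution Sharma describes

def build_heatmap(samples):
    """samples: iterable of (relative_wind_deg, speed_loss_knots).
    Returns {bin_start_deg: mean_speed_loss} per 5-degree bin."""
    bins = defaultdict(list)
    for angle, loss in samples:
        bins[int(angle // BIN_DEG) * BIN_DEG].append(loss)
    return {b: sum(v) / len(v) for b, v in sorted(bins.items())}

samples = [(12.0, 0.4), (13.5, 0.6), (31.0, 1.1), (33.9, 0.9)]
print(build_heatmap(samples))
```

In practice the real system layers consumption and sea-state data into the same grid and fits a machine learning model over it, but the jump in resolution from a generic formula to per-bin estimates is the core of the gain Sharma describes.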
Evert Lataire, head of the Maritime Technology Division at Ghent University in Belgium, conducted data analysis using ship tracking websites to find out what happened in the Suez Canal incident, putting much of it down to the ‘bank effect,’ a hydrodynamic phenomenon in shallow waters.
DCD reached out to Lataire to find out whether he thinks that having more compute on board could potentially prevent disasters like the Suez Canal blockage.
While Lataire’s research doesn’t require intensive compute power, real-time data analysis can have a big impact on ship control. When a ship is out at sea, data can be gathered about its position, but not about the forces acting on the ship.
“The surrounding water has an enormous impact on the ship and how it behaves. A ship sailing in shallow water will have a totally different turning circle compared to deep water, to a magnitude of five. So the diameter you need to make a circle with your ship will be five times bigger in shallow water compared to deep water.”
This is where the bank effect comes into play in the Suez Canal disaster. According to Lataire, the crew manning the ship will have been aware something was going wrong, but by then it would have been too late.
“Once you are there, it’s lost. You have to not get into that situation in the first place,” said Lataire.
On-board Edge computing could be enough to alert the crew, and the ship management company, that an issue was going to arise, but it is not yet able to predict, nor prevent, the outcome.
Lataire’s research generates calculations that can then be used to create simulations - but this isn’t currently possible in real-time on the ship. Lataire believes that autonomous ships will come to fruition, but in the near future they will be limited to small city boats or to simple journeys like those taken by ferries. In the more distant future, this could expand.
The ‘manless’ ship is still a work in progress, but the digitized and ‘smart’ ship is widely in practice. By combining on-board Edge computing, on-shore and on-premise computing, the cloud, and VSAT connectivity via geostationary satellites, the ships themselves and those controlling them can make data-driven decisions.
Until we can find a solution to the latency problem for ships, full automation will remain a pipe dream, and sailors will keep their jobs. But with technological advances, it may only be a matter of time.