Tape storage is one of those technological hangovers from the early days of computing, associated in the minds of many with rooms full of massive mainframe cabinets. Somewhat like the mainframe, tape shows no signs of going away just yet, and ironically, could even be handed a new lease of life thanks to the burgeoning volumes of data that are being accumulated in modern data centers.
Modern tape storage is a world away from the movie cliché of huge tape reels spinning backwards and forwards as the computer chomps its way through some complex computation. Today’s tape drives use cartridges capable of holding up to 15TB of data, and are more often used for backup or archiving purposes than for online storage.
Storage has changed
However, few in the data center industry can have failed to notice the rapid changes that have been taking place at the storage layer of the infrastructure. Flash-based solid state storage devices have enabled new tiers of low latency storage with higher IOPS, while hard drive makers have responded by continuing to push the storage density of rotating media, driving the cost per gigabyte ever lower.
The end result is that the cost of disk storage has fallen to a level where many businesses have begun to use disk-based backup systems where once they would have used tape drives or tape library systems. In addition, cloud-based storage services such as Amazon Glacier have proven attractive to businesses of all sizes for off-site storage of backup or archive data because of the low cost per gigabyte and the ease of transferring data over the Internet.
This does not mean that tape storage is about to vanish. For one thing, firms in regulated industries such as finance and law face strict rules requiring them to retain data and to prove that its content has not been altered. Modern tape systems offer a write-once-read-many (WORM) capability that meets this requirement, and for this reason tape remains a default choice for archiving such data.
There are other reasons why tape is likely to be around for some time, according to Clive Longbottom, service director at analyst firm Quocirca.
“The biggest one is still investment protection: the cost of large tape libraries and robotic retrieval systems is high, and just dumping these because disks are now cheap (but failure-prone) is just not a good financial argument,” he said.
“Then there is the ongoing cost. Sure, spinning disks are becoming cheaper and cheaper to acquire. However, keeping the disks spinning has a large ongoing operational cost due to the required power for spinning. A tape, once written, is close to zero cost – it holds its data until it is needed again. Hard disks can be spun down, but rarely are,” he added.
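Longbottom's power argument is easy to put rough numbers on. The Python sketch below is a back-of-envelope comparison; the drive wattage, drive capacity and electricity price are illustrative assumptions, not figures from the article.

```python
# Rough comparison of the power cost of keeping data on spinning disk versus
# a written tape cartridge sitting on a shelf. All inputs are assumptions.

DISK_WATTS = 7.0          # assumed average draw of one nearline HDD, in watts
DISK_CAPACITY_TB = 10.0   # assumed capacity per drive, in TB
POWER_PRICE_KWH = 0.12    # assumed electricity price, $/kWh
HOURS_PER_YEAR = 24 * 365

def disk_power_cost_per_tb_year() -> float:
    """Electricity cost of keeping 1TB spinning for a year (assumptions above)."""
    kwh_per_drive = DISK_WATTS * HOURS_PER_YEAR / 1000
    return kwh_per_drive * POWER_PRICE_KWH / DISK_CAPACITY_TB

def tape_power_cost_per_tb_year() -> float:
    """A written cartridge on a shelf draws no power at all."""
    return 0.0

print(f"Disk: ${disk_power_cost_per_tb_year():.2f} per TB-year in power")
print(f"Tape: ${tape_power_cost_per_tb_year():.2f} per TB-year in power")
```

Even under these modest assumptions the disk figure is non-zero for every terabyte, every year, which is the heart of the "keeping the disks spinning" point.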
Keeping archiving affordable
Meanwhile, the shift towards cloud-based storage services has simply moved the problem from the business to the cloud service provider. While the enterprise tape market has declined each year, cloud service providers are turning to tape as the most economical way to back up the ever-expanding volumes of customer data they hold, or to deliver archive services to customers directly.
“Cloud providers have a bit of a problem: they have put heavy focus on the incredible scale of their storage capabilities. The trouble is that customers have fallen for the message. Therefore, the big players are looking at a need for zettabytes of storage capability to meet customer expectations,” said Longbottom.
Fortunately, a large proportion of this data is unlikely ever to be accessed again. If a service provider can work out which data is likely to be needed, that data can stay on disk while the bulk is written to tape, with SLAs stipulating that some data may take an hour or more to recover – a tiering decision of the kind sketched below.
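As a rough illustration of that sort of policy, the sketch below routes objects to disk or tape based on how recently they were last read. The 90-day threshold and the sample catalogue entries are assumptions invented for the example, not anything a particular provider has disclosed.

```python
from datetime import datetime, timedelta

# Toy tiering policy: recently read data stays on disk, cold data becomes a
# candidate for tape. The threshold is an illustrative assumption.
HOT_WINDOW = timedelta(days=90)

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Return 'disk' for recently accessed data, 'tape' for cold data."""
    return "disk" if now - last_access < HOT_WINDOW else "tape"

# Hypothetical catalogue entries: object name -> last access time.
objects = {
    "q2-billing-reports.tar": datetime(2017, 7, 30),
    "decommissioned-vm-images.img": datetime(2015, 1, 12),
}

now = datetime(2017, 8, 15)
for name, last_access in objects.items():
    print(f"{name}: {choose_tier(last_access, now)}")
```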
Amazon does not say what technology its Glacier service uses, but it is widely believed to be based on tape, simply because the retrieval times quoted to customers can run to several hours.
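That retrieval model shows up in the service's API: archive retrievals are asynchronous jobs rather than immediate reads. Below is a minimal sketch using boto3 against a hypothetical vault; the vault name and archive ID are placeholders, and nothing here confirms what hardware actually sits behind the service.

```python
import boto3

glacier = boto3.client("glacier")

# Ask Glacier to stage an archive for download. This is an asynchronous job;
# with the Standard tier it typically completes in a matter of hours.
job = glacier.initiate_job(
    accountId="-",                       # "-" means the calling account
    vaultName="example-archive-vault",   # hypothetical vault name
    jobParameters={
        "Type": "archive-retrieval",
        "ArchiveId": "EXAMPLE-ARCHIVE-ID",   # placeholder
        "Tier": "Standard",
    },
)

# Much later, once the job reports as completed, the data can be downloaded.
status = glacier.describe_job(accountId="-",
                              vaultName="example-archive-vault",
                              jobId=job["jobId"])
if status["Completed"]:
    output = glacier.get_job_output(accountId="-",
                                    vaultName="example-archive-vault",
                                    jobId=job["jobId"])
    data = output["body"].read()
```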
Tape is well suited for archiving or long-term storage as it offers by far the lowest price points of any storage medium, with a raw storage cost of around $0.02 per gigabyte, and also boasts a potential longevity of several decades if stored under conditions of low temperature and humidity.
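At that price point the raw media cost of even very large archives stays modest. The quick comparison below uses the $0.02 per gigabyte figure above; the disk price alongside it is an assumption included purely for contrast.

```python
# Raw media cost for an archive, using the ~$0.02/GB tape figure quoted above.
TAPE_USD_PER_GB = 0.02
DISK_USD_PER_GB = 0.04   # assumed nearline HDD price, for comparison only

def media_cost_usd(petabytes: float, usd_per_gb: float) -> float:
    """Raw media cost in dollars for the given capacity in petabytes."""
    return petabytes * 1_000_000 * usd_per_gb

for pb in (1, 10, 100):
    tape = media_cost_usd(pb, TAPE_USD_PER_GB)
    disk = media_cost_usd(pb, DISK_USD_PER_GB)
    print(f"{pb:>3} PB: tape ${tape:>12,.0f}   disk ${disk:>12,.0f}")
```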
In the past there were many competing tape formats, but these have largely given way to Linear Tape-Open (LTO), which was developed as an open standard not controlled by any single vendor. IBM and Oracle still have their own proprietary formats while also supporting LTO.
LTO has been through multiple iterations, with LTO-7, introduced in 2015, delivering a native capacity of 6TB per cartridge, or up to 15TB with data compression. The next generation, LTO-8, is expected later this year or early next year, and is anticipated to boost native capacity to around 12.8TB, with up to 32TB possible using compression.
IBM’s 3592 series of tape systems has likewise been through multiple iterations, but the firm has recently introduced the sixth generation in the shape of the TS1155 Tape Drive, which offers a native capacity of 15TB, or up to 45TB using the 3:1 compression ratio that IBM quotes for the technology.
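The compressed figures follow directly from the native capacities and the ratios quoted – roughly 2.5:1 for LTO (implied by the 6TB/15TB LTO-7 numbers) and 3:1 for IBM's 3592 family – though real-world results depend entirely on how compressible the data is. A quick check:

```python
# Compressed capacity implied by native capacity and the quoted ratio.
def compressed_tb(native_tb: float, ratio: float) -> float:
    return native_tb * ratio

print(f"LTO-7:  {compressed_tb(6, 2.5):.0f} TB")     # ~15 TB
print(f"LTO-8:  {compressed_tb(12.8, 2.5):.0f} TB")  # ~32 TB (roadmap figure)
print(f"TS1155: {compressed_tb(15, 3):.0f} TB")      # ~45 TB
```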
There is no sign yet of an end to rising tape capacities. Most recently (July 2017), IBM and Sony pushed the record areal density to 201 gigabits per square inch in an experimental process using sputter-deposited media, new tape heads and an improved lubricant. This could translate into a theoretical maximum of 330TB in a single standard palm-sized tape cartridge – around half the physical size of a 60TB SSD.
Compatibility is a key concern for technologies that will be used for long-term archival of information. For this reason, the LTO Consortium enforces strict rules to ensure that any LTO drive can read cartridges from the two preceding generations as well as its own, and can write data to cartridges from the previous generation. IBM’s TS1155, for instance, supports existing JD and JC tape cartridges.
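Expressed programmatically, the compatibility rule described above comes down to two small checks – a sketch, assuming the read-two-back/write-one-back rule holds for the generations in question:

```python
# LTO backward-compatibility rule as described above: a drive reads its own
# generation plus the two before it, and writes its own plus the one before it.
def can_read(drive_gen: int, cartridge_gen: int) -> bool:
    return drive_gen - 2 <= cartridge_gen <= drive_gen

def can_write(drive_gen: int, cartridge_gen: int) -> bool:
    return drive_gen - 1 <= cartridge_gen <= drive_gen

# An LTO-7 drive, for example, reads LTO-5/6/7 but only writes LTO-6/7.
assert can_read(7, 5) and can_read(7, 6) and can_read(7, 7)
assert not can_read(7, 4)
assert can_write(7, 6) and not can_write(7, 5)
```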
If tape vendors can continue to boost storage density and keep the price per gigabyte at rock-bottom levels, there is no reason why the old medium should not remain in use for backup and archiving for several more decades.
“An enterprise with just less than a petabyte of data should focus on disk-based backup and archive. Greater than that, and I’d be looking at how and where tape could possibly play,” said Longbottom.
This article appeared in the August/September issue of DCD Magazine.