Ever since data centers emerged as a distinct class of infrastructure, financial organizations have been among their most demanding clients, pushing technology vendors to develop new hardware and software. It’s fair to say that financial services have had a major hand in shaping data centers - and many of the developments inside their doors.
When colocation data centers appeared, the major financial centers such as London, Frankfurt and New York were among the earliest locations. As the industry developed, more processing power was required to carry out the calculations powering increasingly complicated instruments and services.
At the forefront
Financial organizations were also among the first to make the reliability demands that have led to the expanding science of backup and disaster recovery. A financial services operation can lose millions for every minute of downtime, so a high level of investment in products that can prevent that is easily justified.
Performance was obviously important - but network speeds became particularly vital. Data had to be transmitted in vast quantities, and with low latency. It became very important to close deals quickly before the markets changed. A few milliseconds of difference in transmission times between one financial center and another can be crucial - a fact which has led to more than one new investment in submarine fiber optic cables across the Atlantic and other oceans.
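To see why those milliseconds matter, a back-of-the-envelope calculation helps. The figures below are illustrative assumptions - a rough transatlantic cable length and a typical refractive index for silica fiber - not specifications of any particular cable:

```python
# Sketch of transatlantic fiber latency under assumed figures.
SPEED_OF_LIGHT_KM_S = 299_792      # speed of light in vacuum
FIBER_REFRACTIVE_INDEX = 1.47      # typical silica fiber (assumed)
CABLE_LENGTH_KM = 6_000            # rough London-New York run (assumed)

# Light travels slower in glass than in vacuum.
signal_speed = SPEED_OF_LIGHT_KM_S / FIBER_REFRACTIVE_INDEX  # ~204,000 km/s

one_way_ms = CABLE_LENGTH_KM / signal_speed * 1000
round_trip_ms = 2 * one_way_ms

print(f"one way:    {one_way_ms:.1f} ms")
print(f"round trip: {round_trip_ms:.1f} ms")
```

Under these assumptions the one-way trip comes out at roughly 30 ms - so a cable route even a few hundred kilometers shorter than a rival's shaves off milliseconds that, for a trading firm, are worth real money.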
And the reliability of those communications also had to be guaranteed. Cash machines (ATMs) rely on connections to core infrastructure to deliver money and advice to customers. Many of the same services have migrated to websites, which must be 100 percent reliable - and proofed against tampering.
Financial services also led the adoption of new methods of storage and processing. In a departure from their conservative public image, banks ended up spending their time and resources pursuing new technologies and turning them into a competitive advantage. Open source databases and Big Data tools like Hadoop found some of their earliest adopters amongst bankers and brokers. A system which can scan millions of transactions can potentially spot a fraud before a customer’s account is compromised.
Finance has also enabled one of the most extreme developments in current data center practice: when confidence in the world’s financial systems hit a new low after the financial crash of 2008, the concept of Bitcoin emerged.
Working in the blockchain gang
Supported by the blockchain distributed ledger system, in which all nodes store all the data, Bitcoin appeared to be using technology to sidestep the tainted systems of global finance. Miraculously, it appeared to provide something which could simultaneously serve as a secure and private medium of exchange for transactions, and a fast-appreciating commodity for speculators.
Looking into the future, it will be hard to square these two roles for cryptocurrency. The distributed trust mechanism has inherent difficulties that follow from its fundamental requirements: because every node stores every transaction, scalability is limited, and the Bitcoin blockchain can currently handle only around seven transactions per second. By contrast, payment company Visa handles some 24,000 transactions per second.
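The mechanism behind that trade-off can be sketched in a few lines. In a hash-linked ledger, each block commits to the hash of its predecessor, so verifying the chain means holding and replaying it in full - which is exactly why every node stores all the data. This is a minimal illustration only; real Bitcoin blocks also involve proof-of-work, Merkle trees and consensus rules not shown here, and the transaction strings are invented for the example:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(transactions, prev_hash):
    return {"transactions": transactions, "prev_hash": prev_hash}

def is_valid(chain):
    # Every block must reference the hash of the block before it.
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

genesis = make_block(["coinbase -> alice: 50"], prev_hash="0" * 64)
block1 = make_block(["alice -> bob: 10"], prev_hash=block_hash(genesis))
chain = [genesis, block1]

print(is_valid(chain))   # the untampered chain checks out

# Altering any earlier block changes its hash and breaks every later link.
genesis["transactions"][0] = "coinbase -> mallory: 50"
print(is_valid(chain))
```

The tamper-evidence is the strength; the cost is that every participant must carry and verify the whole history, which is where the throughput ceiling comes from.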
Despite these drawbacks, Bitcoin has become the fastest growing application in data centers worldwide. New facilities are being built to run cryptocurrency mining, and they have to do it with manic energy and ever-increasing efficiency. New cooling techniques and faster GPUs or ASICs are thrust to the fore in the hope that these shoestring sites can steal a march on their competitors.
The irony is that this competition is self-defeating. The effort yields ever-diminishing returns, along with unrestrained growth in energy consumption that could conceivably swallow all the world’s cheap electricity for no genuine benefit.
Against this background, we are seeing a somewhat predictable development. When technological advances seem to be at risk of becoming uncontrollable, legislation tends to emerge to rein in the excess.
While Bitcoin was conceived as a means to deliver financial transactions outside the traditional operators, the mainstream banking system has been responding to the aftermath of the financial crisis with a move towards tighter regulations designed to prevent a relapse. Service providers will have to raise their game, meet the new requirements, and deliver a new level of accountability.
Sarbanes-Oxley, MiFID II and the Basel rules have made specific demands on digital financial services. Not only must there be complete clarity about how transactions happen; data on infrastructure must also be collected, and reported to the authorities when required.
Those who pay the piper have always called the tune - even more so for those who write the tune and define the means by which the piper will be paid.
Finance has driven the development of data centers, and there is no reason to believe this will change any time soon.
This article appeared in the April/May issue of DCD Magazine.