The recent US bank attacks spell concern for the Cloud, with hackers targeting servers to get more bang for their buck
16 January 2013 by Penny Jones - DatacenterDynamics
Some in the US point the finger at Iran. A group called the Izz ad-Din al-Qassam Cyber Fighters, citing the US Government’s failure to remove an anti-Muslim video called the Innocence of Muslims, has claimed responsibility for the recent Distributed Denial of Service (DDoS) attacks that have brought down US banking sites since September.
But the identity of the perpetrators behind these recent events is only of secondary concern in this story, which has been gracing US headlines for months now. This is because, for the banks that were attacked, the problem currently lies in the Cloud.
Since September last year, the attacks have affected some of the world’s biggest banking names, including Wells Fargo, Bank of America, Citigroup and HSBC. The attackers did not make off with personal data or commit any form of fraud, but they did move DDoS off the PC and onto the remote server, where they could push forward with new, improved artillery powered by faster performance and more and better network connections.
Those who point the finger at Iran cite the sophistication of the attack, but security company Imperva’s CTO and co-founder Amichai Shulman says that, to some extent, launching an attack from the server, especially when the Cloud is involved, can be easier and, even more importantly, more cost effective.
“Basically the attackers still use compromised PCs. They use these PCs to search for vulnerable servers and then exploit these, injecting code into the server so that from that time on, the attackers control the servers from a central location, usually behind an anonymizer,” Shulman says.
If the attack relied only on PCs, Shulman says, 10 to 100 times more compromised PCs than servers would be required to launch an attack of a similar magnitude.
“It is more complex managing 100,000 PCs or even 10,000 than managing those compromised servers. Once they can reduce the management complexity they can reduce costs and increase their ability to launch operations on a more frequent basis.”
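Shulman’s arithmetic can be sketched with a back-of-the-envelope comparison. The per-machine upload rates below are illustrative assumptions for the sketch, not figures quoted by Imperva; a well-connected hosted server simply has far more uplink than a consumer PC:

```python
# Illustrative comparison of botnet sizes needed to generate a given
# DDoS flood. All three figures are assumed values for the sketch,
# not numbers reported in the article.

TARGET_FLOOD_GBPS = 70.0    # size of flood to generate (assumed)
PC_UPLOAD_GBPS = 0.002      # ~2 Mbps consumer uplink (assumed)
SERVER_UPLOAD_GBPS = 0.1    # ~100 Mbps hosted-server uplink (assumed)

pcs_needed = TARGET_FLOOD_GBPS / PC_UPLOAD_GBPS
servers_needed = TARGET_FLOOD_GBPS / SERVER_UPLOAD_GBPS

print(f"Compromised PCs needed: {pcs_needed:,.0f}")        # 35,000
print(f"Compromised servers needed: {servers_needed:,.0f}")  # 700
print(f"Ratio: {pcs_needed / servers_needed:.0f}x")          # 50x
```

With these assumed rates the ratio lands at 50x, inside the 10-to-100-times range Shulman describes, which is also why managing the server fleet is so much cheaper than herding tens of thousands of PCs.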
According to security firm Radware’s VP of Security Solutions Carl Herberger, speaking with American Banker, banks have never seen such large-scale DDoS attacks. Radware has been working with banks and cloud computing providers following the attacks, which have grown alongside the financial services industry’s increased adoption of cloud computing.
Herberger says one unnamed bank with enough internet capacity to handle 40bn bytes of data saw nearly twice that amount of traffic as a result of the DDoS onslaught.
"The multiplying of the flood is unbelievable," Herberger told American Banker. "Their servers, processors and offloading devices simply could not handle this problem."
Has this not been thought of before?
Security, you would think, would always be a top concern for a financial services player. But the Cloud has made security a much harder promise to keep, according to both Shulman and Herberger. “Cloud increases the risk because it is easier to use by the attackers and harder to mitigate by the bankers,” Shulman says.
Herberger says the main problem comes from banks’ leasing of cloud services, an approach that ties together the facilities of the banks and cloud computing providers. This makes it more difficult to block traffic from a particular internet address when an organization comes under cyber attack. He says eventually such attacks could be used as a distraction for more malicious and fraudulent activity.
Shulman says in the past, banks (which are no strangers to DDoS attacks) have overcome the DDoS threat by provisioning more bandwidth. “But you cannot over allocate network bandwidth just because there might be the possibility of someone launching a large attack at some time. It is just too costly,” Shulman says.
“The bank’s primary risk is its data set, or financial fraud, and they are well prepared for that. But this is another technique coming up, and the threat is a very real threat. One thing to remember though is that while these banks have suffered from the recent attacks, there wasn’t a single attack that actually took down one of the banking applications for an entire day.”
A new challenge
This could be good news but Shulman says in the world of the hacker it can also mean another challenge – and that, in the long run, means more persistent attacks.
Shulman says Imperva has been studying this new trend in its own labs and that every day he sees attackers targeting a new vulnerable type of server, often finding hundreds or thousands of potential victims. “They keep collecting compromised servers, and in some cases they will lose some – but it means for the industry overall there is clearly a higher risk,” Shulman says.
Shulman says the recent attacks highlight the risk to anyone using a web service, right down to the small and medium-sized business user. “If you have a web server or web application in the enterprise, you are going to be the target of attackers, even if you don’t have valuable information in your server. Just having enough bandwidth and the server makes you a target,” he says.
In some instances the trade-off for added security will have to be latency, as data travels through more security layers. “The consequence could be that all traffic going in and out of a compromised server would eventually be blocked by security devices along the way,” Shulman says. The real question, at least for now, is how latency will weigh against denied access when services are given a long-term view.