

Is your data center infrastructure future-proof?


Don Beaty from DLB Associates outlines the challenges of long-term planning for data center engineers

The world is rapidly changing, and the tension in the data center industry has definitely ratcheted up to the point where it is creating toxic conflicts, according to professional engineer Don Beaty. If anything, the perennial problem of facility managers not talking to the IT professionals – and vice versa – has deepened and coalesced around the three disparate domains of hardware, software and IT, he said.

Beaty founded DLB Associates in 1980 and remains its president, giving him an insider's bird's-eye view of the evolving data center. To put things into perspective, the global firm has designed, commissioned and provided operations support services for data center clients both small and large, including ten of the largest Google data center campuses worldwide.

DCD spoke to Beaty to better understand the considerable challenges of designing a modern data center and how to overcome them.

Don Beaty, founder and president of DLB Associates

Inherent tension between hardware and software

A key contributor to this tension is the fact that the three areas identified above have different life cycles, said Beaty. Where facilities and hardware are designed and built to last for years, software updates can be pushed out to servers at any time, he noted, drawing a comparison to the frequent software updates that smartphone users receive.

“Every time there is an update in one of those, it changes the demand on the hardware and the demand on the software. And so the life cycle mismatch creates major tension that is difficult to resolve,” he explained.

Of course, an ability to accurately predict future requirements could offer a way to better align the disparate life cycles that Beaty identified. Yet aside from the fact that it is pretty much impossible to know how IT requirements will look 10 years down the road, doing so also seems pointless when one considers the industry's preference for upgrading over paying up-front to meet future needs.

Moreover, the resource demand imposed by a particular piece of software can vary dramatically depending on consumer demand, or how popular it proves to be, said Beaty. This unpredictability further exacerbates the difficulty of figuring out the "worst case scenarios" for a data center during the design phase.

Though he was careful not to call the problem an insurmountable one, Beaty did emphasize that finding a solution would necessitate changing the way the industry thinks, and being able to look at it from a different perspective.

“I think it creates a significant pressure on those involved because it means that they need to be much more knowledgeable of all the things that are going on, and in the way of hardware and software changes,” he told DCD.

The impact of new technologies

In addition, new deployment paradigms and technologies are further muddying the picture. For example, Beaty said that the orchestration and abstraction of compute workloads are making themselves felt in the data center.

“If you talk to people that design facilities, you will never hear them say orchestration. They don’t typically use the word abstraction. Yet orchestration and abstraction are really the driving force behind everything,” said Beaty.

Other factors he ticked off include the impact of the Internet of Things (IoT) in the data center, as well as cloud-centric software stacks such as OpenStack. Deployment strategies were mentioned too, whether public, private or hybrid cloud.

“There’s nothing to say, based on smart grid and IoT and this kind of stuff… it’s going to double the kW, double the racks,” he observed. “There are no metrics. No one is really able to translate that into what that means.”

Beaty pointed out that these metrics could also be very different depending on business verticals and geographical region. But what about opting for standardized hardware, or turn-key designs or solutions that can be purchased and deployed as-is without incurring additional consultation or design fees?

“The argument that people will make for having a manufactured product is exactly the argument that can be made against it,” he said. “A product, fundamentally, is a business that focuses on investing in or developing a product, and selling it so that the product has good ROI [return on investment].”

On that basis, Beaty argued that no one is going to design a product with an ROI of “one month”, alluding to how manufacturers will necessarily have to incorporate trade-offs that data center operators must identify before deciding whether they are acceptable. “The bottom line is there is no perfect solution,” he said.

Affordable future proofing

It is tempting to write Beaty off as a doomsayer for all the difficulties and gloom that he has highlighted, except that he offered a suggestion that may just work. According to him, data centers need to be designed with what he called “affordable future proofing”, which treads the line between doing nothing at all and attempting to somehow predict software and IT demands five to 10 years down the road.

As he tells it, the linchpin of his strategy is communication with the relevant software or DevOps people, who have good visibility into what is going to happen in the near to mid-term (one to three years). While the information they can provide will likely be high-level in nature, Beaty thinks it will be adequate for a data center designer to translate and incorporate these insights into the design of a future-proofed data center.

As it is, the design will be done “not with full expenditure, but with some sort of future-proofing, some sort of provisions,” he explained. This presumably helps keep costs down, while offering ample leeway to upgrade at the right time.

Ultimately, the bottom line is that facilities can no longer be seen as a stand-alone discipline that exists in a silo.

“I think what that means is that we are in a new era. Where everyone needs to become much more familiar with the other domains,” said Beaty. “Especially the facilities people, they really need to be well versed in hardware and software. So that they can help create affordable future-proofing.”

You can hear more from Don Beaty on the topic of data center infrastructure at the DCD Converged conference in Singapore next week. DCD Converged SE Asia takes place in Marina Bay Sands Singapore on 15 to 16 September, as part of Singapore Datacenter Week. More information on the multi-track conference programme and the expo can be found online and requests for information can be directed to Stephanie Chiang.


