Simulation Helps Data Centers Manage Three Key Industry Challenges
In this edition of Voices of the Industry, Sherman Ikemoto, Managing Director at Future Facilities explains how simulation can help data center operators manage the accelerating pace of business, rising densities, and energy consumption.
The modern data center industry has three painful absolutes: the accelerating pace of business, rising densities, and overwhelming energy consumption.
While these three trends may initially seem like isolated challenges, I’d like to explore how they relate to and influence one another, and why operations teams that traditionally work independently, particularly at legacy enterprise, government, and colocation data centers, may be undermining each other’s work more than they realize.
Ultimately, I believe the way these trends intertwine shows that the need for simulation, and for a new approach to data center management, has never been greater. That approach must be grounded in an understanding of how data center management silos affect one another and the ultimate objectives of data center operations.
Accelerating Pace of Business
The accelerating pace of business and rising densities are trends that drive each other. Simply put, the pace of business is now the market demand to move more of our lives into the virtual world.
The world’s digital transformation pushes IT manufacturers to keep pace with Moore’s Law, commonly paraphrased as a halving of the cost of computing capacity roughly every 18 months, and to produce a seemingly never-ending sequence of cheaper, more powerful computing products. That sequence has led us to where we are today: what we are calling high-density IT equipment.
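As a rough illustration of that cost curve (the 18-month halving period is the common paraphrase used above, not Moore's original formulation, and the figures below are just the resulting arithmetic):

```python
# Sketch: relative cost of a fixed unit of computing capacity, assuming
# cost halves every 18 months (a common paraphrase of Moore's Law).
HALVING_MONTHS = 18

def relative_cost(months: float) -> float:
    """Cost of a fixed amount of compute, relative to today (1.0)."""
    return 0.5 ** (months / HALVING_MONTHS)

# A decade of halving leaves the same capacity at about 1% of its
# original cost, which is why densities keep climbing.
for years in (1.5, 3, 6, 10):
    print(f"after {years:>4} years: {relative_cost(years * 12):.4f}x")
```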
The demand for digital transformation increases, which drives the pace of application and IT hardware development, which in turn drives up manufacturing densities. We are in a cycle described by Jevons paradox: as computing becomes cheaper and more efficient, total consumption rises rather than falls, and the trends reinforce each other.
We’re seeing paradigm shifts in density that couldn’t be contemplated a decade ago. The problem here, however, is not just the new high-density equipment that operators must figure out how to implement in their data center, but the variation of equipment density across any one given data center.
Take, for example, a legacy government data center. The density in one zone of the room managing website traffic could be relatively low, while in another zone that hosts AI or ML applications, it could be running anywhere from 15 to 20 kW per cabinet. The original data center layout likely never accounted for densities of such magnitude and with such variation across the white space.
To clarify, the power density might match what the data center design expected on average, but the variation across racks, servers, and zones does not. Breaking from the original design assumptions pits data center management silos against each other, which ultimately puts IT at risk and slows service delivery. Operations teams have far more to think about and coordinate, creating a significant ripple effect across the data center.
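The average-versus-zone distinction is easy to see with numbers. In this sketch (all figures are hypothetical, chosen only to illustrate the point), the hall as a whole sits comfortably under its uniform design density while one AI/ML zone runs at triple it:

```python
# Hypothetical hall: zone name -> (rack count, kW per rack).
zones = {
    "web frontends": (40, 3.0),
    "storage":       (30, 4.0),
    "AI/ML":         (10, 18.0),  # far beyond the original design point
}
DESIGN_KW_PER_RACK = 6.0  # assumed uniform design density

total_kw = sum(n * kw for n, kw in zones.values())
total_racks = sum(n for n, _ in zones.values())
avg = total_kw / total_racks  # looks fine on average...

# ...while individual zones blow through the design assumption.
over = [z for z, (_, kw) in zones.items() if kw > DESIGN_KW_PER_RACK]
print(f"hall average: {avg:.2f} kW/rack (design: {DESIGN_KW_PER_RACK})")
print("zones exceeding design density:", over)
```

A report that only tracks the hall-wide average would miss the overloaded zone entirely, which is exactly the blind spot that siloed operations create.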
Increasing energy consumption is a byproduct of this paradox. More demand and more powerful IT systems mean more energy use. The evolution of IT technologies from discrete to converged to hyper-converged architectures has driven up energy consumption across the data center industry.
However, it’s worth noting that out of this paradox come greater benefits to humanity per unit of energy used. That, in and of itself, is good.
I’d like to raise a question that I think we should start exploring as an industry: are the potential benefits great enough to offset the negative impact of energy consumption? In other words, will our data centers become capable enough to help us solve the climate crisis before we wipe ourselves out?
Data center simulation, or creating a digital twin of a data center, presents a path forward. Any change—high-density IT implementation, new cooling technology, you name it—made to a data center can first be simulated. By virtualizing the data center in this way, you can visualize and ultimately optimize its performance.
This means that you can ensure every data center is working as efficiently as possible, maximizing the use of available capacity without risk to IT equipment. This not only makes the best use of available company resources but also enables teams to avoid having to build a new data center before absolutely necessary. Because, really, the greenest data center is the one you never have to build.
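Commercial digital twins model airflow and thermals in detail (typically via CFD), which is well beyond a few lines of code. Purely as a hedged illustration of the "test the change virtually first" idea, a power-budget-only what-if check might look like this (all names and numbers are hypothetical):

```python
# Toy "what-if" check before deploying a high-density cabinet. A real
# digital twin simulates airflow and cooling; this sketch only checks
# electrical headroom per zone, to illustrate the workflow.
from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    budget_kw: float  # capacity provisioned for the zone
    load_kw: float    # current IT load

    def can_accept(self, extra_kw: float, margin: float = 0.9) -> bool:
        """True if the new load stays under `margin` of the zone budget."""
        return self.load_kw + extra_kw <= margin * self.budget_kw

zones = [Zone("row A", budget_kw=60.0, load_kw=48.0),
         Zone("row B", budget_kw=60.0, load_kw=31.0)]

new_rack_kw = 18.0  # proposed high-density cabinet
for z in zones:
    verdict = "OK" if z.can_accept(new_rack_kw) else "would exceed headroom"
    print(f"{z.name}: {verdict}")
```

The point is the workflow, not the model: every proposed change is evaluated against the virtual facility first, so the risky placement is caught before anyone touches the live floor.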
Sherman Ikemoto is Managing Director at Future Facilities. Future Facilities provides software that allows users to create digital twins of their data centers. Contact them to learn more about the company’s simulation and virtualization products.