
Moftakhari's team developing more efficient tools for data center cooling
Wednesday, February 11, 2026
Media Contact: Tanner Holubar | Communications Specialist | 405-744-2065 | tanner.holubar@okstate.edu
With vast amounts of data being generated and the continued rise of artificial intelligence, data centers are becoming increasingly critical. In the United States alone, data centers contribute an estimated $2 trillion to the economy.
There is also an increased need to make data center cooling systems smarter and more resilient. This is the topic of a three-year research project in the College of Engineering, Architecture and Technology’s School of Mechanical and Aerospace Engineering, funded by the National Science Foundation and in collaboration with Pennsylvania State University.
Dr. Ardeshir Moftakhari, an assistant professor in MAE, is the Oklahoma State University lead on a project titled “Collaborative Research: Novel Detection and Diagnostics Approach for Robust Design and Reliable Operation of Critical Cooling Infrastructure in Data Centers.”
Data centers play a key role in national security, public health, and data protection for government agencies, industries and academic institutions.
But increased reliance on data centers means they operate around the clock. Thousands of servers generate heat that must be removed, and cooling accounts for nearly half of a data center’s energy consumption. Data centers also account for roughly 4% of total electricity use in the United States, a share expected to rise to 12% by 2030 as demand for digital services increases.
This research will combine expertise in mechanical engineering, control systems and computer science. Moftakhari is collaborating with Drs. Romulo Meira-Goes and Wangda Zuo of PSU, as well as Lawrence Berkeley National Laboratory, Schneider Electric and HVAC manufacturers, to ensure the research is rooted in practical needs and moves seamlessly from the lab to real data centers.
Servers only function within a certain temperature range, and every computation generates heat. If servers are not reliably and consistently cooled, data and equipment are put at serious risk.
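As a back-of-the-envelope illustration (this sketch is ours, not the project’s), the temperature T of a server room can be described by a lumped heat balance, where P_IT is the heat generated by the servers, Q_cool is the heat removed by the cooling system, and C is the room’s thermal capacitance:

C \frac{dT}{dt} = P_{IT}(t) - Q_{cool}(t)

Whenever Q_cool falls below P_IT, say after a chiller fault, the temperature climbs until servers throttle or shut down; the larger the gap, the faster the climb.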

One problem can trigger a chain reaction that shuts down a facility within minutes.
“Overheating can damage servers, corrupt data and interrupt critical services, leading to millions of dollars in losses and extended downtime,” Moftakhari said. “Even more concerning is the risk of ‘cascading failures,’ where one small fault triggers a chain reaction across the entire facility. For example, a mechanical failure or a cyberattack that manipulates sensors or controllers can disrupt cooling operation, quickly spreading overheating throughout the data center.”
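To make the idea of a cascade concrete, here is a minimal sketch (our illustration, not the project’s code) that propagates a fault through a hypothetical dependency graph, where each cooling component serves a set of downstream equipment:

from collections import deque

# Hypothetical dependency graph: each cooling component lists what relies on it.
# A failure spreads to everything downstream of the failed component.
DEPENDENTS = {
    "chiller_1": ["crah_1", "crah_2"],  # chiller feeds two CRAH units
    "crah_1": ["rack_row_A"],           # each CRAH cools a row of racks
    "crah_2": ["rack_row_B"],
    "rack_row_A": [],
    "rack_row_B": [],
}

def cascade(initial_fault: str) -> list[str]:
    """Breadth-first walk of everything affected by one initial fault."""
    affected, queue = {initial_fault}, deque([initial_fault])
    while queue:
        component = queue.popleft()
        for downstream in DEPENDENTS.get(component, []):
            if downstream not in affected:
                affected.add(downstream)
                queue.append(downstream)
    return sorted(affected)

print(cascade("chiller_1"))
# ['chiller_1', 'crah_1', 'crah_2', 'rack_row_A', 'rack_row_B']

In this toy graph, a single chiller fault reaches every rack row; in a real facility, understanding that propagation path is what lets a controller contain it.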
Moftakhari’s team is developing advanced cooling technologies and intelligent fault detection systems for data centers. These will help catch problems earlier and identify faults with the potential to compromise operations. The tools will also help operators understand how faults spread through a cooling system and automatically adjust cooling operations in real time.
“This means the cooling system can stay stable and effective during major disruptions — whether they’re caused by equipment failures, power issues or even cyber-related problems,” Moftakhari said. “The goal is to make data center cooling robust and prevent minor issues from escalating into major, large-scale failures — keeping data centers safe, reliable and continuously operational even under unexpected or adverse conditions.”
Data center cooling systems are typically controlled using automatic rule-based logic or data-driven, AI-based approaches. But even these sophisticated methods often require massive amounts of training data and technical expertise, and they can only respond to problems they have seen before.
“Our approach is different. Instead of just reacting when something looks ‘off,’ it’s designed to understand how the whole cooling system works, how one problem can trigger others, and how to adjust the system in real time to compensate for the disturbance,” Moftakhari said. “That means it can stay calm during disruptions, adapt on the fly, stop problems from snowballing, and keep them running when it matters most.”
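The difference can be illustrated with a toy example (our own simplification under assumed thresholds, not the project’s controller). A rule-based system waits until a reading crosses a fixed limit; a model-aware system compares each reading against what a physics model predicts and flags the drift early:

# Toy comparison: reactive threshold rule vs. model-aware residual check.
TEMP_LIMIT_C = 32.0        # rule-based alarm threshold (assumed)
RESIDUAL_LIMIT_C = 1.5     # allowed gap between model and measurement (assumed)

def rule_based_alarm(measured_c: float) -> bool:
    # Fires only after the room is already too hot.
    return measured_c > TEMP_LIMIT_C

def model_aware_alarm(measured_c: float, predicted_c: float) -> bool:
    # Fires as soon as reality drifts from what the model says should happen,
    # even while the absolute temperature still looks acceptable.
    return abs(measured_c - predicted_c) > RESIDUAL_LIMIT_C

# A slowly failing chiller: the model expects 24 C, sensors creep upward.
for measured in (24.2, 25.1, 26.4, 28.0, 30.5, 33.1):
    print(measured, rule_based_alarm(measured), model_aware_alarm(measured, 24.0))
# The model-aware check flags the fault at 26.4 C; the rule waits until 33.1 C.

The early warning is what buys time to compensate before a small fault snowballs.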
Moftakhari’s team will create a digital twin framework to test and validate their process. A digital twin is a virtual representation of a physical object. This allows the team to run realistic simulations to see how quickly a data center could shut down if certain cooling conditions are not met.
With this digital twin, they will safely explore how the cooling system and fault-detection controls perform under realistic and extreme scenarios. They will combine this with hands-on, hardware-in-the-loop testing in OSU labs to ensure the solutions are also validated under real-world conditions.
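A minimal sketch of what such a simulation might look like (our own toy model built on the heat balance above, with made-up parameters, not the team’s digital twin): it steps the room temperature forward after a cooling failure and reports how many minutes remain before servers hit their thermal limit.

# Toy "digital twin": lumped heat balance C*dT/dt = P_it - Q_cool.
# All numbers are illustrative assumptions, not measurements.
P_IT = 500_000.0      # server heat load, W (assumed 500 kW)
C_TH = 2.0e7          # room thermal capacitance, J/K (assumed)
T_LIMIT = 35.0        # server inlet temperature limit, deg C (assumed)
DT = 1.0              # simulation time step, s

def minutes_to_limit(t_start: float, q_cool: float) -> float:
    """Step the heat balance forward until the thermal limit is reached."""
    if q_cool >= P_IT:
        return float("inf")  # cooling keeps up; the limit is never reached
    temp, seconds = t_start, 0.0
    while temp < T_LIMIT:
        temp += (P_IT - q_cool) * DT / C_TH
        seconds += DT
    return seconds / 60.0

# Total cooling loss from a 24 C starting point:
print(f"{minutes_to_limit(24.0, q_cool=0.0):.1f} minutes to the limit")
# With half the cooling still running, the climb is twice as slow:
print(f"{minutes_to_limit(24.0, q_cool=P_IT / 2):.1f} minutes to the limit")

Under these assumed numbers a full cooling loss leaves only about seven minutes, which is why the project pairs simulation with hardware-in-the-loop testing rather than experimenting on live facilities.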
“The digital tool gives us a full, system-wide view of the data center, helping pinpoint where problems are most likely to start and how a single issue could trigger chain reactions or unexpected failures in the cooling system,” Moftakhari said. “At the same time, the physical testbed lets us keep a close eye on real equipment, test responses safely and fix or replace components before they become a problem, making the whole facility easier to monitor, maintain and protect.”
The digital twin being developed through this research lays the groundwork for the next generation of data center cooling modeling and analysis tools. Designers, operators and manufacturers will have better ways to test ideas, evaluate risks and ensure systems are designed to withstand rare failures or cyber-related disruptions.
In the years ahead, data centers will face greater computing demands, energy constraints and concerns about reliability and security. This research aims to make data center cooling more intelligent, adaptive and resilient to unforeseen challenges.
"As our everyday lives become more digital, data centers are powering everything from cloud services and streaming to artificial intelligence, health care systems and national security infrastructure,” Moftakhari said. “At the same time, these facilities are getting larger, more complex, and more energy-intensive, which makes keeping them cool safely and efficiently a much bigger challenge.”