Data centers are thirsty for Texas’ water, but state planners don’t know how much they will need
ARLINGTON — Sai Abhideep Pundla has been awake since 3 a.m. After a red-eye flight from Las Vegas, where he briefed data center company executives and local government officials about the future of artificial intelligence, he’s back in a lab at UT-Arlington, tinkering with a prototype he thinks could solve one of the industry’s biggest challenges: how to keep data centers cool without draining finite water supplies.
Pundla, a doctoral candidate in engineering, is testing a system that cools the computer servers using a recirculating chemical refrigerant instead of water.
It’s a timely innovation. Texas is building dozens of massive data centers — some as large as New York’s Central Park — and experts say they’re expected to guzzle millions of gallons of water a year in a state facing an increasingly urgent water crisis.
Every photo snapped, message sent or Google search generates data, which has to go somewhere. That “somewhere” is a data center.
These massive facilities, filled with servers that store and process everything we do online, keep our digital lives intact. But keeping all that data running requires electricity to power the servers and cooling systems to keep the equipment from malfunctioning. Both require water.
While data centers currently consume a small portion of the state’s total water supply, according to the state’s water plan, some researchers warn that with droughts and population growth, data centers have the potential to help push water supplies to the brink — especially in Texas’ more arid regions.
The Houston Advanced Research Center, an independent nonprofit research organization focusing on sustainability solutions, estimates that existing data centers in Texas will consume approximately 25 billion gallons of water, or 0.4% of the state’s total water use in 2025.
By 2030, that demand could rise to 2.7% of Texas’ total annual water use. That’s the equivalent water consumption of 1.3 million average U.S. households.
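As a rough back-of-envelope check of the figures above (assuming, for simplicity, that Texas’ total water use stays roughly flat through 2030; the per-household daily rate at the end is a common U.S. estimate, not a figure from the story):

```python
# All starting numbers come from the story; the flat-statewide-total
# assumption and the ~300 gal/day household benchmark are simplifications.
gallons_2025 = 25e9        # estimated data center water use in 2025
share_2025 = 0.004         # 0.4% of the state's total water use

state_total = gallons_2025 / share_2025     # implied statewide total, ~6.25 trillion gallons
gallons_2030 = state_total * 0.027          # 2.7% share in 2030, ~169 billion gallons

# Spread over 1.3 million households:
per_household_year = gallons_2030 / 1.3e6   # ~130,000 gallons per household per year
per_household_day = per_household_year / 365  # ~356 gallons per day

print(round(gallons_2030 / 1e9), "billion gallons")
print(round(per_household_day), "gallons per household per day")
```

That daily figure sits in the same ballpark as typical estimates of average U.S. household water use, so the story’s 1.3-million-household comparison is internally consistent.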
The state currently has more than 400 data center facilities, with about 70 more on the way.
“[It] may not feel like it's a whole lot at the state level,” said Margaret Cook, a researcher studying data center water use at HARC. But if massive data centers locate in small communities, she said, they may not be able to handle big jumps in water demand.
In the Texas Panhandle, for example, Amarillo residents recently held a community event to oppose five planned data centers and inform others of the potential risks to the Ogallala Aquifer, the region’s main water source, which is being drained faster than it can be replenished.
“Our water sources aren’t as reliable as what would be needed for the data center,” said Madison Boyle, an event organizer.
This growing concern has highlighted the state’s limitations in tracking and forecasting water usage by emerging industries. In Texas, companies are required to report historical water consumption, but aren’t required to report how much water they expect to consume in the future or where it would come from, making it difficult for communities to plan or better manage their water supplies.
Other states are starting to act on data centers’ water use. California lawmakers recently passed a bill, now awaiting the governor’s signature, that would require new data centers to report their projected water use before they start operations. In Minnesota, data center developers must consult with the state’s environmental agency to make sure a proposed location has an adequate water supply.
Some data centers in other states are built near lakes, rivers or in colder climates where there’s natural free cooling. In Texas, many data centers are located in areas where water supplies already are under high stress, according to research by Yi Ding, an assistant professor at Purdue University’s School of Electrical and Computer Engineering.
“You could consume the same amount of water in Texas and Iowa,” Ding said. “But the environmental burden is different because Texas is more dry.”
In the lab in Arlington, Pundla and his colleague Braxton J. Smith flip on power to a server packed with memory modules, processor chips and fans — the same components that make up massive data centers. A hum kicks in as vapor from a special liquid cools the system. No water involved.
Pundla said the industry knows its water and energy consumption is unsustainable.
“The thermal management side of data centers is now trying to play catch up” as the industry develops bigger, more powerful computing systems that need more electricity and generate more heat, he said.
The urgency to find solutions is growing. Earlier this year, President Donald Trump announced a $500 billion federal push, dubbed Stargate, to build AI data centers and their supporting power infrastructure in Texas. Without smarter cooling, Pundla’s engineering professor Dereje Agonafer said, it may come at a steep environmental cost: intensifying water scarcity, forcing homes and farms to compete with industry and disrupting aquatic biodiversity.
How data centers use water
Data centers use water in two main ways: directly, to cool their equipment, and indirectly, through the electricity they rely on.
Data centers operating in Texas currently demand 10,234 megawatts of power, roughly enough electricity to supply more than 8 million homes, according to Baxtel, a research firm that tracks data centers and their development. Data centers are expected to help drive Texas’ power demand to nearly double by 2030.
Electricity generation also requires water, particularly to cool the natural gas, coal or nuclear power plants that generate most of the state’s electricity. That means the more electricity a data center uses, the more water is indirectly consumed.
Cook said whether data centers choose to connect to the grid or build their own power supply — some in Central Texas already are building on-site power plants — will impact their water needs for cooling.
The bigger driver of water use is the need to cool the data center’s servers, which run 24/7, generating tremendous heat that can cause servers to fail if they’re not properly cooled.
Over the years, different methods have evolved to solve the cooling problem. Traditional systems use arrays of big fans to blow cool air across the machines, but that requires a lot of electricity and the fans are notoriously loud.
Agonafer, the UT-Arlington engineering professor, said air cooling can account for up to 40% of an average data center’s total energy demand.
Seeking a more energy-efficient solution, the industry started exploring cooling servers with water. One method is through evaporative cooling: The hot air generated by servers is blown through a water-soaked material. As the water evaporates, it pulls heat from the air, which is then circulated around the servers to keep them from overheating.
There’s also liquid cooling, a method more AI data centers are adopting. Instead of cooling the air, this method cools the hardware directly. Cold plates are attached to individual chips, and a liquid — usually a mix of water and a chemical coolant — circulates through the plates, pulling heat away at the source.
There are also closed-loop cooling systems, a technique that will be used at one of the world’s largest AI data centers, now being built outside of Abilene as part of Stargate. In a closed-loop system, the cooling fluid — the Abilene project will use water — circulates in a sealed circuit. The fluid absorbs heat from the electronic components, turns into vapor, then is condensed back into liquid, continuously recirculating through the system.
The data center will initially use 8 million gallons of Abilene city water to fill the cooling system. Jeremiah Bihl, an assistant director at Abilene’s water utility, said that total is minimal compared to the city's average daily use of 22 million gallons.
After that initial fill, the system will need some additional water, but Bihl couldn’t say how much or how often. He added that occasionally the system will also need to be completely drained and refilled for maintenance. But he said the center’s water use isn’t expected to put a deep dent in the city’s overall water supply.
“We may be the white elephant of all the other AI centers out there, just because of the kind of cooling system that [the data centers] have,” he said.
Tracking how much water data centers use is complicated
Texas’ water planning process is built on historical data. Long-term projections of the state’s water needs by the Texas Water Development Board rely on past usage trends and historical reported water use.
Experts say this model can leave the state’s water supply vulnerable to fast-moving growth of new industries like data centers.
“If you're having an industry that's having some kind of exponential growth in the state or just growing leaps and bounds, that historical data doesn't give you enough to go on to plan for future use,” said Julie Nahrgang, executive director of the Water Environment Association of Texas.
For example, generative AI, such as ChatGPT, uses graphics processing units, which are more complex than typical computer processors and generate more heat — which requires more water for cooling, said Ding, the Purdue professor.
The state has encountered similar issues before. When hydraulic fracturing first took off in the state’s vast oil fields, state planners were caught off guard by the immense water demand, said Carlos Rubinstein, a water expert and a former commissioner for the Texas Commission on Environmental Quality.
He said the industry found alternatives like using brackish groundwater and reusing water.
“We managed it okay, but you can only keep managing it okay for so long,” Rubinstein said.
A similar scenario is unfolding now with data centers. There is no uniform method for reporting or tracking their water usage, and as a result, the state can’t accurately predict their future impact on an increasingly precious natural resource, according to experts.
“We have a shortage [of] water, but on the other side, we like the Texas economy,” Rubinstein said. “It's a tough balance, but we are coming to terms with the fact that water in Texas is the most limiting factor, and we need to find an answer [for meeting both].”
In 2024, the state water board tried to get a handle on data centers’ water consumption by sending surveys to nearly 70 data centers asking how much ground and surface water they used each month and who was supplying that water to them. Even though a response was required by state law, only one-third of them responded.
The consequences for noncompliance are minimal — a Class C misdemeanor with a maximum fine of $500.
In practice, the system relies on voluntary compliance or self-reporting. TWDB then validates the data in those reports. If the state knew what kind of cooling system each facility used as well as its electricity consumption, it could better estimate water consumption. But experts say companies do not always disclose that information for proprietary reasons.
In response to these challenges, the Water Development Board said it’s participating in industry conferences, reviewing new research, and working to switch data centers to another category of water user to better track them in future water plans. However, experts say data centers will likely not be reflected in state planning documents until 2032.
Cook, the HARC researcher, said she’s heard from nervous regional planners and groundwater conservation experts. Planning based on historical data “means that all of the planning that they've done is inaccurate,” Cook said. “We need to speed up.”
Doctoral students take on newer cooling technology
Inside the UT-Arlington lab, a small team is gathered around a white table where the inner workings of a data center cooling machine have been laid out piece by piece.
Unlike the Abilene data center’s closed-loop cooling system, this technology uses a chemical mixture instead of water and cools the hardware directly. The refrigerant circulates through a plate mounted directly to the processing unit, the brain of a computer. As the liquid absorbs heat, it evaporates, carrying heat away before being condensed and recirculated, forming a closed loop.
This change from liquid to gas and back to liquid is a far more efficient way to remove heat than traditional air or liquid-only cooling systems, said Agonafer, the engineering professor.
"It's kind of like a refrigerator," said Smith, the engineering doctoral student.
The new technology has cut the energy an average data center uses to cool its equipment from about 40% of total energy use to 5% during lab testing. Though the current demo involves just one server, plans are in place to scale to 40 as the team fine-tunes liquid flow rates, temperature control, and pressure to minimize energy use.
It’s part of a larger, federally backed effort to transform how data centers are cooled. With a $2.84 million grant from the U.S. Department of Energy’s COOLERCHIPS program, UT-Arlington is leading one of 19 projects aimed at slashing the energy and water demands of data center infrastructure.
As part of the project, Accelsius, an Austin-based company specializing in data center cooling technology, donated a rack with servers for students to test their system. The project is close to being ready for commercial use, said Agonafer, the professor leading the student team.
Pundla, the doctoral candidate, said the technology is already drawing interest from data center companies. When the project is complete, the students plan to launch a start-up company to sell the technology, with the university where it was developed as a shareholder.
While there are no regulations in Texas requiring companies to adopt such systems, companies selling cooling system technology say data centers have a big incentive to bring down their power demand.
Texas law now requires large load consumers, including data centers, to consult with the state’s main grid operator and provide details on how much power they plan to use and whether a project is planned in areas where the grid is already facing supply constraints. During this process, data centers join a waitlist.
Finding ways to cut energy use is more cost-efficient, said Liz Cruz, chief marketing officer at Accelsius. She said the companies she works with prefer to use electricity for computing, not cooling.
“Data centers are huge energy and water usage hogs,” Cruz said. “So it is our responsibility as the builders of [cooling systems] to make the most sustainable decision.”
Disclosure: Google, Houston Advanced Research Center and the University of Texas - Arlington have been financial supporters of The Texas Tribune, a nonprofit, nonpartisan news organization that is funded in part by donations from members, foundations and corporate sponsors. Financial supporters play no role in the Tribune's journalism. Find a complete list of them here.
Jayme Lozano Carver contributed to this report.