As people around the world increasingly lean on computers to do their bidding — banking, exchanging messages or sharing too many cat photos — the data generated must live somewhere. That’s why data centers, vast warehouses of digital information, are increasingly cropping up in Texas and around the country.
But those centers, packed with powerful computers, suck huge amounts of energy from the power grid, costing tech companies millions of dollars in utility bills and expanding their carbon footprints.
Now, the University of Texas at Austin and the Japanese government are combining forces to tackle that problem.
University and Japanese officials will join Texas Secretary of State Carlos Cascos on Tuesday to announce a roughly $13 million project aiming to make data centers more energy efficient.
“This project is urgently needed,” university President Gregory Fenves said in a statement. “We are ever more dependent on data, and at the same time, ever more conscious of the need to utilize all sources of energy.”
The effort will be hosted by the university’s 14-year-old Texas Advanced Computing Center, which supports research projects across the sciences and is home to one of the most powerful supercomputers in the country. The project will give the center about $4 million in additional computing capability, and researchers will examine the efficiency of the equipment.
The Japanese government, through its New Energy and Industrial Technology Development Organization, will foot the bill for virtually the entire project, which involves installing a 250-kilowatt solar farm to power the new computers on sunny days. For the university, the payoff is obvious: more computing power. Japan, meanwhile, gets to study the technology to shave costs and energy use elsewhere, with the help of Texas researchers.
“Through this project, we hope to verify the energy efficiency of the new technology and to disseminate it in the U.S.,” Fumio Ueda, director of the Japanese agency, said in a statement.
In 2013, U.S. data centers sucked up some 91 billion kilowatt-hours of electricity, according to a report last year by the Natural Resources Defense Council and Anthesis, a global consulting firm. That’s enough to power all homes in New York City twice over for a year. At current rates, consumption will surge to 140 billion kilowatt-hours by 2020, the report said.
The data frenzy has spread to Texas and its occasionally stressed power grid. Last month, for instance, Facebook broke ground on a $1 billion data center in Fort Worth, the largest of several such centers in North Texas.
“If we’re going to build large-scale computers, we’re going to need more and more energy to do it,” Dan Stanzione, executive director of the university computing center, said in an interview. “We have to find sustainable ways to do that.”
The project is particularly timely, Stanzione added, because it comes just days after President Obama signed an executive order creating a “National Strategic Computing Initiative,” which partly emphasizes energy efficiency, and unveiled his “Clean Power Plan,” a controversial effort to shrink the nation’s carbon footprint by reshaping its energy sector.
The Texas-Japanese partnership will test a high-voltage direct current power system for computers, which typically run on alternating current. The technology is expected to boost efficiency by avoiding costly current conversions at the solar panels, the battery backup system and the computing racks.
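The efficiency case for direct current comes down to simple compounding arithmetic: each conversion stage wastes a slice of the power, and the losses multiply. A minimal sketch of that reasoning, using an assumed 95 percent per-stage efficiency purely for illustration (the project's actual figures are not public):

```python
# Illustrative sketch only: how conversion losses compound across a
# data center's power chain. The 95% per-stage efficiency is an
# assumption for illustration, not a number from the UT-Japan project.

def chain_efficiency(stage_efficiencies):
    """Overall efficiency of power passing through conversion stages in series."""
    total = 1.0
    for eff in stage_efficiencies:
        total *= eff
    return total

# Hypothetical AC distribution: conversions at the solar inverter,
# the battery backup system and the rack power supplies.
ac_chain = chain_efficiency([0.95, 0.95, 0.95])

# Hypothetical DC distribution: a single conversion stage remains.
dc_chain = chain_efficiency([0.95])

print(f"AC chain delivers {ac_chain:.1%} of generated power")  # 85.7%
print(f"DC chain delivers {dc_chain:.1%} of generated power")  # 95.0%
```

Under those assumed numbers, cutting two conversion stages recovers roughly nine percentage points of the generated power, which is the kind of margin Stanzione describes below.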
The new technology will power just a portion of the entire lab, which can require up to 100 megawatts of electricity. With new data centers cropping up across the country, finding a way to shave a data center’s energy use by just a few percentage points could make a huge difference, researchers say.
“Small changes in efficiency there have massive consequences in savings,” Stanzione said.
Disclosure: The University of Texas at Austin is a corporate sponsor of The Texas Tribune. A complete list of Tribune donors and sponsors can be viewed here.