
Photo by Andrew Caballero-Reynolds / AFP via Getty Images
When water rights are secure and transferable, new demands can be met without harming existing users.
Data centers are booming. As companies race to build the infrastructure behind artificial intelligence, communities across the country are sounding alarms about how much water these facilities consume. Some worry the massive complexes will strain water supplies that are already stretched thin.
The debate has quickly turned toward regulatory fixes, such as disclosure mandates, environmental reviews, and limits on how facilities operate. But the problem isn’t how much water data centers use—it’s how water gets allocated among competing demands. And as with any resource-allocation problem, the best answer isn’t mandates. It’s well-functioning markets.
Data centers use water in two main ways. First, they use it directly to cool servers through evaporative cooling systems or closed-loop recirculating systems. Second, they use water indirectly through the electricity they consume, since the power plants that generate that electricity often require water for cooling.
By some estimates, data centers account for roughly 0.2 percent of U.S. freshwater consumption, most of it indirectly through electricity generation. Compared with other uses, the numbers are relatively modest. Data centers in Arizona—home to major facilities from Google, Microsoft, and Meta—used roughly 905 million gallons of water in 2025. By comparison, Phoenix-area golf courses consume about 29 billion gallons annually, more than 30 times as much. Agriculture uses far more still: irrigating a single acre of crops can require hundreds of thousands of gallons of water each year.
Continue reading the entire piece here at Reason
______________________
Shawn Regan is a senior fellow at the Manhattan Institute.