Major concerns over water use by big tech and semiconductor makers
The explosive growth of AI and quantum computing is driving up energy and water consumption.
Published on March 4, 2025

The rapid growth of the semiconductor industry and of data centers for AI threatens to put a heavy strain on water resources worldwide. Producing advanced chips requires large amounts of ultra-pure water, and cooling systems add further demand. While a single factory can consume millions of gallons daily, comparable to a city of 60,000 people, the tech industry is improving its water recovery and recycling processes.
With the rise of AI, the pressure on water resources is growing even further. AI data centers can use up to 19 million gallons per day, comparable to the needs of a city of 50,000 people. Climate change exacerbates the problem: by 2030, nearly half of new facilities are expected to be located in areas with exceptionally high water risk. This combination of factors threatens not only local environments but also the global semiconductor supply chain.
The impact of chip manufacturing on water supply
Semiconductor manufacturing requires a complex six-step process that consumes enormous amounts of ultra-pure water. Every 1,000 gallons of ultra-pure water requires 1,400 to 1,600 gallons of municipal water. A large semiconductor plant producing 40,000 wafers per month consumes 4.8 million gallons of water daily, equivalent to the daily water use of 60,000 residents. TSMC, the world's largest chip manufacturer, is building a new facility in Phoenix, Arizona, where it promises to reuse 65% of the water it consumes. This illustrates both the growing focus on water conservation and the enormous scale of the problem: semiconductor factories collectively consume as much water as Hong Kong, a city of 7.5 million people.
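To make these figures easier to compare, the short back-of-the-envelope calculation below relates the numbers quoted above to one another. It uses only the article's figures; the per-wafer estimate assumes roughly 30 production days per month, which is our assumption rather than a reported value.

```python
# Back-of-the-envelope check relating the figures quoted above.
PLANT_DAILY_GALLONS = 4_800_000          # large fab, gallons of water per day (from the article)
WAFERS_PER_MONTH = 40_000                # stated production volume
EQUIVALENT_RESIDENTS = 60_000            # city-size comparison used in the article
MUNICIPAL_PER_1000_UPW = (1_400, 1_600)  # municipal gallons per 1,000 gallons of ultra-pure water

# Implied per-person daily use behind the 60,000-resident comparison
per_person_per_day = PLANT_DAILY_GALLONS / EQUIVALENT_RESIDENTS
print(f"~{per_person_per_day:.0f} gallons per person per day")              # ~80 gallons

# Implied water use per wafer (assumes ~30 production days per month)
per_wafer = PLANT_DAILY_GALLONS * 30 / WAFERS_PER_MONTH
print(f"~{per_wafer:,.0f} gallons per wafer")                               # ~3,600 gallons

# Municipal water drawn for every gallon of ultra-pure water delivered
low, high = (x / 1_000 for x in MUNICIPAL_PER_1000_UPW)
print(f"{low:.1f}-{high:.1f} gallons of municipal water per gallon of UPW")  # 1.4-1.6
```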

Data centers exacerbate water crisis
The explosive growth of AI technology is creating a new water crisis. Modern data centers consume between 11 and 19 million gallons of water per day for cooling, comparable to a city of 30,000 to 50,000 people. Microsoft's global water consumption increased by 34% during the development of its first AI tools, and a data center cluster in Iowa used 6% of the district's water while GPT-4 was being trained. In the UK, these developments are causing acute concern: the first AI growth zones are being planned near new reservoirs, which calls the sustainability of that strategy into question. Experts warn that without intervention, the development and use of AI could cause irreparable environmental damage.
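The quoted ranges can be related in the same rough way. The sketch below simply pairs the low and high ends of the consumption and population ranges (an assumption about how the comparison is intended) to show the implied per-person figure.

```python
# Implied per-person water use behind the data-center comparison above.
DAILY_GALLONS = (11_000_000, 19_000_000)  # data-center cooling water, gallons per day (from the article)
CITY_POPULATION = (30_000, 50_000)        # equivalent city sizes quoted alongside those figures

# Pairing low with low and high with high is an assumption about how the ranges correspond.
for gallons, people in zip(DAILY_GALLONS, CITY_POPULATION):
    print(f"{gallons:,} gal/day ~ city of {people:,} -> {gallons / people:.0f} gal per person per day")
```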
Microsoft is testing underwater data centers through Project Natick. These data centers, placed on the seafloor, benefit from natural cooling and suffer fewer outages, making them energy efficient, sustainable, and reliable.
Future risks and climate change
The situation is made even more worrisome by climate change. According to recent research, by 2030 and 2040, 40% of existing semiconductor facilities will be located in areas with high or extremely high water risk. Of the new facilities announced since 2021, 40 to 49% will be in high-risk areas. This development threatens local water resources and creates systemic risks for the global semiconductor industry. In England alone, a shortage of five billion liters of water per day is expected by 2050. Water companies such as Thames Water are warning that data centers may face usage restrictions during heat waves.