🔴 AI Is Draining Water From Areas That Need It Most 🔴 We analyzed data on thousands of #AI #datacenters, and found that roughly two-thirds of those built or in development since 2022 are in places with high to extremely high levels of water stress. With terrific reporters Michelle Ma and Dina Bass ⭐ 🎁 : https://lnkd.in/exrEaSWU Each time you ask an AI #chatbot to write an email, it sends a request to a data center and strains an increasingly scarce resource: water. In the US, data centers are increasingly being built and planned in these dry areas. And this trend is unfolding globally: arid regions like Saudi Arabia and the United Arab Emirates are welcoming more data centers than ever before, while in China and India an even greater proportion of data centers sit in drier areas than in the US. Some of these sites are literal deserts. Globally, data centers consume about 560 billion liters of water annually, and that could rise to about 1,200 billion liters by 2030 as tech firms push for bigger facilities stocked with more advanced AI computing chips that run hot. Now tech companies are trying new solutions, including data center and chip designs that use less water. Some are placing hot chips directly on cold plates cooled by water, or submerging chips and servers entirely in liquid, a process known as immersion cooling. Businesses are also experimenting with synthetic liquids to cool data centers. But some coolants are being phased out of the market because they use so-called forever chemicals, which don't break down naturally and can persist in animals, people and the environment. As #SiliconValley mulls solutions, water advocates say tech companies need to be more transparent about the problem: almost no information about water usage at individual data centers is publicly available.
Jennifer Walker, director of the Texas Coast and Water Program at the National Wildlife Federation, also said state officials need more information for water planning. But when the Texas Water Development Board sent a water use survey to data centers, it received a lackluster response, she said. “We just had one of the hottest summers on record in Texas, and we've had several of those,” she said. “I’m concerned about any super water-intensive industry that is going to come into our state.” 🎁 Read for free here: https://lnkd.in/exrEaSWU
Environmental Concerns of AI Facilities
Summary
The rapid rise of artificial intelligence (AI) comes with significant environmental concerns, particularly regarding the water and energy demands of data centers that power AI technologies. These facilities use vast amounts of water for cooling and consume immense energy, often straining local resources and contributing to environmental challenges.
- Promote transparency in water use: Advocate for tech companies to publicly share detailed information on water consumption at their data centers to encourage accountability and sustainable practices.
- Explore alternative cooling methods: Support the development and adoption of innovative cooling solutions, such as immersion cooling or water-free systems, to reduce the strain on water resources.
- Prioritize sustainable site selection: Push for data centers to be planned in locations with sufficient water and renewable energy availability instead of overburdening drought-prone regions.
-
To leverage AI for sustainability, it is critical that this technology itself continues to improve (reduce!) its environmental impact. Today, I am happy to share that Google published a first-of-its-kind study on the lifetime emissions of Tensor Processing Units (TPUs), and outlined how they have become 3x more carbon-efficient over the last 4 years! (Blogpost here https://lnkd.in/dVnuzaaf). But what are TPUs? They're specialized hardware accelerators that help advance artificial intelligence (AI), and their efficiency shapes AI's environmental sustainability. This progress is due to more efficient hardware design, which means fewer carbon emissions for the same AI workload. Here are some of the highlights: 🟢 Operational electricity emissions make up more than 70% of a Google TPU's lifetime emissions. So, this 3x operational efficiency gain is extra important!! 🟢 Manufacturing emissions remain notable, and their share of the total will grow as operational emissions decrease with the use of carbon-free energy. 🟢 We've also significantly improved our AI model efficiency (i.e. the software, not just the hardware), reducing the number of computations required for a given performance. 🟢 This is key for our strategy to run on 24/7 carbon-free energy (CFE) on every grid where we operate by 2030. These findings highlight the importance of optimizing both hardware AND software for a sustainable AI future. It's also important to remember that AI has real implications for reducing emissions and fostering sustainability - e.g. AI can optimize energy consumption in buildings, improve traffic flow, and develop new materials for renewable energy technologies. On a personal level, as someone who pursued a master's in environmental management with a focus on industrial ecology, I'm particularly proud to see this kind of full lifecycle / LCA review of AI :) By taking a holistic view, we can identify and address the biggest contributors to AI's carbon footprint.
#Sustainability #AI #GoogleCloud #TPU #CarbonFootprint #TechForGood #Innovation #IndustrialEcology #LifecycleAssessment
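The interaction between the operational share and the efficiency gain can be sketched in a few lines. A rough illustration: the 70% operational share and the 3x gain come from the post; treating the remaining 30% as a fixed embodied (manufacturing) share is my simplifying assumption, not a figure from the study.

```python
# Rough sketch: how a 3x operational-efficiency gain interacts with the
# ~70% operational share of a TPU's lifetime emissions (per the post).
# Treating the other ~30% as a fixed embodied (manufacturing) share is a
# simplifying assumption, not a number from the study itself.

OPERATIONAL_SHARE = 0.70               # >70% of lifetime emissions (post)
EMBODIED_SHARE = 1 - OPERATIONAL_SHARE
EFFICIENCY_GAIN = 3.0                  # 3x more carbon-efficient operations

# Lifetime emissions for the same workload, relative to the old baseline
new_relative = OPERATIONAL_SHARE / EFFICIENCY_GAIN + EMBODIED_SHARE
print(f"Lifetime emissions fall to ~{new_relative:.0%} of baseline")

# The fixed embodied share now dominates, which is why manufacturing
# emissions grow as a *fraction* of the total as operations get cleaner
embodied_fraction = EMBODIED_SHARE / new_relative
print(f"Embodied share of the new total: ~{embodied_fraction:.0%}")
```

Under these assumptions, a 3x operational gain cuts lifetime emissions roughly in half, and the embodied share climbs past 50% of the new total, which matches the post's point about manufacturing emissions becoming relatively more important.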
-
Understanding the Ripple Effects of Our Digital Consumption: The Hidden Cost of HVAC Water in Technology In today’s digital age, our reliance on technology is more profound than ever. However, few of us stop to consider the environmental cost associated with our ever-growing digital demands—specifically, the substantial water usage by data centers, which house the servers running applications including AI-driven platforms like chatbots. Data centers are critical in ensuring the seamless operation of AI services, which power millions of virtual interactions daily. These facilities rely heavily on cooling technologies to manage the immense heat generated by continuous server operation. Among these technologies, cooling towers and HVAC systems are integral. Cooling towers help dissipate heat by evaporative cooling, which, while effective, consumes significant amounts of water. Moreover, the power generation required to keep these data centers operational further adds to water consumption. The combination of water for cooling and power generation paints a picture of substantial resource use. For instance, industry averages suggest that for every kilowatt-hour of energy consumed, approximately 2 gallons of water may be used, combining cooling and power generation. Let’s consider a simple analogy to put this into perspective—a 50-gallon rain barrel, like the one pictured, could be drained in just over a minute by the water demands of a data center handling around 10 million AI chats per day. This example illustrates not just the scale of water use but prompts us to think about sustainability in technology development. As professionals in the HVAC industry, we must push for innovations that not only meet our digital needs but also manage our environmental footprint responsibly. Exploring alternative cooling methods, improving energy efficiency, and investing in sustainable infrastructure are critical steps towards minimizing our ecological impact. 
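The rain-barrel analogy checks out with simple arithmetic. A minimal sketch: the ~2 gal/kWh water intensity and the 10 million chats/day come from the post; the ~3.6 Wh of electricity per chat is a commonly cited rough estimate that I am assuming here, not a measured figure.

```python
# Back-of-envelope check of the rain-barrel analogy in the post.
# Assumptions:
#   - ~3.6 Wh of electricity per AI chat (rough estimate, not official)
#   - ~2 gallons of water per kWh (cooling + power generation, per post)

WH_PER_CHAT = 3.6            # assumed energy per chat, watt-hours
GALLONS_PER_KWH = 2.0        # water intensity quoted in the post
CHATS_PER_DAY = 10_000_000   # data center handling 10M chats/day
BARREL_GALLONS = 50          # rain-barrel size from the post

kwh_per_day = CHATS_PER_DAY * WH_PER_CHAT / 1000     # energy per day
gallons_per_day = kwh_per_day * GALLONS_PER_KWH      # water per day
gallons_per_minute = gallons_per_day / (24 * 60)     # water per minute

seconds_to_drain = BARREL_GALLONS / gallons_per_minute * 60
print(f"Water use: {gallons_per_day:,.0f} gal/day "
      f"({gallons_per_minute:.0f} gal/min)")
print(f"A {BARREL_GALLONS}-gallon barrel drains in ~{seconds_to_drain:.0f} s")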
Let us all be part of the conversation and action towards a more sustainable future in technology. Your thoughts and insights on how we can achieve this are invaluable.
-
It seems that there is one issue of bipartisan agreement during this week’s Presidential transition: federal support for building more AI data centers. In his final days in office, former President Biden issued an executive order directing the Defense, Energy and Interior departments to explore leasing federal lands for the construction of AI data centers. “We will not let America be out-built when it comes to the technology that will define the future,” Biden said in a statement. And in his first days in office, President Trump appeared to endorse Biden’s move, saying, “I’d like to see federal lands opened up for data centers. I think they’re going to be very important.” This is a big win for the AI companies that have found that their new technology is so power-hungry that it is straining existing resources. But this massive building boom could be a huge strain on our power and water systems in the United States. At Proof News, we've been reporting on the soaring climate costs of AI data centers. The Biden Administration executive order required developers of data centers to also build “new, clean electricity generation resources.” However, the order said nothing about mitigating data centers’ massive water usage and didn’t forbid data centers from also using fossil fuels. It’s also not clear if Trump will enforce the clean energy component of the order. To understand the environmental impacts of the AI data center boom, we’ve produced a three-part video podcast series interviewing experts and journalists about the climate impacts of AI. In our first climate video, I interviewed Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, who is the leading expert on AI’s extraordinary water usage. Ren said the collective water usage of all AI companies already exceeds the water usage of the biggest beverage companies in the world. (Video and transcript: https://lnkd.in/d7tdPfRQ). In our second climate video, I interviewed Dr. 
Sasha Luccioni, research scientist and climate lead at Hugging Face, about her groundbreaking research on generative AI’s electricity usage and carbon emissions. Luccioni said that general purpose AIs, such as chatbots, use 20 to 30 times more energy than task-specific AI models. (Video and transcript: https://lnkd.in/dqxFBU2A ). And in the third climate video, I interviewed the reporter on this series, Aaron Gordon about how little we know about the true climate costs of AI because of companies’ lack of transparency. (Video and transcript: https://lnkd.in/dJNrfVzA).
-
Research continues to show the high environmental cost of GenAI tool development and deployment. We’ve created this classroom guide to help educators get a better understanding and engage their students in thoughtful discussions on the potential impacts of GenAI on the planet. Researchers estimate that creating ChatGPT used 1,287 megawatt hours of electricity and produced the carbon emissions equivalent of 123 gas-powered vehicles driven for one year. It's development created substantial heat that required a significant amount of water to cool down those data centers – and for every 5-50 prompts it requires about 16oz of water. Generating an image can be especially energy-intensive, similar to fully charging your smartphone. Creating 1,000 images with Stable Diffusion is responsible for as much CO2 as driving 4.1 miles in a gas-powered car. Some researchers estimate the carbon footprint of an AI prompt to be 4-5 times that of a normal search query. And the impact of escalating use predicted by 2027 could mean AI servers will use as much electricity as a small country. Check out the carousel for more including discussion questions and further reading. Or download a PDF version for your classroom here: https://lnkd.in/eaCtnN3n AI for Education #aiforeducation #aieducation #AI #GenAI #ChatGPT #environment #sustainability
-
Demand for U.S. power is projected to grow rapidly for the first time in decades, driven by the power needs of Artificial Intelligence. AI is having a huge impact on many sectors of the economy, and already we are seeing significant growth in data center development based on expected power needs associated with AI. This power crisis likely will cause major challenges for utilities and those focused on eliminating the use of fossil fuels. Tech firms at the forefront of the AI revolution are starting to think about their power needs and making investments into non-fossil fuel sources, including nuclear. Alphabet Inc., Amazon, Meta, and Microsoft recently made announcements about bringing back decommissioned and/or building new nuclear facilities. Despite concerns about safety and waste disposal, supporters of nuclear power will be happy by the technology sector’s backing of this power source, including small modular reactors. However, a bigger concern was the recent announcement by FirstEnergy, an Akron, Ohio-based utility, that it plans to continue operating its Fort Martin and Harrison coal-fired plants in West Virginia, having previously announced it would close these facilities by 2030. It is likely that FirstEnergy will not be the only utility to break its pledge to decommission coal power plant in order to ensure it has enough power for future AI needs. This decision to extend the use of coal power plants is deeply troubling, particularly after witnessing the environmental impact from Germany’s decision to close its nuclear reactors following the Fukushima nuclear disaster. Short on power, Germany turned to lignite, the dirtiest type of coal, and in less than 5 years, Germany reversed its decades long trend of lower emissions, as CO2 emissions started rising again. The German experience is something that we don’t want to emulate in the U.S. 
We should not stand in the way of progress, and AI has the ability to have a positive impact on many industries and our daily lives. Nevertheless, we must be careful to ensure we are not moving backward on our environmental commitments. Increasing power needs from data centers and electric vehicles can, and should, be met by rapid growth in renewable energy. With renewable energy and energy storage prices continuing to decline, the U.S. should be able to meet its power needs without hurting the environment. https://lnkd.in/gHNaKYsr EcoTech Capital Cy Obert #cleantech; #climatetech; #energytransition; #sustainability
-
Weekly Monday Sustainability Post: A ChatGPT-powered search consumes almost 10 times the amount of electricity as a search on Google, according to the International Energy Agency (IEA). Recent The Washington Post and GreenBiz Group articles have explored the ramifications of this stunning increase in energy required for AI search (links in the comments to both). There are more than 2,700 data centers nationwide. They are run by obscure companies that rent out computing power to large tech firms, many of which have said they would erase their emissions entirely as soon as 2030. Goldman Sachs found that data centers will account for 8% of total electricity use in the United States by 2030, a near tripling of their share today. No wonder Google reported a 13% rise in greenhouse gas emissions for 2023 driven by the energy appetite of artificial intelligence and scarce availability of renewable energy in Asia and certain U.S. regions. Zooming in, one large data center complex in Iowa owned by Meta burns the annual equivalent amount of power as 7 million laptops running eight hours every day, based on data shared publicly by the company. Meta is also building a $1.5 billion data center campus outside Salt Lake City that will consume as much power as can be generated by a large nuclear reactor. In Omaha, where Google and Meta recently set up sprawling data center operations, a coal plant that was supposed to go offline in 2022 will now be operational through at least 2026. Part of the AI arms race is locking in enough power to fuel your AI activities. Coal plants are being reinvigorated to meet the energy demanded from AI activities, but Big Tech is hoping it can meet this demand with experimental clean energy projects that have long odds of successful scale anytime soon. These include fusion, small nuclear reactors, and geothermal energy. These companies are buying large renewable energy contracts. 
However, the data centers operate on the same grids as everyone else and regulatory filings show that utilities are backfilling these renewable energy purchases with fossil fuel expansions. On the plus side, AI can help identify solutions to speed up the carbon transition to act on climate. For instance, a Google and Boston Consulting Group (BCG) analysis found that AI has the potential to mitigate 5-10% of global greenhouse gas emissions, In the meantime, I'd say we don't need every search on Google and other platforms to by default be an AI search. #sustainability #datacenters #energy #emissions