How AI Models Affect Climate Change

Explore top LinkedIn content from expert professionals.

Summary

Artificial intelligence (AI) models are revolutionizing industries by optimizing processes and decision-making, but their growing energy consumption and environmental impact are raising concerns about their contribution to climate change. While AI shows potential to address sustainability challenges, such as optimizing energy use and reducing waste, the energy-intensive nature of AI models and data centers highlights the urgent need for more sustainable AI practices.

  • Rethink energy priorities: Focus on developing energy-efficient AI models that prioritize computational and energy efficiency during training and deployment.
  • Adopt sustainable infrastructure: Transition to renewable energy-powered data centers and employ innovative technologies like waste heat reuse and improved cooling systems.
  • Collaborate for solutions: Promote partnerships between industries, governments, and researchers to establish standards and share strategies for minimizing AI's environmental footprint.
Summarized by AI based on LinkedIn member posts
  • View profile for Anna Lerner Nesbitt

    CEO @ Climate Collective | Climate Tech Leader | fm. Meta, World Bank Group, Global Environment Facility | Advisor, Board member

    60,343 followers

    AI models are increasingly handling coding tasks. Like many, I assumed this would naturally lead to more energy-efficient code, with AI optimizing and avoiding anti-patterns. But new research reveals a paradox: AI-generated code often consumes significantly more energy than human-written code. A study on LeetCode problems found AI solutions consistently used more energy, with the gap widening for harder challenges – sometimes up to 8.2x the energy of human code. Why is this a major climate problem, especially as we rely on AI for sustainability?

    The paradox of AI efficiency: We expect AI to optimize, but its current focus seems to be functional correctness and generation speed, not deep energy efficiency. This means AI code can be functionally sound but computationally heavy.

    A scaled problem: Every line of code, whether on a local machine or in a vast data center, requires electricity. If AI is generating code that's dramatically less efficient, the cumulative energy demand skyrockets as AI coding becomes ubiquitous.

    The bottom line: Inefficient code demands more processing power, longer run times, and higher energy consumption in data centers. These centers already consumed around 1.5% of the world's electricity (415 TWh) in 2024, and that demand is projected to grow four times faster than total electricity consumption. Inefficient AI code directly exacerbates this growth, potentially undermining any 'climate gains' from AI tooling.

    I genuinely believe AI can advance our sustainability targets faster, more cost-efficiently, and with better precision. However, if its outputs are inherently energy-intensive, it creates a self-defeating loop: we're increasing our carbon footprint through the very tools meant to accelerate efficiency. Going forward, we must integrate energy efficiency as a core metric in training and evaluating AI coding models, prioritizing lean, optimized code.
Kudos to pioneers like Hugging Face and Salesforce, with their energy-index for AI models, and Orange for championing Frugal AI. And big thanks to the research team for looking beyond the hype: Md Arman Islam, Devi Varaprasad J., Ritika Rekhi, Pratik Pokharel, Sai Siddharth Cilamkoti, Asif Imran, Tevfik Kosar, Bekir Oguzhan Turkkan. [Post 1/2 on a reality check for AI's effectiveness and efficiency]
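    A quick sanity check shows the post's two data-center figures are mutually consistent. This is a back-of-envelope sketch: the comparison to global electricity consumption is an outside assumption, not a figure from the post.

    ```python
    # Back-of-envelope check: do the post's data-center figures hang together?
    dc_twh_2024 = 415    # data-center electricity use in 2024, TWh (cited in the post)
    dc_share = 0.015     # ~1.5% of world electricity (cited in the post)

    # World electricity use implied by the two figures together:
    implied_world_twh = dc_twh_2024 / dc_share
    print(f"Implied world electricity use: ~{implied_world_twh:,.0f} TWh")
    ```

    The implied total (~27,700 TWh) is in line with commonly cited estimates of global electricity consumption, so the 1.5% and 415 TWh figures agree.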

  • To tackle climate change, we need big solutions, and we need them fast. Oftentimes, these solutions come in surprising packages. One of my favorite examples is our work on contrails – you know contrails, those little fluffy white clouds behind planes? Surprisingly, according to the IPCC, they make up roughly 35% of aviation’s global warming impact. The great news is that contrails are relatively easy to avoid because they only form in cold and humid regions of the sky – so pilots can adjust their altitude to avoid them, just like they do for turbulence. Since Google announced our work with American Airlines and Breakthrough Energy, where we used AI to help American Airlines pilots reduce contrails by over half, it’s been amazing to see how the sustainability conversation in the aviation industry has shifted, thanks in large part to the dedication of Jill Blickstein, Dinesh Sanekommu, and Marc Shapiro. Contrail avoidance is now recognized in the aviation industry as another (much nearer-term!) solution alongside needed innovations in electric planes, hydrogen planes, and biofuels. Our team recently released a paper with more details on our work with American Airlines. Here are the key takeaways:

    𝟭. 𝗦𝗶𝗴𝗻𝗶𝗳𝗶𝗰𝗮𝗻𝘁 𝗥𝗲𝗱𝘂𝗰𝘁𝗶𝗼𝗻 (seen in satellite imagery!): Flights that adjusted their routes based on our AI-based contrail predictions showed a 54% reduction in contrail kilometers compared in satellite imagery with control flights that didn’t have access to AI predictions.

    𝟮. 𝗣𝗶𝗹𝗼𝘁-𝗹𝗲𝗱 𝗔𝗱𝗷𝘂𝘀𝘁𝗺𝗲𝗻𝘁𝘀: Pilots made relatively small adjustments to their ascent or descent profiles to avoid contrail-forming regions, demonstrating a practical approach that integrates into existing flight operations. One of my favorite memories is that after flying American’s first flight to avoid contrails, Captain John P. Dudley remarked that it was easy to avoid them, our predictions looked right based on all the contrails he saw in the sky, and best of all - he even came up with a new approach to contrail avoidance that we informally named after him 😊

    𝟯. 𝗦𝗺𝗮𝗹𝗹 𝗙𝘂𝗲𝗹 𝗧𝗿𝗮𝗱𝗲𝗼𝗳𝗳: The study found a slight increase in fuel consumption per adjusted flight (around 2%). The great news is that only a small fraction of flights create contrails, so this likely translates to about 0.3% additional fuel across an airline’s fleet.

    𝟰. 𝗖𝗼𝗺𝗯𝗶𝗻𝗶𝗻𝗴 𝗔𝗜 𝗮𝗻𝗱 𝗣𝗵𝘆𝘀𝗶𝗰𝘀: The approach we used to predict contrail formation combined AI from Google with physics-based simulation (thank you, Breakthrough Energy!). Link to paper: https://lnkd.in/gxKHXCps

    What excites me most about this research is its ability to scale near-term. We still have important research to do, and we’ll share more about that in coming months - but compared to other climate solutions, contrail avoidance has the ability to scale in a matter of years, not decades. We need more solutions like this to meet the climate challenge.
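    The fuel-tradeoff figures in takeaway 3 can be checked with one line of arithmetic. This is a sketch using only the post's numbers; the implied share of adjusted flights is inferred here, not stated in the paper.

    ```python
    # If each adjusted flight burns ~2% more fuel but the fleet-wide penalty
    # is only ~0.3%, the implied share of flights needing adjustment is small.
    per_flight_penalty = 0.02   # ~2% extra fuel per adjusted flight
    fleet_penalty = 0.003       # ~0.3% extra fuel across the whole fleet

    adjusted_share = fleet_penalty / per_flight_penalty
    print(f"Implied share of adjusted flights: {adjusted_share:.0%}")  # → 15%
    ```

    In other words, the two figures are consistent if roughly one flight in seven needs a contrail-avoiding adjustment, which matches the claim that only a small fraction of flights form contrails.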

  • View profile for Obinna Isiadinso

    Global Sector Lead for Data Center Investments at IFC – Follow me for weekly insights on global data center and AI infrastructure investing

    21,136 followers

    The most climate-aligned AI infrastructure isn’t in #SiliconValley. It’s in cold, quiet towns across the #Nordics... where waste heat from data centers now warms 100,000+ homes. This isn’t a theory. It’s already happening:
    - #Finland: Microsoft’s data center will heat 40% of the city
    - #Norway: STACK is warming 5,000 homes with server exhaust
    - #Sweden: 30,000 apartments heated by Open District Heating
    How does it work?
    - Liquid cooling captures 90–95% of server heat
    - Heat pumps boost temperatures to 115°C
    - Heat flows into citywide district heating networks
    Why the Nordics lead:
    1. Cold climate = natural cooling
    2. Cheap, clean energy = fewer emissions
    3. 70+ year-old district heating systems = instant circularity
    This model isn’t just about sustainability. It’s about resilience, energy security, and infrastructure ROI. And it’s the future of #AI infrastructure, especially as power demand from data centers is set to double by 2030. The Nordics aren’t just storing data. They’re designing systems where compute powers communities. #datacenters

  • View profile for Scott Breen

    Association Executive / Sustainability and Circular Economy Expert / Environmental Lawyer / Project Manager / Policy Analyst

    13,285 followers

    Weekly Monday Sustainability Post: A ChatGPT-powered search consumes almost 10 times as much electricity as a Google search, according to the International Energy Agency (IEA). Recent articles from The Washington Post and GreenBiz Group have explored the ramifications of this stunning increase in the energy required for AI search (links to both in the comments).

    There are more than 2,700 data centers nationwide. They are run by obscure companies that rent out computing power to large tech firms, many of which have said they would erase their emissions entirely as soon as 2030. Goldman Sachs found that data centers will account for 8% of total electricity use in the United States by 2030, a near tripling of their share today. No wonder Google reported a 13% rise in greenhouse gas emissions for 2023, driven by the energy appetite of artificial intelligence and the scarce availability of renewable energy in Asia and certain U.S. regions.

    Zooming in, one large data center complex in Iowa owned by Meta uses as much power annually as 7 million laptops running eight hours every day, based on data shared publicly by the company. Meta is also building a $1.5 billion data center campus outside Salt Lake City that will consume as much power as a large nuclear reactor can generate. In Omaha, where Google and Meta recently set up sprawling data center operations, a coal plant that was supposed to go offline in 2022 will now be operational through at least 2026.

    Part of the AI arms race is locking in enough power to fuel your AI activities. Coal plants are being reinvigorated to meet the energy demanded by AI, but Big Tech is hoping it can meet this demand with experimental clean energy projects that have long odds of scaling successfully anytime soon: fusion, small nuclear reactors, and geothermal energy. These companies are buying large renewable energy contracts. However, the data centers operate on the same grids as everyone else, and regulatory filings show that utilities are backfilling these renewable energy purchases with fossil fuel expansions.

    On the plus side, AI can help identify solutions to speed up the carbon transition and act on climate. For instance, a Google and Boston Consulting Group (BCG) analysis found that AI has the potential to mitigate 5-10% of global greenhouse gas emissions. In the meantime, I'd say we don't need every search on Google and other platforms to be an AI search by default. #sustainability #datacenters #energy #emissions
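    The Goldman Sachs projection quoted above also tells us where things stand today. A short sketch, using only the post's figures; the current share is backed out from the projection, not stated directly.

    ```python
    # "8% of U.S. electricity by 2030, a near tripling of their share today"
    # implies roughly where data centers sit right now.
    share_2030 = 0.08      # projected data-center share of U.S. electricity
    tripling_factor = 3    # "a near tripling of their share today"

    implied_today = share_2030 / tripling_factor
    print(f"Implied current share of U.S. electricity: ~{implied_today:.1%}")  # ~2.7%
    ```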

  • View profile for Daniela V. Fernandez

    Founder & Managing Partner of VELAMAR | Financing the future by making the ocean investable | Forbes 30 Under 30 | Founder of Sustainable Ocean Alliance

    44,694 followers

    Can you believe it has already been a year since #ChatGPT launched? Since its emergence, #artificialintelligence has captured global dialogue, from its potential #workforce impact to implications for education and art. But we’re missing a critical angle: AI’s #carbonfootprint.

    Examining ChatGPT’s usage can help us gain insight into its environmental impact. As of February 2024, the platform’s 100 million+ weekly active users are each posing an average of 10 queries a week… That’s ONE BILLION queries per week, each generating 4.32g of CO2. By plugging these estimates into an emissions calculator, I found that EVERY WEEK the platform is producing emissions roughly equivalent to 10,800 roundtrip flights between San Francisco and New York City (enough to melt 523,000 square feet of Arctic sea ice). Scientists have already warned the Arctic could be free of sea ice in summer as soon as the 2030s. And something tells me they weren’t factoring ChatGPT and other energy-demanding AI models into those projections. Further, this is based on estimated *current* ChatGPT use, which will only grow as society gets accustomed to the tool and as AI becomes more a part of everyday life. Some analyses indicate that by 2027, ChatGPT’s electricity consumption could rival that of entire nations like Sweden, Argentina, or the Netherlands.

    The platform is taking precautions, however, such as using Microsoft’s carbon-neutral #Azure cloud system and working to develop more #energyefficient chips—so it could certainly be worse. But it could also be better. So let’s hold OpenAI accountable for mitigating their damage before it gets out of control. Join me in letting them know the public is watching their environmental impact and that they must responsibly manage the platform’s rapidly growing carbon footprint. (Pictured: Microsoft GPU server network to power OpenAI’s supercomputer language model. Image courtesy of Microsoft/Bloomberg.)
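    The post's weekly figures can be reproduced directly. This sketch uses only the post's own estimates; the per-flight footprint at the end is implied by the 10,800-flight comparison rather than stated.

    ```python
    # Reproduce the post's weekly-emissions arithmetic from its stated inputs.
    weekly_users = 100_000_000     # 100M+ weekly active users
    queries_per_user = 10          # average queries per user per week
    g_co2_per_query = 4.32         # grams of CO2 per query (post's estimate)

    weekly_queries = weekly_users * queries_per_user
    weekly_tonnes = weekly_queries * g_co2_per_query / 1e6   # grams -> tonnes
    print(f"{weekly_queries:,} queries/week ≈ {weekly_tonnes:,.0f} t CO2")

    # Implied footprint per SF-NYC roundtrip, given the 10,800-flight comparison:
    per_flight_kg = weekly_tonnes * 1000 / 10_800
    print(f"≈ {per_flight_kg:.0f} kg CO2 per roundtrip flight")
    ```

    That works out to roughly 4,320 tonnes of CO2 per week, or about 400 kg per roundtrip flight in the comparison, which is in the ballpark of a single passenger's share of such a flight.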

  • View profile for Julia Angwin

    Founder/CEO Proof News, New York Times Contributing Opinion Writer

    2,869 followers

    It seems that there is one issue of bipartisan agreement during this week’s Presidential transition: federal support for building more AI data centers. In his final days in office, former President Biden issued an executive order directing the Defense, Energy, and Interior departments to explore leasing federal lands for the construction of AI data centers. “We will not let America be out-built when it comes to the technology that will define the future,” Biden said in a statement. And in his first days in office, President Trump appeared to endorse Biden’s move, saying, “I’d like to see federal lands opened up for data centers. I think they’re going to be very important.” This is a big win for the AI companies, which have found that their new technology is so power-hungry that it is straining existing resources. But this massive building boom could be a huge strain on power and water systems in the United States.

    At Proof News, we've been reporting on the soaring climate costs of AI data centers. The Biden administration’s executive order required developers of data centers to also build “new, clean electricity generation resources.” However, the order said nothing about mitigating data centers’ massive water usage and didn’t forbid data centers from also using fossil fuels. It’s also not clear if Trump will enforce the clean energy component of the order.

    To understand the environmental impacts of the AI data center boom, we’ve produced a three-part video podcast series interviewing experts and journalists about the climate impacts of AI. In our first climate video, I interviewed Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and a leading expert on AI’s extraordinary water usage. Ren said the collective water usage of all AI companies already exceeds that of the biggest beverage companies in the world. (Video and transcript: https://lnkd.in/d7tdPfRQ). In our second climate video, I interviewed Dr. Sasha Luccioni, research scientist and climate lead at Hugging Face, about her groundbreaking research on generative AI’s electricity usage and carbon emissions. Luccioni said that general-purpose AIs, such as chatbots, use 20 to 30 times more energy than task-specific AI models. (Video and transcript: https://lnkd.in/dqxFBU2A). And in the third climate video, I interviewed Aaron Gordon, the reporter on this series, about how little we know about the true climate costs of AI because of companies’ lack of transparency. (Video and transcript: https://lnkd.in/dJNrfVzA).

  • View profile for Heather Clancy
    20,908 followers

    DeepSeek, which is building technologies that would compete with OpenAI’s ChatGPT and other AI programs, claims to achieve the goals of larger AI companies with less money and far less energy. Specifically, DeepSeek engineers said they are using data analysis approaches that require far fewer chips than their rivals’. That’s a cost advantage. What’s more, its open source models also require less computing power — and thus less energy overall. That would be welcome news for energy demand. After years of economic and productivity growth on flat energy supplies, future demand, driven largely by the data centers serving the AI revolution, is expected to triple between 2023 and 2028, reaching 132 gigawatts and eating up 12 percent of U.S. electricity supply, according to Lawrence Berkeley National Laboratory. Many energy companies — particularly nuclear and natural gas providers — are counting on this increase. The prediction has also been good for renewable energy developers: their power purchase agreements with Amazon, Google, Meta, and Microsoft have been great for business. What if we don’t need as many new data centers as we thought? What’s bad for power providers is good for the climate. The DeepSeek breakthrough offers clear evidence that companies investing in AI services can throttle energy consumption by using smarter coding practices, such as limiting the size of the data sets used to train their AI algorithms. Here’s my analysis of how corporate sustainability teams can be more proactive in managing the impact: https://lnkd.in/ebbkvpqi Stay tuned for more exploration of this topic throughout 2025.

  • View profile for Shashank Garg

    Co-founder and CEO at Infocepts

    15,750 followers

    Earlier today, I had a really insightful chat with one of our younger team members. He was pretty concerned that we're not pushing AI hard enough to tackle the global climate crisis. The casual coffee conversation made me reflect on the AI sustainability paradox. As business leaders, we often see AI as an innovation powerhouse—optimizing operations, reducing waste, and driving smarter resource management. But let's be clear: AI isn't a silver bullet. It comes with its own challenges, particularly energy consumption and ROI justification. With 2023 recording the hottest temperatures on record, the climate crisis demands immediate action. The real question isn't whether AI can help—it's how we deploy it effectively without undermining sustainability itself.

    At its core, AI is a system optimizer, helping businesses uncover inefficiencies and make data-driven decisions that drive sustainability. Whether it's AI-driven material discovery that identifies sustainable alternatives faster than traditional R&D, or precision agriculture where AI optimizes water, fertilizer, and pesticide use - AI is truly a sustainability accelerator.

    Here's the catch, though—AI is energy-hungry, or so it seemed until DeepSeek rattled the world. Remember, the same AI models that optimize supply chains also require massive computing power! Data centers are not emission-free zones.

    It's the classic ROI dilemma: Would you invest in a machine that consumes 30% more energy if it improves efficiency by 45%? The same logic applies to AI—the key question is whether its sustainability benefits outweigh its energy costs.

    Here are my two (read three) cents…
    1. Optimize AI's energy use: Invest in energy-efficient data centers and cloud solutions to reduce AI's footprint.
    2. Use AI to reduce carbon emissions: AI can monitor emissions, optimize renewable energy storage, and automate energy management - helping us reduce the carbon impact!
    3. Foster cross-industry collaboration: Governments, businesses, and research institutions need data-sharing initiatives to reduce the overall impact and drive sustainable AI practices.

    So what do you think - AI & sustainability: a powerful duo or a double-edged sword? Would love to hear from you. #Sustainability #ArtificialIntelligence #SustainableAI
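    One way to make the ROI dilemma above concrete is to compare energy per unit of output. This framing is an interpretation of the post's question, not something it states; the numbers are the post's hypothetical figures.

    ```python
    # A machine that uses 30% more energy but delivers 45% more output
    # still uses *less* energy per unit of output.
    energy_factor = 1.30   # consumes 30% more energy
    output_factor = 1.45   # improves efficiency (output) by 45%

    energy_per_unit = energy_factor / output_factor
    print(f"Energy per unit of output: {energy_per_unit:.2f}x baseline")
    # A value below 1.0 means the tradeoff nets out positive.
    ```

    Under this reading, the hypothetical machine uses about 10% less energy per unit of output, which is the sense in which sustainability benefits can outweigh energy costs.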

  • View profile for Andreas Welsch

    Top 10 Agentic AI Advisor | Author: “AI Leadership Handbook” | LinkedIn Learning Instructor | Thought Leader | Keynote Speaker

    33,233 followers

    𝗧𝗵𝗲 𝗱𝗲𝗯𝗮𝘁𝗲 𝗮𝗯𝗼𝘂𝘁 𝗔𝗜 𝗮𝗻𝗱 𝘀𝘂𝘀𝘁𝗮𝗶𝗻𝗮𝗯𝗶𝗹𝗶𝘁𝘆 𝗶𝘀 𝗼𝗻𝗲-𝘀𝗶𝗱𝗲𝗱. Let's fix this... Yes, generative AI consumes more energy than previous generations of AI. But that framing misses an important part of the equation, because AI can also drive significant sustainability benefits by optimizing energy use and reducing carbon footprints.

    AI is pivotal in optimizing processes to use less energy while maintaining or improving outcomes. For instance, AI models can optimize industrial processes, heating and cooling systems, and data center operations, significantly reducing energy waste and emissions. Take the example of data centers: they contribute roughly 0.2% of the world’s energy consumption, and approximately 10% of that is attributed to AI workloads (i.e., 0.02% of global energy consumption). Even if the share consumed by AI workloads increases to 0.04% (or even all the way up to 0.5% by 2027), AI could be used to drive efficiency in the remaining 99.5-99.96% of global energy consumption and thereby deliver a significantly greater impact than the power consumption required to achieve that result.

    AI also helps optimize the demand side of energy management. AI can help balance energy supply and demand more efficiently by forecasting energy production, consumption, and power quality. This optimization can reduce the need for carbon-heavy energy sources during peak demand times, further supporting the shift towards renewable energy.

    𝗜𝘀 𝘁𝗵𝗲 𝗶𝗻𝗰𝗿𝗲𝗮𝘀𝗲 𝗶𝗻 𝗲𝗻𝗲𝗿𝗴𝘆 𝗰𝗼𝗻𝘀𝘂𝗺𝗽𝘁𝗶𝗼𝗻 𝗷𝘂𝘀𝘁𝗶𝗳𝗶𝗲𝗱? 𝗔𝗻𝗱 𝘄𝗶𝗹𝗹 𝘁𝗵𝗲 𝗽𝗼𝘁𝗲𝗻𝘁𝗶𝗮𝗹 𝗺𝗮𝘁𝗲𝗿𝗶𝗮𝗹𝗶𝘇𝗲? 𝘙𝘦𝘢𝘥 𝘵𝘩𝘦 𝘧𝘶𝘭𝘭 𝘢𝘳𝘵𝘪𝘤𝘭𝘦 𝘷𝘪𝘢 𝘵𝘩𝘦 𝘭𝘪𝘯𝘬 𝘣𝘦𝘭𝘰𝘸. #ArtificialIntelligence #GenerativeAI #Sustainability #IntelligenceBriefing
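    The proportion argument above reduces to a few lines of arithmetic. All inputs are the post's estimates, not measured values.

    ```python
    # Sketch of the post's proportion argument about AI's energy share.
    world_share_datacenters = 0.002   # data centers ≈ 0.2% of world energy use
    ai_fraction_of_dc = 0.10          # ~10% of data-center load is AI workloads

    ai_share_of_world = world_share_datacenters * ai_fraction_of_dc
    print(f"AI workloads ≈ {ai_share_of_world:.2%} of world energy use")  # 0.02%

    # Even the pessimistic 2027 scenario leaves almost everything else untouched:
    ai_share_2027 = 0.005             # up to 0.5% of world energy by 2027
    addressable = 1 - ai_share_2027
    print(f"Consumption AI could still help optimize: {addressable:.1%}")  # 99.5%
    ```

    The point of the sketch: even in the worst-case scenario, AI's own footprint stays below 0.5% of world energy use, while the efficiency gains it enables apply to the other 99.5%.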

  • View profile for Jason Gulya

    Exploring the Connections Between GenAI, Alternative Assessment, and Process-Minded Teaching | Professor of English and Communications at Berkeley College | Keynote Speaker | Mentor for AAC&U’s AI Institute

    39,278 followers

    Imagine dumping out a bottle of water every time you sent an email with AI. (We need to talk about this.) Recently, The Washington Post published a story that should give us all pause. Companies like OpenAI have actively hidden how much energy and water their programs use, so researchers have had to work around them, using whatever data they could find to figure it out. This is the figure that stands out: **every time we send an email with an AI chatbot, it's the equivalent of spilling a 16oz bottle of water.** That is way more than writing an email ourselves. That is an inconvenient fact for anyone using AI regularly (including myself). And what's worse, the energy and water consumption is going in the wrong direction. OpenAI's most recent model (thanks Richard Self) is taking up way more water and energy.

    ----------------------

    For me, there appear to be 2 likely scenarios.
    1️⃣ Things get worse and worse. ► Future models take up more and more energy and water. ► Maybe we just abandon them. ► Or we just accelerate a climate crisis.
    2️⃣ This pushes companies to figure it out. ► Things get so bad that it pushes us toward more sustainable practices and more environmentally friendly models. ► This puts A LOT of faith in companies.
    I'm sure there are far more outcomes. I won't list them all. But right now, these seem like the most likely ones.

    ---------------------

    At the very least, this means we should:
    1️⃣ Be very selective about when we work AI into the process.
    2️⃣ Talk to our students about it.
    I just had a few intense conversations about it with my college students the other day. They had many insightful thoughts! What do you think? Are you concerned? Are you optimistic?
