Impact of large language models on climate action


Summary

Large language models (LLMs) are advanced artificial intelligence systems that process enormous amounts of data and require substantial energy, resulting in significant carbon emissions and resource consumption. The impact of large language models on climate action refers to the environmental consequences of their development and use, as well as the push for more sustainable AI practices.

  • Choose wisely: Use smaller, task-specific AI models whenever possible to minimize energy use and reduce your carbon footprint.
  • Streamline inputs: Keep prompts and responses concise to cut down on energy consumption during model operation.
  • Support transparency: Encourage organizations to publish real-time metrics on the energy and water usage of their AI models to promote accountability and sustainability.
Summarized by AI based on LinkedIn member posts
  • View profile for Marinela Profi

    Global AI and GenAI Market Strategy Lead | MBA | MS in AI | Data Scientist | Public Speaker | Advisory Board Member | LLMs alone don’t solve business problems

    9,201 followers

    Today is Earth Day. 🌏 As half of the world (myself included) is focused on how to leverage Generative AI to reinvent business processes, and the other half dreams of conquering space, I want to take a moment to raise awareness of the "dark side" of Generative AI.

    Generating one AI image can emit as much CO₂ as charging your phone, and that's just inference. Training LLMs can take weeks on GPU clusters, consuming massive energy and water resources for cooling. As shown in the chart below, image generation is the most energy-intensive task, followed by summarization and image captioning.

    The environmental toll includes:
    - Enormous energy usage ⚡️
    - High carbon emissions 💨
    - Unsustainable water consumption 💧

    Are we competing with AI for life's resources? 🤖 While Sam Altman suggests we be polite with LLMs despite their huge energy consumption (because "you never know" 🤨), sustainable AI is not a dream, and it should be a priority for both individuals and organizations. There are energy-efficient practices being researched that are proving to work quite well.

    I spent a long time educating myself on the topic (the chart below is from a presentation I delivered at the AI World Summit in NYC in December 2023), and I am sharing what I consider the best papers for learning more about this critical topic:
    1. Power Hungry Processing: Watts Driving the Cost of AI Deployment? https://lnkd.in/d7BHrcyV (this is the source of the chart)
    2. A Survey of Sustainability in Large Language Models: Applications, Economics, and Challenges https://lnkd.in/dCwaNdAR
    3. From Words to Watts: Benchmarking the Energy Costs of Large Language Model Inference https://lnkd.in/d6BNDSKU
    4. Double Jeopardy and Climate Impact in the Use of Large Language Models: Socio-economic Disparities and Reduced Utility for Non-English Speakers https://lnkd.in/dU8CRn6J

    In the work I am doing at SAS with AI, I am grateful that our approach to Generative AI was fueled from day one by key questions like: "Is it REALLY needed?" "Who is this going to help or harm?" "Does the world need another LLM, or do we need to help organizations understand how to best and most effectively leverage the existing ones?" The answers were clear, and I am proud to work for a company that is approaching generative AI with a value-first approach and delivering real-world value responsibly.

    Let's make Earth Day not just a reminder, but a call to build AI responsibly. 🌏

    #EarthDay #SustainableAI #GenerativeAI #LLM #AIEthics #GreenTech #AgenticAI #SASViya

  • View profile for Alessandro Romei

    C-Level Executive | Strategic CEO & Board Member | Driving Sustainable Growth & Global Expansion | Expert in Market Development & Business Transformation | Inspiring Leader of Multicultural, High-Performing Teams

    31,441 followers

    🌍 The Carbon Emissions of Training AI Models 🌍

    Training large language models, such as Meta's Llama 3 (70B) and OpenAI's GPT-3 (175B), generates significant CO2 emissions 🌫️. For instance, GPT-3's training emissions amount to 502 metric tons of CO2 equivalent (tCO2eq), over 500 times the emissions of a single passenger flight from New York to San Francisco. Meanwhile, Llama 3's emissions are a staggering 1,900 tCO2eq, 30 times greater than the lifetime emissions of an average car 🚗. Released in 2024, Llama 3's training emissions are almost four times those of GPT-3, highlighting the increasing environmental cost of developing more advanced AI models 📈.

    This growing computational demand underscores the urgent need for sustainable practices in technology. Unlike many other companies, Meta offsets the emissions produced by training its models 🌱. Energy-intensive AI training contributes significantly to global CO2 emissions. As we innovate, it's crucial to consider the environmental impact and work towards greener solutions 🌿. Today, the ICT sector accounts for around 2% of global CO2 emissions, with AI training contributing a substantial share.

    The future of AI development must prioritize sustainability. With around 750 million people lacking access to electricity and billions relying on polluting biomass fuels, it's essential to balance technological progress with environmental stewardship 🌍. As we push the boundaries of AI, let's also push for a sustainable future!

    #AI #Sustainability #ClimateAction #GreenTech #EnvironmentalImpact #TechForGood #ClimateChange #EcoFriendly #CarbonFootprint #Innovation #FutureTech

    Source: Visual Capitalist & arXiv & Hugging Face
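    The comparisons quoted above can be sanity-checked with a few lines of arithmetic. The tCO2eq figures are those cited in the post; the implied per-flight and per-car baselines are back-calculated from the stated ratios, not independently sourced:

```python
# Sanity-check the emission comparisons quoted in the post above.
# The 502 and 1,900 tCO2eq figures come from the post; baselines are back-calculated.

gpt3_tco2 = 502      # GPT-3 training emissions (tCO2eq)
llama3_tco2 = 1900   # Llama 3 training emissions (tCO2eq)

# "over 500 times a NY-SF passenger flight" implies roughly 1 tCO2eq per passenger
implied_flight_tco2 = gpt3_tco2 / 500

# "30 times the lifetime emissions of an average car" implies roughly 63 tCO2eq per car
implied_car_tco2 = llama3_tco2 / 30

# "almost four times those of GPT-3"
ratio = llama3_tco2 / gpt3_tco2

print(round(implied_flight_tco2, 2), round(implied_car_tco2, 1), round(ratio, 2))
# -> 1.0 63.3 3.78
```

    Both implied baselines are in line with commonly cited estimates, so the post's ratios are internally consistent.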

  • View profile for Kartik Hosanagar

    AI, Entrepreneurship, Mindfulness. Wharton professor. Cofounder Yodle, Jumpcut

    20,137 followers

    Over the next three years, the U.S. will need the equivalent of three New York Cities' worth of energy to support AI. As someone who cares about climate change, how can you reduce the environmental impact of your LLM use?

    Energy use by LLMs depends on model size (number of parameters), the length of your input and the model's output (input and output tokens), and model optimizations like the pruning seen in newer models. Here's how to reduce energy usage:

    - Text summarization or quick information retrieval: smaller models like GPT-4o Mini or o1-mini are sufficient. These tasks don't require the full power of larger models.
    - Creative writing or complex analysis: for tasks requiring nuance, opt for GPT-4o. However, consider whether splitting the task into smaller, simpler components might allow you to use a smaller model.
    - Testing and experimentation: if you're experimenting, start with a smaller model (GPT-4o Mini or o1-mini). Upgrade if the results are insufficient.
    - Developers accessing models through the API: smaller models are not only more energy-efficient but also more cost-effective. Start with those (or break up your task into components and mix and match models of different sizes).

    More details on LLM use for the climate conscious here: https://lnkd.in/eT-HHk8T
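    The "start small, upgrade only if needed" strategy above can be sketched in a few lines. This is a hypothetical illustration, not a real API: the model names, the per-token energy proxy, and the figures are all assumptions chosen only to show how size and token count drive the estimate.

```python
# Hypothetical sketch: route tasks to the smallest adequate model and
# estimate relative energy as (parameters x total tokens).
# Model entries and energy coefficients are illustrative, not measured values.

MODELS = {
    "small": {"params_b": 8, "joules_per_b_param_token": 0.001},
    "large": {"params_b": 175, "joules_per_b_param_token": 0.001},
}

def estimate_energy_j(model: str, input_tokens: int, output_tokens: int) -> float:
    """Rough proxy: energy grows with model size and total token count."""
    m = MODELS[model]
    return m["params_b"] * (input_tokens + output_tokens) * m["joules_per_b_param_token"]

def choose_model(task: str) -> str:
    """Start small for routine tasks; reserve the large model for nuanced work."""
    routine = {"summarization", "retrieval", "classification"}
    return "small" if task in routine else "large"

model = choose_model("summarization")
saving = 1 - estimate_energy_j("small", 500, 200) / estimate_energy_j("large", 500, 200)
print(model, round(saving, 3))  # -> small 0.954
```

    Under this proxy, routing a summarization task to the small model cuts estimated energy by about 95%, which is why starting small and upgrading only on failure is the energy-conscious default.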

  • View profile for Chris Stokel-Walker

    Journalist and communicator specialising in tech, AI, and digital culture (including YouTube and TikTok)

    4,757 followers

    Take your pick: compute power or planet. New research I covered for Fast Company reveals that squeezing a few extra percentage points of accuracy out of large language models can multiply their carbon footprint by 70x. The study tested 14 popular models and showed wild differences in emissions. Companies ought to publish real-time energy and water metrics, and we as users should match the tool to the task: don't summon a giant model to check a date in history. But as I explained in the video, I think bad branding (*cough* OpenAI) plays a role... https://lnkd.in/e5fvRUQi

    #AI #Sustainability #GreenAI #ResponsibleAI #ClimateTech #LLM #TechForGood #CarbonFootprint #EnergyEfficiency

  • View profile for Adam Savitz

    Global Sustainability Leader & Senior Advisor

    8,181 followers

    "Generative AI’s annual energy footprint is already equivalent to that of a low-income country, and it is growing exponentially. To make AI more sustainable, we need a paradigm shift in how we use it, and we must educate consumers about what they can do to reduce their environmental impact." - Tawfik Jelassi, Director-General for Communication and Information, UNESCO

    The research into the environmental impact of AI continues... New research published by UNESCO and UCL shows that small changes to how #LargeLanguageModels are built and used can dramatically reduce #energyconsumption without compromising performance. The report advocates a pivot away from resource-heavy #AI models in favour of more compact ones. Used together, these measures could reduce #energy consumption by up to 90%:

    1️⃣ Smaller models can be just as smart and accurate as large ones: small models tailored to specific tasks can cut energy use by up to 90%
    2️⃣ Shorter, more concise prompts and responses can reduce energy use by over 50%
    3️⃣ Model compression can save up to 44% in energy

    Read the report: https://lnkd.in/gSAmw4gr Smarter, smaller, stronger: resource-efficient generative AI & the future of digital transformation

    #LLMs #GenAI #sustainability #climateaction
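    As a back-of-envelope illustration of how independent measures stack, the remaining energy after each measure can be multiplied together. This compounding assumption is mine, not the report's methodology, and the percentages are the headline figures quoted above:

```python
# Back-of-envelope sketch (my assumption, not the UNESCO/UCL methodology):
# if measures act independently, combined energy use is the product of the
# remaining fractions after each measure is applied.

def combined_saving(*savings: float) -> float:
    """Return total fractional saving when each measure keeps (1 - s) of energy."""
    remaining = 1.0
    for s in savings:
        remaining *= (1.0 - s)
    return 1.0 - remaining

# e.g. shorter prompts (50% saving) combined with model compression (44% saving)
print(round(combined_saving(0.50, 0.44), 2))  # -> 0.72
```

    Even without switching to a smaller model, the two cheaper measures alone would cut energy use by roughly 72% under this compounding assumption, which is consistent with the report's headline claim that the full set of measures can reach up to 90%.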

  • View profile for Navveen Balani

    LinkedIn Top Voice | Google Cloud Fellow | Chair - Standards Working Group @ Green Software Foundation | Driving Sustainable AI Innovation & Specification | Award-winning Author | Let's Build a Responsible Future

    11,681 followers

    🌍💡 Green AI – The First Step Toward Awareness and Action 💡🌍

    Artificial Intelligence (AI) holds immense potential to solve global challenges, but we cannot overlook its environmental impact. As AI models grow larger, especially Large Language Models (LLMs) and Generative AI systems, their energy demands and carbon emissions scale significantly. Green AI focuses on addressing this by reducing the carbon footprint and energy consumption of AI development and deployment.

    To raise awareness and encourage action, I’ve created a Sustainable AI repository 👉 https://lnkd.in/dWBRfEEu. This open community resource is designed to help AI practitioners minimize the environmental footprint of their AI workloads.

    📊 Features of the repo:
    1. 🔧 Tools to monitor and reduce energy consumption and carbon emissions.
    2. 📚 Research papers on the latest developments in Green AI (2022-2024), including the impact of LLMs and Generative AI.
    3. 📐 References to methodologies and standards that guide sustainable AI practices, such as energy-efficient model training, carbon-aware software development, and tools for measuring the environmental impact of AI workloads.

    This is just the first step. The real power of Green AI lies in community collaboration. You are invited to contribute, share your ideas, and help build a sustainable AI future together. Feel free to create a pull request and join this open community. 🌱

    #GreenAI #SustainableAI #AIforGood #Sustainability #LLMs #GenerativeAI #CarbonAwareness #AI #TechForGood #ClimateAction #OpenCommunity #GreenSoftware

  • View profile for Angel Hsu, PhD

    Associate Professor at University of North Carolina at Chapel Hill

    4,375 followers

    🌍 White paper alert: check out my white paper written for the Anwar Gargash Diplomatic Academy, "How Artificial Intelligence Can Accelerate Global Climate Action." https://lnkd.in/eXXucw_X

    Nearly a year after COP28 in Dubai marked the conclusion of the Paris Agreement's First Global Stocktake, a key challenge emerged: managing the vast and varied data sources that required consolidation and analysis. I was asked to assess the potential of AI in tackling this complexity, specifically in integrating diverse types of climate data and information, spanning from earth observations and physical climate metrics to policy documents, sociodemographic insights, and individual-level data. I explored this through three case applications (although there are many, many more; check out climatechange.ai for a great wiki cataloguing AI-climate applications). Some key findings:

    🌍 AI has the power to fill in crucial data gaps that slow down climate action, especially for non-state and subnational actors. These groups play key roles but often go underreported. With AI-driven tools for tracking, analysis, and policy evaluation, we can better integrate their contributions and push forward the goals of the Paris Agreement.

    📊 Enhancing Emissions Tracking: Machine learning (ML) is a game-changer for emissions tracking, particularly in challenging areas like land use and urban emissions. Advanced data integration can bring greater accuracy to GHG measurements, and predictive models can even forecast future emissions to support international transparency standards.

    🌧️ AI for Risk Assessment & Adaptation: From flood risks to urban resilience, AI is proving invaluable in risk analysis. Tools like computer vision and NLP track and evaluate adaptation efforts, helping us anticipate and manage climate risks with greater precision.

    ⚠️ Challenges Remain: Despite AI's immense potential, we face hurdles like transparency, bias, and the high energy use of AI models. I stress the need for human-centered design, diverse data sources, and clear protocols to ensure AI is used fairly, ethically, and sustainably.

    Looking forward to hearing your thoughts! #climateaction #cop29 #AI #NLP #machinelearning #earthobservation #globalstocktake
