Improving global climate data representation


Summary

Improving global climate data representation means making climate information more accurate, detailed, and accessible so scientists, cities, and countries can better understand and respond to climate risks. This involves using new technology, data collection methods, and artificial intelligence to fill gaps in global observations and create high-resolution forecasts that help communities prepare for weather events and long-term climate changes.

  • Increase data coverage: Support efforts to gather more climate and weather observations in under-resourced regions, as this helps make forecasts more reliable worldwide.
  • Adopt advanced models: Take advantage of new AI-powered and high-resolution climate models to gain clearer, location-specific forecasts that can inform local policies and emergency planning.
  • Promote data accessibility: Encourage the use of open-source climate and weather data tools, making accurate information available to researchers, city planners, and the public for better decision-making.
Summarized by AI based on LinkedIn member posts
  • View profile for Celeste Saulo

    Secretary-General of the World Meteorological Organization

    27,116 followers

    🌐 Accurate weather forecasts save lives, protect economies, and enhance community resilience. More data means better forecasts. Yet many regions remain “blind spots.” The Systematic Observations Financing Facility (SOFF) seeks to close these gaps in the global observing system. I'm excited that new impact experiments carried out by the European Centre for Medium-Range Weather Forecasts (ECMWF) show that investing in basic weather and climate observations in under-resourced countries improves the accuracy of weather forecasts both locally and globally.

    ✅ Africa sees the greatest benefits: Forecast uncertainty decreases by more than 30 percent over Africa with new investments.
    ✅ Pacific Islands matter: Forecast uncertainty decreases by up to 20 percent in the Pacific region.
    ✅ Upper-air data is crucial: Radiosonde (weather balloon) data has an outsized impact, especially in the tropics.
    ✅ Local investment, global impact: While local improvements are observed over short timeframes (12 hours), forecast improvements extend beyond borders, benefiting people around the world.

    More details: 📎 https://lnkd.in/dN-x3-jd
    Florence Rabier Florian Pappenberger Thomas Asare Markus Repnik World Meteorological Organization

  • Every year, natural disasters hit harder and closer to home. But when city leaders ask, "How will rising heat or wildfire smoke impact my home in 5 years?"—our answers are often vague. Traditional climate models give sweeping predictions, but they fall short at the local level. It's like trying to navigate rush hour using a globe instead of a street map. That’s where generative AI comes in. This year, our team at Google Research built a new genAI method to project climate impacts—taking predictions from the size of a small state down to the size of a small city.

    Our approach provides:
    - Unprecedented detail – regional environmental risk assessments at a small fraction of the cost of existing techniques
    - Higher accuracy – reduces fine-scale errors by over 40% for critical weather variables, and reduces error in extreme heat and precipitation projections by over 20% and 10%, respectively
    - Better estimates of complex risks – demonstrates remarkable skill in capturing complex environmental risks driven by regional phenomena, such as wildfire risk from Santa Ana winds, which statistical methods often miss

    The dynamical-generative downscaling process works in two steps:
    1) Physics-based first pass: A regional climate model downscales global Earth system data to an intermediate resolution (e.g., 50 km) – much cheaper computationally than going straight to very high resolution.
    2) AI adds the fine details: Our AI-based Regional Residual Diffusion-based Downscaling model (“R2D2”) adds realistic, fine-scale details to bring the output up to the target high resolution (typically less than 10 km), based on its training on high-resolution weather data.

    Why does this matter? Governments and utilities need these hyperlocal forecasts to prepare emergency response, invest in infrastructure, and protect vulnerable neighborhoods. And this is just one way AI is turbocharging climate resilience.
    Our teams at Google are already using AI to forecast floods, detect wildfires in real time, and help the UN respond faster after disasters. The next chapter of climate action means giving every city the tools to see—and shape—their own future. Congratulations Ignacio Lopez Gomez, Tyler Russell MBA, PMP, and teams on this important work! Discover the full details of this breakthrough: https://lnkd.in/g5u_WctW PNAS Paper: https://lnkd.in/gr7Acz25
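The two-step pipeline described above can be sketched schematically. Everything here is a toy stand-in: the "physics" pass is simple grid replication rather than a regional climate model, and the "generative" residual is random noise rather than a trained diffusion model ("R2D2"). The sketch only illustrates the structure of coarse → intermediate → fine downscaling with an added residual, not Google's implementation.

```python
import numpy as np

def physics_downscale(global_field: np.ndarray, factor: int) -> np.ndarray:
    """Stand-in for the regional climate model: refine a coarse global
    field to an intermediate grid by block replication. A real dynamical
    model would solve the physics on the finer grid instead."""
    return np.kron(global_field, np.ones((factor, factor)))

def generative_residual(intermediate: np.ndarray, factor: int,
                        rng: np.random.Generator) -> np.ndarray:
    """Stand-in for the diffusion stage: upsample to the target grid and
    add fine-scale detail. Here the 'detail' is zero-mean noise; the real
    model samples physically plausible residuals conditioned on the
    intermediate field."""
    upsampled = np.kron(intermediate, np.ones((factor, factor)))
    residual = rng.normal(0.0, 0.1, size=upsampled.shape)
    return upsampled + residual

rng = np.random.default_rng(0)
coarse = rng.normal(15.0, 2.0, size=(4, 4))       # ~100 km global grid
intermediate = physics_downscale(coarse, 2)       # ~50 km intermediate pass
fine = generative_residual(intermediate, 5, rng)  # ~10 km target grid
print(coarse.shape, intermediate.shape, fine.shape)
```

Splitting the work this way is the cost insight from the post: the expensive physics model stops at an intermediate resolution, and the cheap-to-sample generative model fills in the final factor of refinement.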

  • View profile for Steve Rosenbush

    Bureau Chief, Enterprise Technology at The Wall Street Journal Leadership Institute

    7,003 followers

    In this week's column, I look at NVIDIA's new generative foundation model, which the company says enables simulations of Earth’s global climate at an unprecedented level of resolution. As is so often the case with powerful new technology, however, the question is what else humans will do with it. The company expects that climate researchers will build on top of its new AI-powered model to make climate predictions that resolve five-kilometer areas. Previous leading-edge global climate models typically don’t drill below 25 to 100 kilometers. Researchers using the new model may be able to predict conditions decades into the future with a new level of precision, providing information that could help efforts to mitigate climate change or its effects. A 5-kilometer resolution may help capture vertical movements of air in the lower atmosphere that can lead to certain kinds of thunderstorms, for example, and that might be missed by other models. And to the extent that high-resolution near-term forecasts are more accurate, the accuracy of longer-term climate forecasts will improve in turn, because the accuracy of such predictions compounds over time. The model, branded by Nvidia as cBottle for “Climate in a Bottle,” compresses the scale of Earth observation data 3,000 times and transforms it into ultra-high-resolution, queryable, and interactive climate simulations, according to Dion Harris, senior director of high-performance computing and AI factory solutions at Nvidia. It was trained on high-resolution physical climate simulations and estimates of observed atmospheric states over the past 50 years. It will take years, of course, to know just how accurate the model’s long-term predictions turn out to be. The Alan Turing Institute and the Max Planck Institute for Meteorology are actively exploring the new model, Nvidia said Tuesday at the ISC 2025 computing conference in Hamburg.
    Bjorn Stevens, director of the Max Planck Institute for Meteorology, said it “represents a transformative leap in our ability to understand, predict and adapt to the world around us.” The Earth-2 platform is in various stages of deployment at weather agencies, from NOAA (the National Oceanic and Atmospheric Administration) in the U.S. to G42, an Abu Dhabi-based holding company focused on AI, and the National Science and Technology Center for Disaster Reduction in Taiwan. Spire Global, a provider of data analytics in areas such as climate and global security, has used Earth-2 to improve the speed and cost of its weather forecasts by three orders of magnitude over the last three or four years, according to Peter Platzer, co-founder and executive chairman.
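The jump from a 25–100 km grid to a 5 km grid is easy to quantify: halving the cell size quadruples the cell count, so a 5 km global grid needs hundreds of times more cells than the coarser grids it replaces. A quick back-of-the-envelope calculation (using an approximate Earth surface area; the figures are illustrative, not from the article):

```python
EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of Earth

def cell_count(resolution_km: float) -> int:
    """Approximate number of square grid cells needed to tile the
    globe at a given horizontal resolution."""
    return round(EARTH_SURFACE_KM2 / resolution_km ** 2)

for res in (100, 25, 5):
    print(f"{res:>3} km grid: ~{cell_count(res):,} cells")
```

Going from 100 km to 5 km multiplies the cell count by a factor of (100/5)² = 400, which is why resolving thunderstorm-scale vertical motion has historically been out of reach for global models and why compressing the data 3,000-fold matters for making such simulations interactive.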

  • View profile for Andreas Horn

    Head of AIOps @ IBM || Speaker | Lecturer | Advisor

    219,276 followers

    AI for GOOD: NASA and IBM launch an open-source AI foundation model for more efficient weather and climate forecasting! 🌍 (This is what should get more spotlight, please, and NOT the next ChatGPT wrapper!)

    In collaboration with NASA, IBM just launched Prithvi WxC, an open-source, general-purpose AI model for weather- and climate-related applications. And the truly remarkable part is that this model can run on a desktop computer.

    Here's what you need to know: ⬇️
    → The Prithvi WxC model (2.3 billion parameters) can create six-hour-ahead forecasts "zero-shot" – meaning it requires no tuning and runs on readily available data.
    → The model is designed to be customized for a variety of weather applications, from predicting local rainfall to tracking hurricanes to improving global climate simulations.
    → The model was trained on 40 years of NASA’s MERRA-2 data and can be quickly tuned for specific use cases. Unlike traditional climate models that require massive supercomputers, it operates on a desktop. Its uniqueness lies in the ability to generalize from a small, high-quality sample of weather data to entire global forecasts.
    → It outperforms traditional numerical weather prediction methods in both accuracy and speed, producing global forecasts up to 10 days in advance within minutes instead of hours.
    → The model has immense potential across applications, from downscaling high-resolution climate data to improving hurricane forecasts and capturing gravity waves. It could also help estimate the extent of past floods and infer the intensity of past wildfires from burn scars.

    It will be exciting to see what downstream apps and use cases emerge. What’s clear is that this AI foundation model joins a growing family of open-source tools designed to make NASA’s vast collection of satellite, geospatial, and Earth observation data faster and easier to analyze.
    With decades of observations, NASA holds a wealth of data, but its accessibility has been limited — until recently. This model is a big step toward democratizing data and making it more accessible to all. And this is yet another proof that the future of AI is open, decentralized, and running at the edge. 🌍

    🔗 Resources:
    Download the models from the Hugging Face repository: https://lnkd.in/gp2zmkSq
    Blog post: https://ibm.co/3TDul9a
    Research paper: https://ibm.co/3TAILXG
    #AI #ClimateScience #WeatherForecasting #OpenSource #NASA #IBMResearch
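How does a model that predicts six hours ahead produce forecasts "up to 10 days in advance"? The standard approach for AI weather models is to chain single-step predictions autoregressively: feed each forecast back in as the next input. The sketch below illustrates that generic rollout pattern with a trivial stand-in step function; it is not the actual Prithvi WxC interface, whose usage is documented in the linked Hugging Face repository.

```python
import numpy as np

def six_hour_step(state: np.ndarray) -> np.ndarray:
    """Stub for one 6-hour forecast step. A real foundation model maps
    the current atmospheric state to the state 6 hours later; here we
    just apply a mild relaxation toward the field mean."""
    return 0.99 * state + 0.01 * state.mean()

def rollout(initial_state: np.ndarray, lead_days: int) -> list:
    """Autoregressively chain 6-hour steps (4 steps per day): each
    output becomes the input for the next step."""
    states = [initial_state]
    for _ in range(lead_days * 4):
        states.append(six_hour_step(states[-1]))
    return states

state0 = np.random.default_rng(1).normal(size=(16, 16))
trajectory = rollout(state0, lead_days=10)
print(len(trajectory))  # initial state + 40 six-hour steps
```

One caveat the pattern makes visible: errors compound across the 40 chained steps, which is why single-step "zero-shot" skill and 10-day skill are reported separately for such models.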

  • View profile for Andreas Rasche

    Professor and Associate Dean at Copenhagen Business School | focused on ESG and corporate sustainability

    63,868 followers

    The future of climate projections... The EU's "Destination Earth" initiative has created a Digital Twin (DT) of the Earth system. This DT is a kind of simulated "living" replica of the Earth system, designed to provide more fine-grained climate projections. Powered by the first pre-exascale supercomputers in Europe, the Climate DT is able to provide climate impact data at scales of a few kilometres (current global models resolve around 100 km). Such local granularity matters, as climate change is a global but also a very local phenomenon. The new Climate DT can bridge the gap between global (rather large-scale) climate projections and local climate impacts. Hopefully, this will support policy-making on climate adaptation and mitigation with a regional focus in mind. More info on the Climate DT: https://lnkd.in/dG3YV_kA Academic paper on Destination Earth: https://lnkd.in/dj_gjRNW #climatechange #sustainability
