Every year, natural disasters hit harder and closer to home. But when city leaders ask, "How will rising heat or wildfire smoke impact my home in 5 years?", our answers are often vague. Traditional climate models give sweeping predictions, but they fall short at the local level. It's like trying to navigate rush hour using a globe instead of a street map.

That's where generative AI comes in. This year, our team at Google Research built a new genAI method to project climate impacts, taking predictions from the size of a small state down to the size of a small city. Our approach provides:

- Unprecedented detail: regional environmental risk assessments at a small fraction of the cost of existing techniques
- Higher accuracy: fine-scale errors reduced by over 40% for critical weather variables, with errors in extreme heat and precipitation projections cut by over 20% and 10%, respectively
- Better estimates of complex risks: remarkable skill in capturing complex environmental risks driven by regional phenomena, such as wildfire risk from Santa Ana winds, which statistical methods often miss

The dynamical-generative downscaling process works in two steps (a toy sketch of the residual idea follows this post):

1) Physics-based first pass: a regional climate model downscales global Earth system data to an intermediate resolution (e.g., 50 km), which is much cheaper computationally than going straight to very high resolution.
2) AI adds the fine details: our AI-based Regional Residual Diffusion-based Downscaling model ("R2D2") adds realistic, fine-scale details to bring the output up to the target high resolution (typically less than 10 km), based on its training on high-resolution weather data.

Why does this matter? Governments and utilities need these hyperlocal forecasts to prepare emergency response, invest in infrastructure, and protect vulnerable neighborhoods.

And this is just one way AI is turbocharging climate resilience. Our teams at Google are already using AI to forecast floods, detect wildfires in real time, and help the UN respond faster after disasters. The next chapter of climate action means giving every city the tools to see, and shape, their own future.

Congratulations Ignacio Lopez Gomez, Tyler Russell MBA, PMP, and teams on this important work!

Discover the full details of this breakthrough: https://lnkd.in/g5u_WctW
PNAS Paper: https://lnkd.in/gr7Acz25
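To make the residual decomposition behind the two-step approach concrete, here is a minimal, illustrative sketch. It is not the R2D2 implementation; it only shows how a generative model can be trained on the residual between an interpolated intermediate-resolution field and a high-resolution target, and how a sampled residual is added back at inference. The grid sizes, the random fields, and the `predict_residual` stand-in are all assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import zoom

# Toy grids: intermediate resolution (e.g., ~50 km) and target high resolution (e.g., ~10 km).
# A 5x upsampling factor stands in for the 50 km -> 10 km step described in the post.
coarse = np.random.rand(40, 40)             # intermediate-resolution field from the regional model
high_res_truth = np.random.rand(200, 200)   # hypothetical high-resolution training target

# Step 1 analogue: bring the intermediate field onto the target grid by interpolation.
upsampled = zoom(coarse, 5, order=1)        # bilinear upsampling, shape (200, 200)

# Training target for the generative model: only the fine-scale residual,
# i.e., the structure that interpolation alone cannot recover.
residual_target = high_res_truth - upsampled

def predict_residual(upsampled_field):
    """Stand-in for a trained diffusion model that samples fine-scale residuals.

    In a real system this would be a learned generative model conditioned on
    the upsampled field; zeros are returned here only to keep the sketch runnable.
    """
    return np.zeros_like(upsampled_field)

# Step 2 analogue: add the generated residual back onto the interpolated field.
downscaled = upsampled + predict_residual(upsampled)
print(downscaled.shape)  # (200, 200), the target high-resolution grid
```

Predicting only the residual keeps the learning problem focused on the fine-scale structure the physics-based first pass cannot resolve, rather than asking the generative model to reproduce the whole field.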
Using existing data in climate predictions
Explore top LinkedIn content from expert professionals.
Summary
Using existing data in climate predictions means applying past and current weather and climate information—collected from satellites, sensors, and historical records—to forecast future climate conditions. This approach allows researchers and decision makers to create more accurate and detailed predictions about weather events, environmental risks, and long-term climate trends, especially with the help of artificial intelligence and advanced statistical techniques.
- Integrate historical records: Combine decades of weather data with modern climate models to train AI systems and improve local and global forecasts.
- Apply statistical analysis: Use new methods to balance, weigh, and merge different climate models for projections that better reflect real-world conditions.
- Empower rapid decision-making: Rely on AI-driven forecasts to deliver instant weather predictions, enabling faster responses to severe events and infrastructure planning.
Scientists Combine Climate Models For More Accurate Projections
Shared technical article: https://lnkd.in/ga-82Kaw
Shared paper: https://lnkd.in/gHFTDAYj

"Researchers... have created a new method for statistically analyzing climate models that projects future conditions with more fidelity. The method provides a way to adjust for models with high temperature sensitivities—a known problem in the community. By assigning different weights to models and combining them, the researchers estimate that the global temperature will increase between 2 and 5° Celsius by the end of the century. This projection, published in Nature Communications Earth & Environment [link above], aligns with previous projections, although this novel framework is more inclusive, avoiding the rejection of models that was common practice in previous methods...

A key parameter for these models—known as equilibrium climate sensitivity or ECS—describes the relationship between change in carbon dioxide and corresponding warming. Although the Earth system has a true ECS, it is not a measurable quantity. Different lines of evidence can provide a plausible picture of the Earth's true ECS, which can alleviate the uncertainty of simulation models. However, many models assume a high ECS and predict higher temperatures in response to more atmospheric carbon dioxide than occurs in the real Earth system. Because these models provide estimates about future conditions to scientists and policymakers, it is important to ensure that they represent the conditions of the Earth as faithfully as possible.

Previous methods mitigated this issue by eliminating models with a high ECS value. "That was a heavy-handed approach," said Massoud. "The models that were thrown out might have good information that we need, especially for understanding the extreme ends of things."

"Instead, we adopted a tool called Bayesian Model Averaging, which is a way to combine models with varying influence when estimating their distribution," said Massoud. "We used this to constrain the ECS on these models, which enabled us to project future conditions without the 'hot model problem.'"...

This new method provides a framework for how to best understand a collection of climate models. The model weights included in this research informed the Fifth National Climate Assessment, a report released on Nov. 14 that gauges the impacts of climate change in the United States. This project also supports the Earth System Grid Federation, an international collaboration led in the U.S. by DOE that manages and provides access to climate models and observed data…"

#GIS #spatial #mapping #climatechange #spatialanalysis #spatiotemporal #model #modeling #numericmodeling #global #statistics #weighting #bayesian #modelaveraging #climatesensitivity #climatemodels #projection #ECS #earthsystem #ORNL
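The model-weighting idea in the excerpt can be illustrated with a small, hedged sketch: weight each climate model by how plausible its ECS is under an observationally informed distribution, then combine the models' projections with those weights instead of discarding high-ECS models. The ECS values, the assumed observational distribution, and the projections below are invented for illustration; this is not the paper's actual procedure or data.

```python
import numpy as np

# Hypothetical ensemble: each model's equilibrium climate sensitivity (ECS, °C)
# and its end-of-century warming projection (°C). Values are made up.
ecs = np.array([2.5, 3.0, 3.7, 4.8, 5.6])
projection = np.array([2.1, 2.6, 3.2, 4.4, 5.1])

# Assumed observational constraint on ECS, expressed as a Gaussian (mean 3.0, sd 0.7).
# Models whose ECS sits far from this distribution get less weight
# rather than being rejected outright.
obs_mean, obs_sd = 3.0, 0.7
likelihood = np.exp(-0.5 * ((ecs - obs_mean) / obs_sd) ** 2)

# Posterior-style weights (uniform prior over models), normalized to sum to 1.
weights = likelihood / likelihood.sum()

# Weighted ensemble projection and a simple weighted spread.
mean_projection = np.sum(weights * projection)
spread = np.sqrt(np.sum(weights * (projection - mean_projection) ** 2))
print(f"weights: {np.round(weights, 3)}")
print(f"weighted projection: {mean_projection:.2f} +/- {spread:.2f} °C")
```

The point of the weighting is the one Massoud makes in the quote: high-ECS models are down-weighted rather than thrown out, so their information still contributes, especially at the extreme ends of the distribution.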
AI for GOOD: NASA and IBM launch open-source AI foundation model for more efficient weather and climate forecasting! 🌍 (This is what should get more spotlight, please, and NOT the next ChatGPT wrapper!)

In collaboration with NASA, IBM just launched Prithvi WxC, an open-source, general-purpose AI model for weather and climate-related applications. And the truly remarkable part is that this model can run on a desktop computer.

Here's what you need to know: ⬇️

→ The Prithvi WxC model (2.3 billion parameters) can create six-hour-ahead forecasts as a "zero-shot" skill, meaning it requires no tuning and runs on readily available data.
→ This AI model is designed to be customized for a variety of weather applications, from predicting local rainfall to tracking hurricanes or improving global climate simulations.
→ The model was trained on 40 years of NASA's MERRA-2 data and can now be quickly tuned for specific use cases. And unlike traditional climate models that require massive supercomputers, this one operates on a desktop. Its uniqueness lies in the ability to generalize from a small, high-quality sample of weather data to entire global forecasts.
→ This AI-powered model outperforms traditional numerical weather prediction methods in both accuracy and speed, producing global forecasts up to 10 days in advance within minutes instead of hours.
→ The model has immense potential for various applications, from downscaling high-resolution climate data to improving hurricane forecasts and capturing gravity waves. It could also help estimate the extent of past floods, forecast hurricanes, and infer the intensity of past wildfires from burn scars.

It will be exciting to see what downstream apps, use cases, and potential applications emerge. What's clear is that this AI foundation model joins a growing family of open-source tools designed to make NASA's vast collection of satellite, geospatial, and Earth observational data faster and easier to analyze. With decades of observations, NASA holds a wealth of data, but its accessibility has been limited — until recently. This model is a big step toward democratizing data and making it more accessible to all.

And this is yet another proof that the future of AI is open, decentralized, and running at the edge. 🌍

🔗 Resources:
Download the models from the Hugging Face repository: https://lnkd.in/gp2zmkSq
Blog post: https://ibm.co/3TDul9a
Research paper: https://ibm.co/3TAILXG

#AI #ClimateScience #WeatherForecasting #OpenSource #NASA #IBMResearch
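A natural follow-up to "quickly tuned for specific use cases" is what fine-tuning a foundation model typically looks like in code. Below is a minimal, generic PyTorch sketch of the common pattern: freeze a pretrained backbone and train a small task head for a downstream field (e.g., local precipitation). It does not use the actual Prithvi WxC API; `PretrainedBackbone`, `DownscalingHead`, and all shapes are hypothetical stand-ins.

```python
import torch
import torch.nn as nn

class PretrainedBackbone(nn.Module):
    """Hypothetical stand-in for a pretrained weather foundation-model encoder.

    A real workflow would load published weights (e.g., from a Hugging Face
    repository) instead of this randomly initialized placeholder.
    """
    def __init__(self, in_channels=8, embed_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, embed_dim, kernel_size=3, padding=1),
            nn.GELU(),
            nn.Conv2d(embed_dim, embed_dim, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.encoder(x)

class DownscalingHead(nn.Module):
    """Small task-specific head mapping backbone features to one output field."""
    def __init__(self, embed_dim=128):
        super().__init__()
        self.head = nn.Conv2d(embed_dim, 1, kernel_size=1)

    def forward(self, features):
        return self.head(features)

backbone = PretrainedBackbone()
head = DownscalingHead()

# Freeze the backbone and train only the lightweight head: the usual
# low-cost fine-tuning recipe the post alludes to.
for p in backbone.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy batch standing in for reanalysis inputs and a local target field.
x = torch.randn(4, 8, 64, 64)   # (batch, variables, lat, lon)
y = torch.randn(4, 1, 64, 64)   # downstream target, e.g., precipitation

for step in range(3):  # a few illustrative steps, not a real training run
    pred = head(backbone(x))
    loss = loss_fn(pred, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.4f}")
```

Because only the small head is trained, this kind of adaptation can run on modest hardware, which is consistent with the post's emphasis on desktop-scale use.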
AI Replaces Supercomputers in Weather Forecasting with Instant Predictions

A Forecasting Revolution in One Second
New research reveals that artificial intelligence can now deliver weather forecasts in seconds on a desktop—matching the accuracy of traditional models that require hours or days on supercomputers. This breakthrough marks a dramatic shift in meteorology, where the reliance on physics-based numerical weather prediction (NWP) models—unchanged in principle since the 1950s—is being replaced by AI-driven forecasting that is faster, cheaper, and far more energy-efficient.

How AI Is Disrupting Traditional Forecasting
Historically, weather predictions have depended on vast data inputs from satellites, balloons, and weather stations. These observations are fed into complex NWP models that simulate the atmosphere based on physical laws, requiring massive computational resources.
• Heavy Computing Burden: Running these models demands high-performance supercomputers, consuming significant time, power, and budget.
• AI as a Lightweight Alternative: The new AI model runs in a single second on desktop hardware, offering comparable forecast accuracy without the need for physics-based simulation.
• Machine Learning Core: Rather than modeling physical processes, the AI learns directly from decades of historical data to detect patterns and predict atmospheric conditions.

From Patchwork Improvements to Full Replacement
The journey toward fully AI-driven forecasting began with hybrid models. Google researchers developed AI tools that could optimize select components of traditional systems, reducing computational loads. DeepMind, Google's AI subsidiary, went further by creating graph-based models that completely replaced the forecasting process.
• European Adoption: The European Centre for Medium-Range Weather Forecasts (ECMWF) has already begun using AI-based systems, marking one of the first institutional adoptions of this new approach.
• ForecastNet and GraphCast: These AI models use neural networks trained on historical weather data to predict temperature, pressure, precipitation, and wind with high spatial and temporal resolution.

Why It Matters
AI's success in weather forecasting is not just a technological achievement—it's a paradigm shift with implications for science, society, and sustainability.
• Energy Efficiency: AI reduces the carbon footprint of weather forecasting by orders of magnitude—critical in an era of climate awareness.
• Faster Response for Emergencies: Rapid forecasts can assist governments and aid agencies in responding to severe weather, such as hurricanes, wildfires, and floods, in near real time.

This milestone signals a profound transformation in how we understand and anticipate weather. By turning historical data into actionable predictions in seconds, AI is not just optimizing forecasts—it is rewriting the very foundations of meteorology.
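As a concrete (and heavily simplified) picture of the "Machine Learning Core" described above, the sketch below trains a small convolutional network to map an atmospheric state to the state a few hours later, purely from example pairs rather than physical equations. The random arrays stand in for reanalysis snapshots; the architecture, grid size, and forecast horizon are assumptions for illustration and do not represent any of the production systems mentioned in the post.

```python
import torch
import torch.nn as nn

# Toy "reanalysis" dataset: pairs of (state at time t, state at time t + 6h).
# Each state holds a few variables (e.g., temperature, pressure, winds) on a
# coarse lat/lon grid. Random data here; a real emulator trains on decades of
# gridded historical fields.
n_samples, n_vars, lat, lon = 64, 4, 32, 32
x_t = torch.randn(n_samples, n_vars, lat, lon)
x_next = x_t + 0.1 * torch.randn_like(x_t)   # weakly perturbed "future" states

class StepEmulator(nn.Module):
    """Predicts the next atmospheric state from the current one."""
    def __init__(self, n_vars):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_vars, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, n_vars, kernel_size=3, padding=1),
        )

    def forward(self, x):
        # Predict the increment and add it to the input (a residual step),
        # a common, stable choice for learned time-stepping.
        return x + self.net(x)

model = StepEmulator(n_vars)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(3):  # a few epochs just to show the training loop
    pred = model(x_t)
    loss = loss_fn(pred, x_next)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")

# Inference: roll the one-step model forward to emulate a multi-step forecast.
state = x_t[:1]
with torch.no_grad():
    for _ in range(4):          # e.g., 4 x 6h = a 24-hour forecast
        state = model(state)
```

Once trained, a single forward pass replaces a full physics integration for that time step, which is where the speed and energy savings described in the post come from.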