Building Robust Climate Models


Summary

Building robust climate models means creating computer simulations that can accurately predict how weather and climate patterns will change over time, even in the face of uncertainty and complex variables. Recent advances combine artificial intelligence with traditional physics-based methods to produce higher-resolution, faster, and more reliable forecasts for both global and regional scenarios.

  • Combine approaches: Use AI alongside physics-based climate models to increase detail and reliability in predictions, especially for extreme events.
  • Design for scale: Choose modeling architectures that match the Earth's geometry and allow for long-term projections without distortions or errors.
  • Prioritize efficiency: Adopt machine learning tools that can quickly create high-resolution regional climate data, saving significant time and computing resources.
  • Gopal Erinjippurath

    AI builder 🌎 | CTO and founder | data+space angel


    Climate models have long struggled with coarse resolution, limiting precise climate risk insights. But AI-driven methods are now changing this, unlocking more detailed intelligence than traditional physics-based approaches.

    I recently spoke with a research scientist at Google Research who highlighted a promising new hybrid approach. It combines physics-based General Circulation Models (GCMs) with AI refinement, significantly improving resolution. The process starts with Regional Climate Models (RCMs) anchoring physical consistency at ~45 km resolution. A diffusion model, R2-D2, then enhances the output resolution to 9 km, making the estimates more suitable for projecting extreme climate events.

    🔥 About R2-D2
    R2-D2 (Regional Residual Diffusion-based Downscaling) is a diffusion model trained on the residuals between RCM outputs and high-resolution targets. Conditioned on physical inputs such as coarse climate fields and terrain, it rapidly generates high-resolution climate maps (~800 fields/hour on GPUs), complete with uncertainty estimates.

    ✅ Why this matters
    - Offers detailed projections of extreme climate events for precise risk quantification.
    - Delivers probabilistic forecasts, improving risk modeling and scenario planning.
    - Provides another high-resolution modeling approach, enriching ensemble strategies for climate risk projections.

    👉 Read the full paper: https://lnkd.in/gU6qmZTR
    👉 An excellent explainer blog: https://lnkd.in/gAEJFEV2

    If your work involves climate risk assessment, adaptation planning, or quantitative modeling, how are you leveraging high-resolution risk projections?
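The residual framing behind R2-D2 can be sketched independently of the diffusion machinery: the model learns only the difference between a cheap upsampling of the coarse field and the high-resolution target, and inference adds a sampled residual back onto that baseline. A minimal NumPy sketch of this idea, where the grid sizes, upsampling factor, and the `nearest_upsample` helper are illustrative assumptions, not details from the paper:

```python
import numpy as np

def nearest_upsample(field: np.ndarray, factor: int) -> np.ndarray:
    """Cheap baseline upsampling of a coarse 2-D field (e.g. 45 km -> 9 km grid)."""
    return np.repeat(np.repeat(field, factor, axis=0), factor, axis=1)

def make_residual_target(coarse: np.ndarray, highres: np.ndarray, factor: int) -> np.ndarray:
    """Training target for a residual downscaler: what the baseline misses."""
    return highres - nearest_upsample(coarse, factor)

def reconstruct(coarse: np.ndarray, predicted_residual: np.ndarray, factor: int) -> np.ndarray:
    """At inference, add the (e.g. diffusion-sampled) residual onto the baseline."""
    return nearest_upsample(coarse, factor) + predicted_residual

# Toy check on random fields: a 2x2 coarse field downscaled 5x.
rng = np.random.default_rng(0)
coarse = rng.normal(size=(2, 2))
highres = rng.normal(size=(10, 10))
residual = make_residual_target(coarse, highres, factor=5)
# A perfect residual prediction recovers the high-res field exactly.
assert np.allclose(reconstruct(coarse, residual, factor=5), highres)
```

Learning the residual rather than the full field means the generative model only has to capture fine-scale detail the coarse simulation lacks, which is typically easier to train.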

  • Anima Anandkumar

    Further progress in AI+climate modeling: "Applying the ACE2 Emulator to SST Green's Functions for the E3SMv3 Global Atmosphere Model".

    Building on the ACE2 model, which uses our spherical Fourier neural operator (SFNO) architecture, this work shows that ACE2 can replicate climate-model responses to sea surface temperature perturbations with high fidelity at a fraction of the cost. This accelerates climate sensitivity research and helps us better understand radiative feedbacks in the Earth system.

    Background: The SFNO architecture was first used to train the FourCastNet weather model, whose latest version (v3) has state-of-the-art probabilistic calibration.

    AI+Science is not just about blindly applying the standard transformer/CNN "hammer". It is about carefully designing neural architectures that incorporate domain constraints such as geometry and multiple scales, while remaining expressive and easy to train. SFNO accomplishes both: it incorporates multiple scales and it respects the spherical geometry, which is critical for success in climate modeling.

    Unlike short-term weather forecasting, which requires only a few autoregressive steps per rollout, climate modeling requires long rollouts of thousands of time steps or more. Other AI-based models, including Pangu and GraphCast, fail at long-term climate modeling because they ignore the spherical geometry: since these models assume the domain is a rectangle, distortions build up at the poles and lead to catastrophic failures. Structure matters in AI+Science!
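The core Fourier-operator idea behind this family of models, and why long autoregressive rollouts amplify any per-step error, can be illustrated with a toy 1-D analogue. This sketch uses a plain FFT on a periodic interval purely for illustration; the actual SFNO replaces the FFT with a spherical harmonic transform precisely so the operator respects the sphere rather than a periodic rectangle. All names and sizes here are assumptions for the demo, not the real architecture:

```python
import numpy as np

def spectral_layer(u: np.ndarray, weights: np.ndarray, n_modes: int) -> np.ndarray:
    """Toy Fourier-operator layer on a periodic 1-D domain:
    transform, apply learned per-mode multipliers to the lowest
    n_modes, truncate the rest, and transform back."""
    u_hat = np.fft.rfft(u)
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]
    return np.fft.irfft(out_hat, n=u.size)

def rollout(u0: np.ndarray, weights: np.ndarray, n_modes: int, steps: int) -> np.ndarray:
    """Autoregressive rollout: climate-length runs apply the operator
    thousands of times, so any per-step geometric distortion compounds."""
    u = u0
    for _ in range(steps):
        u = u + spectral_layer(u, weights, n_modes)  # residual update
    return u

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u0 = np.sin(x)
# Identity weights on the retained modes leave a band-limited field unchanged.
w = np.ones(8, dtype=complex)
u1 = spectral_layer(u0, w, n_modes=8)
assert np.allclose(u1, u0, atol=1e-10)
```

On the sphere, the analogous "transform, weight, inverse-transform" uses spherical harmonics, so the poles are treated consistently with the geometry instead of being stretched as the top edge of a rectangle.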

  • Jozef Pecho

    Climate/NWP Model & Data Analyst at Floodar (Meratch), GOSPACE LABS | Predicting floods, protecting lives


    🌍 Climate scientists often face a trade-off: Global Climate Models (GCMs) are essential for long-term climate projections, but they operate at coarse spatial resolution, making them too crude for regional or local decision-making. To get fine-scale data, researchers use Regional Climate Models (RCMs). These add crucial spatial detail but come at a very high computational cost, often requiring supercomputers to run for months.

    ➡️ A new paper introduces EnScale, a machine learning framework that offers an efficient and accurate alternative to running full RCM simulations. Instead of solving the complex physics from scratch, EnScale "learns" the relationship between GCMs and RCMs by training on existing paired datasets. It then generates high-resolution, realistic, and diverse regional climate fields directly from GCM inputs.

    What makes EnScale stand out?
    ✅ It uses a generative ML model trained with a statistically principled loss (the energy score), enabling probabilistic outputs that reflect natural variability and uncertainty.
    ✅ It is multivariate: it learns to generate temperature, precipitation, radiation, and wind jointly, preserving spatial and cross-variable coherence.
    ✅ It is computationally lightweight: training and inference are up to 10–20× faster than state-of-the-art generative approaches.
    ✅ It includes an extension (EnScale-t) for generating temporally consistent time series, a must for studying events like heatwaves or prolonged droughts.

    This approach opens the door to faster, more flexible generation of regional climate scenarios, essential for risk assessment, infrastructure planning, and climate adaptation, especially where computational resources are limited.

    📄 Read the full paper: EnScale: Temporally-consistent multivariate generative downscaling via proper scoring rules ---> https://lnkd.in/dQr5rmWU (code: https://lnkd.in/dQk_Jv8g)

    👏 Congrats to the authors on a strong step forward for ML-based climate modeling!
    #climateAI #downscaling #generativeAI #machinelearning #climatescience #EnScale #RCM #GCM #ETHZurich #climatescenarios
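The energy score mentioned above is a proper scoring rule for multivariate probabilistic forecasts: in expectation it is minimized by the true distribution, which is what makes it a principled training loss. A minimal empirical estimator of its textbook form, ES = E‖X − y‖ − ½·E‖X − X′‖, is sketched below; this is the standard ensemble estimator, not EnScale's actual training code, and the field sizes are made up for the demo:

```python
import numpy as np

def energy_score(samples: np.ndarray, observation: np.ndarray) -> float:
    """Empirical energy score for an ensemble forecast.
    samples: (m, d) ensemble members; observation: (d,) verifying field.
    ES = mean ||x_i - y|| - (1 / 2m^2) * sum_ij ||x_i - x_j||.
    Lower is better; the score rewards both accuracy and calibrated spread."""
    m = samples.shape[0]
    term1 = np.mean(np.linalg.norm(samples - observation, axis=1))
    diffs = samples[:, None, :] - samples[None, :, :]
    term2 = np.sum(np.linalg.norm(diffs, axis=-1)) / (2 * m * m)
    return float(term1 - term2)

rng = np.random.default_rng(0)
y = rng.normal(size=16)                       # "observed" field (flattened grid)
sharp = y + 0.1 * rng.normal(size=(50, 16))   # ensemble concentrated near y
broad = y + 2.0 * rng.normal(size=(50, 16))   # overdispersed ensemble
# The well-calibrated, sharper ensemble scores lower (better).
assert energy_score(sharp, y) < energy_score(broad, y)
```

Because the score depends on joint distances between whole fields, it naturally penalizes ensembles that get each variable right marginally but break spatial or cross-variable coherence, which is why it suits multivariate downscaling.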
