Using Technology in Scientific Research

Explore top LinkedIn content from expert professionals.

  • Matt Forrest

    🌎 Helping geospatial professionals grow using technology · Scaling geospatial at Wherobots

    71,606 followers

    ☁️ Clouds block satellites. Floods don't care. Here's how foundation models are being adapted to see through the storm.

    During extreme weather events like floods, clouds often block optical satellites from capturing usable data. And that's exactly when timely insight matters most. Originally shared by Heather Couture, PhD, this study tackled the problem head-on. It adapted the Prithvi foundation model, originally trained on optical imagery, by incorporating Synthetic Aperture Radar (SAR) to detect floods across the UK and Ireland.

    ✅ SAR can "see" through clouds
    ✅ Fine-tuning the model with SAR bands boosted flood segmentation accuracy from 0.58 to 0.79
    ✅ Even small amounts of local data were enough to adapt the model to new regions

    This research shows that Earth Observation foundation models can be effectively adapted for disaster response, even in data-scarce areas, and that AI can be useful for real-world problems.

    🌎 I'm Matt and I talk about modern GIS, AI, and how geospatial is changing.
    📬 Want more like this? Join 6k+ others learning from my newsletter → forrest.nyc
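    For intuition, here is a minimal sketch of the general adaptation technique: widening a pretrained model's first layer so it accepts extra SAR input channels before fine-tuning. The module names, band counts, and zero-initialization choice are illustrative assumptions, not the study's actual code.

```python
# Minimal sketch: adapting a pretrained optical-only segmentation model to
# accept extra SAR input channels (illustrative, not the paper's code).
import torch
import torch.nn as nn

def expand_input_channels(conv: nn.Conv2d, extra: int) -> nn.Conv2d:
    """Widen a pretrained first conv layer to take `extra` new input bands."""
    new_conv = nn.Conv2d(conv.in_channels + extra, conv.out_channels,
                         conv.kernel_size, conv.stride, conv.padding,
                         bias=conv.bias is not None)
    with torch.no_grad():
        new_conv.weight[:, :conv.in_channels] = conv.weight  # keep optical weights
        new_conv.weight[:, conv.in_channels:] = 0.0          # new SAR bands start at zero
        if conv.bias is not None:
            new_conv.bias.copy_(conv.bias)
    return new_conv

# e.g. 6 pretrained optical bands, plus 2 hypothetical SAR bands (VV, VH)
stem = nn.Conv2d(6, 64, kernel_size=3, padding=1)  # stands in for the model's input stem
stem = expand_input_channels(stem, extra=2)
x = torch.randn(1, 8, 224, 224)                    # stacked optical + SAR input
print(stem(x).shape)                               # torch.Size([1, 64, 224, 224])
```

    Zero-initializing the new channels means the adapted model starts out behaving exactly like the pretrained one, so fine-tuning on a small local dataset can learn the SAR contribution gradually.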

  • Anima Anandkumar
    220,729 followers

    Further progress in AI+climate modeling: "Applying the ACE2 Emulator to SST Green's Functions for the E3SMv3 Global Atmosphere Model".

    Building on the ACE2 model, which uses our spherical Fourier neural operator (SFNO) architecture, this work shows that ACE2 can replicate climate model responses to sea surface temperature perturbations with high fidelity at a fraction of the cost. This accelerates climate sensitivity research and helps us better understand radiative feedbacks in the Earth system.

    Background: The SFNO architecture was first used to train the FourCastNet weather model, whose latest version (v3) has state-of-the-art probabilistic calibration.

    AI+Science is not just about blindly applying the standard transformer/CNN "hammer". It is about carefully designing neural architectures that incorporate domain constraints like geometry and multiple scales, while remaining expressive and easy to train. SFNO accomplishes both: it incorporates multiple scales and it respects spherical geometry, which is critical for success in climate modeling.

    Unlike short-term weather, which requires only a few autoregressive steps for rollout, climate modeling requires long rollouts with thousands of time steps or more. Other AI-based models, including Pangu and GraphCast, fail at long-term climate modeling because they ignore the spherical geometry: distortions build up at the poles, since these models assume the domain is a rectangle, and the distortions lead to catastrophic failures.

    Structure matters in AI+Science!
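    To see why rollout length matters, here is a toy autoregressive loop of the kind climate emulation requires. `TinyEmulator` is a hypothetical placeholder for a learned one-step model such as an SFNO; the architecture, shapes, and step count are illustrative assumptions, not taken from ACE2.

```python
# Toy autoregressive rollout: the regime where per-step geometry errors compound.
import torch
import torch.nn as nn

class TinyEmulator(nn.Module):
    """Placeholder one-step atmosphere emulator: state_t -> state_{t+1}."""
    def __init__(self, channels: int = 4):
        super().__init__()
        self.net = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        # residual update per step; tanh keeps this untrained toy bounded
        return torch.tanh(state + self.net(state))

model = TinyEmulator()
state = torch.randn(1, 4, 91, 180)  # (batch, variables, lat, lon)

# Weather needs only tens of such steps; climate needs thousands, so any
# per-step distortion (e.g. pole artifacts from treating the sphere as a
# rectangle) is applied over and over and compounds.
with torch.no_grad():
    for step in range(1000):
        state = model(state)
```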

  • Debbie W.

    President of Google in Europe, the Middle East, and Africa. Helping people across EMEA achieve their ambitions, big and small, through high impact technology.

    46,018 followers

    We know the Earth is getting warmer, but not what it means specifically for different regions. To figure this out, scientists do climate modelling. 🔎 🌍

    Google Research has published groundbreaking advancements in climate prediction using the power of #AI! Typically, researchers use climate modelling to understand the regional impacts of climate change, but current approaches carry large uncertainty. Introducing NeuralGCM: a new atmospheric model that outperforms existing models by combining AI with physics-based modelling for improved accuracy and efficiency. Here's why it stands out:

    ✅ More Accurate Simulations: When predicting global temperatures and humidity for 2020, NeuralGCM had 15-50% less error than the state-of-the-art model X-SHiELD.

    ✅ Faster Results: NeuralGCM is 3,500 times quicker than X-SHiELD. Simulating a year of the Earth's atmosphere would take X-SHiELD 20 days, whereas NeuralGCM achieves this in just 8 minutes.

    ✅ Greater Accessibility: Google Research has made NeuralGCM openly available on GitHub for non-commercial use, allowing researchers to explore, test ideas, and improve the model's functionality.

    The research showcases AI's ability to help deliver more accurate, efficient, and accessible climate predictions, which is critical to navigating a changing global climate. Read more about the team's groundbreaking research in Nature Portfolio's latest article! → https://lnkd.in/e-Etb_x4

    #AIforClimateAction #Sustainability #AI
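    As a quick consistency check, the two speed claims in the post agree with each other, using only the figures quoted above:

```python
# Sanity-check the quoted speedup: 20 days of X-SHiELD compute vs NeuralGCM.
x_shield_minutes = 20 * 24 * 60    # 20 days expressed in minutes = 28,800
speedup = 3500                     # quoted NeuralGCM speedup factor
print(x_shield_minutes / speedup)  # ~8.2 minutes, matching "just 8 minutes"
```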

  • Charu Adesnik

    Executive Director, Cisco Foundation | Director, Social Impact and Innovation Investments, Cisco Systems Inc.

    4,534 followers

    Digital skills aren't just powering businesses. They are strengthening climate resilience.

    In the face of rising temperatures and extreme weather, smallholder farmers need better tools to adapt and plan. But many regions still lack the local data needed to inform decisions about planting, harvesting, and protecting crops.

    That is why I am inspired by the work of Cisco Foundation grantee One Acre Fund, which is using remote sensing technology to support precision agriculture. Their approach includes flood mapping, digital weather advisories, and crop yield monitoring, helping farmers respond to climate risks with better information.

    In Kenya, more than 10,000 farmers now receive weather updates. Thousands of flood data points have been mapped. And pilots are already underway in four additional countries.

    When we pair digital tools with community-driven solutions, we unlock powerful potential for impact.

    #DigitalSkills #ClimateResilience #TechForGood

  • Matt Brittin

    ex-President of Google EMEA. Gap Year Student, part time athlete. Tech for Good.

    57,013 followers

    AI has the potential to bring new waves of innovation and social and economic progress on a scale we've not seen before - including supercharging scientific progress.

    This week, Google published NeuralGCM: an openly available tool for fast, accurate climate modelling - critical in a changing global climate.

    We know that the Earth is getting warmer, but it's hard to predict what that means for each region. To figure this out, scientists use climate modelling. But current approaches have large uncertainty, including systematic errors - like forecasting extreme rain at only half the intensity that scientists actually observe.

    That's where NeuralGCM comes in. It combines physics-based modelling and AI to simulate the Earth's atmosphere - making it faster and more accurate than existing climate models. For scientists exploring how to build better weather and climate models, it should make a huge difference in helping them understand the effects of the climate crisis on our world. It could also be great for meteorologists making predictions about our daily weather!

    Interested in learning more? Read all about it here and watch the video below ⬇️ https://lnkd.in/e_bCuAhq

  • Kendra Vant

    Turning AI ambitions into profitable products | Fractional AI Product Leader | ex-Xero | MIT PhD

    6,670 followers

    It doesn't take much reflection to realise that accurately predicting the weather is an awesomely complex problem. The state-of-the-art systems today still rely heavily on numerical solvers, which make the pipelines complex, slow, and very computationally expensive to run. This means that while well-resourced countries can afford to produce high-resolution regional models, there are significant limitations in places like West Africa and parts of the Pacific.

    So it's really exciting to read about Aardvark Weather, the first end-to-end data-driven weather prediction system that offers a complete replacement of the numerical weather prediction pipeline.

    "The simplicity of this system makes it both easier to deploy and maintain for users already running NWP, and also opens the potential for wider access to running bespoke NWP in areas of the developing world where agencies often lack the resources and expertise to run conventional systems. There is also significant potential in the demonstrated ability to fine-tune bespoke models to maximise predictive skill for specific regions and variables. This capability is of interest to many end users in areas as diverse as agriculture, renewable energy, insurance and finance."

    Happily, Nature has provided open access to the preprint so you can read more detail there. https://lnkd.in/gJXSS8BS
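    For readers who want the shape of the idea, here is a hedged sketch of an end-to-end forecaster: a single learned model mapping (already gridded) observations straight to forecast fields, with no separate data-assimilation or numerical-solver stage. The layers, shapes, and variable counts are illustrative assumptions, not Aardvark Weather's actual architecture.

```python
# Minimal sketch of end-to-end data-driven NWP: observations in, forecast out.
import torch
import torch.nn as nn

class EndToEndForecaster(nn.Module):
    def __init__(self, obs_dim: int = 8, grid_ch: int = 16, out_vars: int = 4):
        super().__init__()
        self.encoder = nn.Linear(obs_dim, grid_ch)      # observations -> latent grid features
        self.processor = nn.Sequential(                 # evolve the latent state forward
            nn.Conv2d(grid_ch, grid_ch, 3, padding=1), nn.GELU(),
            nn.Conv2d(grid_ch, grid_ch, 3, padding=1), nn.GELU(),
        )
        self.decoder = nn.Conv2d(grid_ch, out_vars, 1)  # latent -> forecast variables

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        # obs: (batch, lat, lon, obs_dim), observations binned onto a coarse grid
        x = self.encoder(obs).permute(0, 3, 1, 2)
        return self.decoder(self.processor(x))

model = EndToEndForecaster()
forecast = model(torch.randn(2, 90, 180, 8))
print(forecast.shape)  # torch.Size([2, 4, 90, 180])
```

    Because the whole pipeline is one differentiable model, it can also be fine-tuned end to end for a specific region or variable, which is the capability the quoted passage highlights.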

  • Claudia Luiza Manfredi Gasparovic

    Doctor in Environmental Engineering | Constructal and Regenerative Design for planet-positive climate tech

    2,288 followers

    What people don't get about technofixes ...could fill books, yes. Unintended consequences, the mismatch between a mechanistic paradigm and the real world, etc. But there's something people don't get about timing. I realized it this week.

    Bill Baue and Kasper Benjamin Reimer Bjørkskov published excellent posts about the 5 preconditions of decoupling (worth checking out!). People commented to the effect of, 'New technologies may make decoupling possible. We don't know the future.' That exemplifies a common mistake.

    What crosses your mind when you read the word 'technology'? Smartphones, Silicon Valley, AI? I bet it wasn't an industrial plant. People are used to seeing technology as a field of rapid advancements. But that's not how clean technologies work at all. We can't fix the climate crisis with software. Instead, climate technologies are 'deep tech', or 'hard tech': tech that relies on scientific discovery or engineering innovation. That takes time.

    I know it, because I'm a clean tech researcher. I spent 5 years during my PhD developing a carbon capture reactor. (Yes, really...) So we're not talking about a room full of coders in San Francisco. We're talking about PhD students sweating away in labs for literal years to reach a breakthrough. Once they do, they will have an innovation around Technology Readiness Level 3 (in NASA's system): an experimental proof of concept. Getting it to market is TRL 9. It requires real-world validation, prototypes, a first-of-a-kind plant, setting up for commercialization...

    Climate technologies based on today's knowledge will take about seven years to achieve scale (McKinsey) [1]. If we're talking from the lab bench, it's more like a decade.

    So let's do the math together (made explicit in the sketch below). Limiting warming to 1.5C means cutting emissions by 48% (relative to 2019) by 2030, as per the last IPCC report. For a climate tech startup to be deploying in 2030, it would have to have started… four years ago. Technologies relevant for solving the climate crisis on time are already in development. Now, as you read this.

    Perhaps, centuries from now, we will be using technologies we don't dream of today. But that won't be solving anything; that will be damage control. It's also a huge risk to rely on them. Even if I wanted to bring my carbon capture device to market, the chances I could achieve that are slim. Deep tech is much more risky for investors. Getting funding is hard. And 90% of startups fail [2]. Now imagine this challenge in a world disturbed by climate change. And we haven't even discussed other planetary boundaries…

    I'm not saying technological development is pointless. We need clean technologies. And we need to redesign them to make climate solutions into planet solutions. If I didn't believe that, I wouldn't be working on it. But hope is not a strategy. A strategy is working with what we have, and designing policy and interventions accordingly. We don't have time for 'what might come'.
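    The post's timeline arithmetic, written out (a sketch using only the McKinsey and IPCC figures quoted above; the years are approximate):

```python
# Deep-tech timing: when would a startup need to start to deploy by 2030?
deployment_deadline = 2030  # IPCC: ~48% emissions cut vs 2019 by 2030
years_to_scale = 7          # McKinsey: years from proven concept to scale
latest_start = deployment_deadline - years_to_scale
print(latest_start)         # 2023 -- already behind us; from the lab bench
                            # (~a decade), the start date is earlier still
```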

  • Shimon Elkabetz

    CEO at Tomorrow.io | Weather Security is the new cyber security

    4,690 followers

    Most weather forecasts rely on complex data assimilation (DA) to fill in the gaps between sparse observations. But DA is inherently limited, relying on incomplete inputs, delayed updates, and assumptions. It's like watching a movie with only a handful of frames and guessing what happens in between.

    We're taking a fundamentally different approach: We don't guess. We observe.

    The attached visualization captures one day of observations from three of our operational satellites (we now have six!), scanning the globe every five minutes at 2.5 km resolution. While traditional systems wait for DA cycles to process incomplete data, our satellites enable real-time inference, where AI models generate forecasts from actual observations, not estimates.

    By the end of 2025, our global constellation will achieve a sub-hourly revisit rate, eliminating remaining blind spots, and will deliver truly global, real-time weather intelligence.

    The tech industry is learning what we already know: in the age of commoditized AI, proprietary data is the ultimate moat.

  • Charles Cozette

    CSO @ CarbonRisk Intelligence

    8,351 followers

    Carbon removal research explodes 17% annually but misses deployment reality.

    ML analysis uncovered 28,976 studies on carbon dioxide removal, quadrupling previous estimates. This research grows 17% annually, outpacing climate science overall at 12%. Yet massive knowledge gaps remain.

    The findings expose stark mismatches: 56% of research focuses on biochar, while 99.9% of deployment uses forests. China dominates research (30%) but concentrates on biochar. BECCS gets heavy investment (75%) but minimal study. Only 2% of CDR studies make it into IPCC assessments, suggesting critical knowledge isn't reaching policymakers.

    Research concentrates on specific regions and methods while ignoring others. Despite deployment needs, ocean alkalinization, enhanced weathering, and place-specific studies remain understudied.

    The point is to realign research priorities with deployment realities. Climate change intensifies while research and practice live in parallel universes.

    By Dr. Sarah Lück, Max Callaghan, Małgorzata Borchers, Annette Cowie, Sabine Fuss, Matthew GIDDEN, Jens Hartmann, Claudia Kammann, David Keller, Jan Minx, et al.

  • Jozef Pecho

    Climate/NWP Model & Data Analyst at Floodar (Meratch), GOSPACE LABS | Predicting floods, protecting lives

    1,616 followers

    🌍 Climate scientists often face a trade-off: Global Climate Models (GCMs) are essential for long-term climate projections — but they operate at coarse spatial resolution, making them too crude for regional or local decision-making. To get fine-scale data, researchers use Regional Climate Models (RCMs). These add crucial spatial detail, but come at a very high computational cost, often requiring supercomputers to run for months.

    ➡️ A new paper introduces EnScale — a machine learning framework that offers an efficient and accurate alternative to running full RCM simulations. Instead of solving the complex physics from scratch, EnScale "learns" the relationship between GCMs and RCMs by training on existing paired datasets. It then generates high-resolution, realistic, and diverse regional climate fields directly from GCM inputs.

    What makes EnScale stand out?
    ✅ It uses a generative ML model trained with a statistically principled loss (the energy score), enabling probabilistic outputs that reflect natural variability and uncertainty
    ✅ It is multivariate: it learns to generate temperature, precipitation, radiation, and wind jointly, preserving spatial and cross-variable coherence
    ✅ It is computationally lightweight: training and inference are up to 10–20× faster than state-of-the-art generative approaches
    ✅ It includes an extension (EnScale-t) for generating temporally consistent time series, a must for studying events like heatwaves or prolonged droughts

    This approach opens the door to faster, more flexible generation of regional climate scenarios, essential for risk assessment, infrastructure planning, and climate adaptation — especially where computational resources are limited.

    📄 Read the full paper: EnScale: Temporally-consistent multivariate generative downscaling via proper scoring rules ---> https://lnkd.in/dQr5rmWU (code: https://lnkd.in/dQk_Jv8g)

    👏 Congrats to the authors — a strong step forward for ML-based climate modeling!

    #climateAI #downscaling #generativeAI #machinelearning #climatescience #EnScale #RCM #GCM #ETHZurich #climatescenarios
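    For the curious, here is what an energy-score training loss can look like. This is the textbook proper-scoring-rule formula applied to an ensemble of generated fields, a sketch of the general idea rather than EnScale's actual implementation; the shapes and sample counts are illustrative.

```python
# Energy score as a differentiable loss for an ensemble of generated fields.
import torch

def energy_score(samples: torch.Tensor, obs: torch.Tensor) -> torch.Tensor:
    """samples: (m, d) ensemble of flattened fields; obs: (d,) observed field."""
    m = samples.shape[0]
    term1 = torch.norm(samples - obs, dim=1).mean()            # accuracy vs observation
    term2 = torch.cdist(samples, samples).sum() / (2 * m * m)  # rewards ensemble spread
    return term1 - term2  # minimised in expectation by the true distribution

ens = torch.randn(8, 1024, requires_grad=True)  # 8 generated high-res samples
target = torch.randn(1024)                      # the reference RCM field
loss = energy_score(ens, target)
loss.backward()  # differentiable, so it can train a generative downscaler
```

    Because the score rewards both closeness to the observation and spread among samples, minimising it pushes a generator toward calibrated probabilistic outputs rather than a single blurred average.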
