What if the next AI breakthrough gave machines a more grounded sense of understanding? The 2025 State of AI Report highlights AI2's MolmoAct, an "action reasoning" model that plans in 3D physical space. Given a high-level command, MolmoAct first generates intermediate visual and geometric waypoints (such as depth estimates or trajectory sketches) and only then translates them into continuous robot motions. It's an intriguing step toward systems that link abstract reasoning to the physical world. Combined with emerging work on long-term memory and planning, these advances hint at more contextually aware AI: not yet "human-like" in the way Yann LeCun envisions, but perhaps a little closer? Stay tuned for next year's discussion as this evolution continues! Link in the comments for the full conversation.
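
For readers who like to think in code, here is a minimal, purely illustrative sketch of that "perceive in 3D, sketch a trajectory, then act" idea. The class and method names (ActionReasoningModel, estimate_depth, plan_waypoints, decode_actions) are my own assumptions for illustration, not AI2's actual MolmoAct API.

```python
# Hypothetical sketch of an "action reasoning" pipeline in the spirit of MolmoAct:
# the model first produces intermediate spatial representations (a depth estimate
# and a waypoint trajectory) and only then decodes continuous motion commands.
# All names here are illustrative assumptions, not AI2's real interface.

from dataclasses import dataclass
from typing import List, Tuple

import numpy as np


@dataclass
class Waypoint:
    x: float      # image-plane or workspace x coordinate
    y: float      # image-plane or workspace y coordinate
    depth: float  # estimated distance from the camera, in meters


class ActionReasoningModel:
    """Toy stand-in for a model that reasons in 3D space before acting."""

    def estimate_depth(self, image: np.ndarray) -> np.ndarray:
        # Stage 1: perceive the scene in 3D (here, a dummy per-pixel depth map).
        return np.full(image.shape[:2], 1.0)

    def plan_waypoints(self, command: str, depth: np.ndarray) -> List[Waypoint]:
        # Stage 2: sketch a geometric trajectory toward the goal named in the command.
        h, w = depth.shape
        return [Waypoint(x=w * t, y=h * t, depth=float(depth.mean()))
                for t in (0.25, 0.5, 0.75)]

    def decode_actions(self, waypoints: List[Waypoint]) -> List[Tuple[float, ...]]:
        # Stage 3: translate the sketch into continuous low-level motions
        # (here, simple deltas between consecutive waypoints).
        return [(b.x - a.x, b.y - a.y, b.depth - a.depth)
                for a, b in zip(waypoints, waypoints[1:])]


if __name__ == "__main__":
    model = ActionReasoningModel()
    image = np.zeros((480, 640, 3))
    depth = model.estimate_depth(image)
    waypoints = model.plan_waypoints("pick up the red mug", depth)
    print(model.decode_actions(waypoints))
```

The point of the staged structure is that the spatial plan is an explicit, inspectable intermediate output rather than something hidden inside a single end-to-end policy.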

Aleksandra Przegalinska

Vice Rector for Innovations and AI and Associate Professor @ Kozminski University, Harvard CLJE Senior Research Associate & CampusAI Scientific Advisor, Member of the Research Council @ European University Institute


I remember our conversation quite well. Thank you, AI House Davos, for hosting us!
