FAR.AI’s Post


Great to partner with AI Security Institute on their inaugural Alignment Conference! Collaboration between researchers, funders, and policymakers is essential for advancing AI alignment.

AI Security Institute

Last week, we hosted our inaugural Alignment Conference in partnership with FAR.AI. The event brought together an interdisciplinary delegation of leading researchers, funders, and policymakers to discuss urgent open problems in AI alignment. Ensuring that future AI systems act as we intend will require a rapid, cross-disciplinary expansion of the AI alignment field; progress hinges on contributions from fields spanning cognitive science to learning theory. Our conference deepened this technical collaboration through five research tracks:

1️⃣ Theoretical Computer Science
2️⃣ Learning Theory & Learning Dynamics
3️⃣ Economic Theory
4️⃣ Cognitive Science & Scalable Oversight + Evaluations
5️⃣ Explainability

Learn more about AISI’s work to accelerate research in AI alignment: https://orlo.uk/KNNQK
Read our research agenda: https://orlo.uk/sdJFg

  • Photo of the event space
  • Two attendees engaged in an animated discussion
  • Attendees seated in small groups having animated conversations
  • Photo of attendees networking

Exciting to see so many disciplines coming together.
