
SELF-PACED PROJECT
Personal Theory of Impact
There are no easy answers for ensuring the future goes well. Focus on the specifics of how you can contribute, so you can start taking action.

Who this project is for
You understand the landscape. Now you need to figure out what you personally should do about it.
You've completed the AGI Strategy Course (or equivalent) and understand the threat pathways, actors, and challenges. But understanding the problem and knowing how to contribute are different things.
What this looks like
This project takes you from "I want to help" to "here's what I'd spend 6 months working on and why." You'll produce a brief that crystallises your thinking through action, not just reflection.
You have ideas, but you're not sure which ones matter. It's normal to be confused!
You've read a lot, maybe taken courses, and have a growing list of areas that seem important. But the more you learn, the less certain you feel about where you'd actually be useful.
What this looks like
This project gives you the structure to resolve confusion by doing — talking to people actually working on the problem, testing your fit, and narrowing down where you can be most impactful.
You're done deliberating. You want a process that forces clarity.
You have skills and experience which you believe are relevant to AI safety. But you haven't committed to the specific path where you'd be most impactful.
What this looks like
You'll talk to people in the field, test your ideas, and stress-test your assumptions. By the end, you'll have a specific, well-reasoned case for what you should be working on.
How this course will benefit you
Produce a focused brief
Create a 1-2 page brief on what you'd spend 6 months working on full-time. The process of writing it is where the clarity comes from.
Gain clarity by taking action
Don't just think — do. Talk to people working on the problem, build quick prototypes, write up your reasoning. The fastest way to figure out where you fit is to go out into the world and test it.
Build on your unique expertise
Lean into your domain background to figure out where your skills are most needed. Go deep on specific problems rather than staying abstract.
Project information
Commitment
Expect to spend at least 20 hours over 2 weeks. You will:
• Orient yourself to the existing literature about your chosen area
• Talk to people who are already working on the problem
• Quickly test what it means to contribute in this area
Format
This is currently a self-paced project. A guided version with coaching is planned — check back in April 2026.
Price
This course is freely available.
Frequently Asked Questions
Do I need to have completed the AGI Strategy Course?
Yes, or demonstrate equivalent understanding of the AI safety landscape. If there's a domain course relevant to your area (e.g., Technical AI Safety, AI Governance, Biosecurity), we recommend taking that first too, so you have a larger surface area of ideas to work with.
How does this differ from the project sprint?
The project sprint is for people who want to make a technical contribution to AI safety research or engineering. This project is for anyone who wants to figure out how they can best contribute — technical or otherwise.
What will I produce?
A 1-2 page brief on the specific things you would spend 6 months working on full-time and why, plus a log of:
• Who you've spoken to and what you learned
• Things you tried or built, and what this taught you
• What you've read and ideas you engaged with
How much time does it take?
At least 20 hours over 2 weeks. You need enough time to research, talk to people, and test ideas. Gaining clarity is an active process that requires you to take action.
Who are you?
We're a London-based startup. Since 2022, we've trained 7,000+ people, with hundreds now working on making AI go well.
Our courses are the main entry point into the AI safety field.
We're an intense 7-person team. We've raised $35M in total, including $25M in 2025.








