
COHORT-BASED COURSE
AGI Strategy
Optimistic and concerned about AI's trajectory? Want to do something about it? Start here. 25 hours to understand the strategic landscape, find your entry point, and get moving.

Our 8,000+ alumni work at organisations including Anthropic, DeepMind, and UK AISI.
Who this course is for
You've read some essays, watched the talks, and you don't think the people building AGI have a serious plan for making it go well. You want to change that.
The course is an in-depth introduction to what's going on with AI development, what the good and bad outcomes could be, and what could be done to steer AI towards better futures.
It's built for three groups: 1) domain experts in policy, security, operations, or engineering looking to redirect their skills; 2) people heading into technical safety or governance roles who want the strategic picture first; and 3) newcomers who are serious about making a significant impact.
Not sure you fit? Apply anyway. Recent cohorts have also included teachers, lawyers, engineers, and community organisers.
How this course will benefit you
A launchpad for your AI safety career
You'll leave this course with an opinion on which threats matter, early takes on how we could solve these problems, and concrete next steps you can take.
A clear way to think about the future of AI
You'll analyse the incentives facing AI companies. You'll develop "kill chains" to break down the threats. And you'll apply defence in depth to evaluate and prioritise interventions. You'll know enough to hold your own in rooms with experts.
A community of builders
BlueDot has 8,000+ alumni, many now working at Anthropic, DeepMind, UK AISI, and dozens of organisations working on a safe transition to advanced AI. You'll meet people in the field who can open doors for you and pressure-test your thinking.
What happens after
This course is where you get oriented. What comes next depends on you.
Technical AI Safety
Interpretability, evals, alignment research. For people ready to work on the technical problems.
Explore the course→
AI Governance
Policy, institutions, international coordination. For people shaping how these systems get governed.
Explore the course→
Biosecurity
Pandemic preparedness, early warning systems, policy. For people building defences against bio risks.
Explore the course→
Rapid Grants
Small, fast funding for concrete AI safety work. Five-minute application, decisions in days, money upfront by default.
Explore program→
Career Transition Grants
Funding to enable you to work full-time on impactful AI safety work. Propose your plan and we'll back you.
Explore program→
How the course works
Commitment
Complete 3 hours of reading and writing, and join ~8 peers in a 2-hour Zoom meeting to discuss the content.
"We should not underestimate the real threats coming from AI [as] we have a narrowing window of opportunity to guide this technology responsibly."

Frequently Asked Questions
We don't care about your CV. We care about what you'll do next. Recent cohorts have included people from policy, engineering, law, medicine, operations, and academia. What they shared was drive and a bias toward action.
It's for people new to working on AI safety, not new to thinking hard. If you've been reading and thinking, and are ready to act, this is where you start.
Same content, different pace. Intensive is 5 days at ~5h/day, for people who can clear a week and want to move fast. Part-time is 5 weeks at ~5h/week, for people fitting this around other commitments. Both end in the same place.
We run discussions across a wide range of timezones. You'll tell us your availability and we'll put you in a group that works for your current schedule.
That's what the course is for. You'll leave with a view on which problems matter most and which path fits your skills: technical research, governance, biosecurity, or building something new. Figuring that out is the work.
Yes. See bluedot.org/programs for current grants and how to apply.
Yes. Participants who complete the course receive a digital certificate they can share on LinkedIn or with employers.
Yes.
BlueDot is the leading talent accelerator for beneficial AI and societal resilience. We run courses, help people land jobs, organise events around the world, and back people starting new organisations. We've trained thousands of people since 2022. Our alumni now work at Anthropic, DeepMind, UK AISI, and have founded new organisations working on a safe transition to advanced AI.