COHORT-BASED COURSE

Frontier AI Governance

Governments are making decisions about AI. They don't have enough people who get it. You could be one of them.

Our 8,000+ alumni work at

  • OpenAI
  • Anthropic
  • Google DeepMind
  • AI Security Institute
  • United Nations
  • Amnesty International
  • Time
  • NATO
  • OECD
  • Stanford HAI
  • Apple
  • Harvard Kennedy School

Who this course is for

Technical people considering governance

You understand how these systems work - you've built, shipped, or founded. You're considering whether to point those skills at policy. Engineers, PMs, and founders have made this move and now sit at AISI, NIST, GovAI, and lab policy teams. You'll leave with the political judgment to match the technical, and a clear read on which roles have leverage.

Serious early-career people

You've engaged seriously with AI - through our AGI Strategy course, a university group, or your own reading - and you're weighing fellowships, grad school, law school, or roles you haven't fully mapped. Alumni from this track have gone to Horizon, GovAI, AISI, and lab policy teams; many decided their path during the course. The cohort becomes a network that outlasts it.

Professionals with institutional knowledge

You have a career - policy, national security, economics, law, diplomacy, intelligence, journalism, finance - and you can see AI is about to reshape it, and everything else too. Your goal isn't to switch fields. It's to become the person your beat, your agency, your country turns to on the risks and opportunities of AGI.

How BlueDot supports you beyond the course

FAIGC is one course in a wider BlueDot pipeline. During the course, we learn enough about participants to point them toward what makes sense next. Outside BlueDot, that often means introductions - to hiring managers at AI safety organisations or fellowship leads. Inside BlueDot, it means our other programs: 1-1 advising, Rapid Grants for concrete projects, Career Transition Grants for full-time pivots, Incubator Week for founders, or the Technical AI Safety Project Sprint for technical builders.

The AGI Strategy Course is the upstream prerequisite; jurisdiction- and domain-specific courses are in development. About 8,000 alumni are in our Slack - job openings and policy debates come through daily.

Where alumni go

  • Build something new

    Some come out of the course ready to launch: a project, organisation, or research bet. We back policy entrepreneurs.

  • Fellowships

    Horizon, GovAI, IAPS, TechCongress, and the strategy streams of MATS and Astra are the obvious next steps. We are often upstream; alumni from earlier cohorts have placed into all of these.

  • Government and policy roles

    AISI, NIST/CAISI, OSTP, congressional offices, the EU AI Office, OECD, UN, and frontier lab policy teams (Anthropic, OpenAI, Google DeepMind) all need technical fluency plus political judgment.

  • Research and analysis

    AI governance has a real think-and-do tank community. Many graduates work at RAND, CSET, IfP, IAPS, and CLTR.

What you'll actually do

Unit 1: Read models like a policymaker

Read a full system card alongside METR and Epoch evaluations; produce policy briefings tailored to a specific decision-maker.

View Unit 1

Unit 2: Map power

Map who has power over frontier AI - labs, governments, international bodies - and where the gaps are, including how other actors approach AI risk.

View Unit 2

Unit 3: Stress-test proposals

Survey compute governance, safety standards, liability, and international coordination. Argue for and against proposals you didn't choose.

View Unit 3

Unit 4: Govern under pressure

This is the unit most governance courses don't have. Examine competitive dynamics between labs and states, power concentration, and governance as capabilities approach and exceed the human level.

View Unit 4

Unit 5: Take a side

Pick a live debate - open-weight models, whether frontier development should be slowed, and more. Read across the spectrum, then defend a position in writing.

View Unit 5

Unit 6: Make your roadmap

Audit your skills, network, and comparative advantage. Produce a 6-month roadmap, with the expectation you'll act on it.

View Unit 6

How it works

Prerequisites

AGI Strategy course (or equivalent), high-level understanding of AI, and bias toward action.

Selection

~20-25% acceptance rate. We're looking for people who are analytical, motivated, and considering making this their life's work. If you're here to add a credential, this isn't for you.

Time

~30 hours total. 2-3 hours of readings and exercises per unit; 2 hours of live discussion with a cohort of 6-8, led by a Teaching Fellow working in AI governance.

Price

Free (pay-what-you-want).

Schedule

Intensive cohorts run for 6 days at ~5h/day. Part-time cohorts run for 6 weeks at ~5h/week. Apply by the deadline listed below.

Help build the field

We also hire Adjunct Experts and Facilitators (~5h/week) and Fellow-Researchers (20-30h/week) to teach.

Adjunct Experts and Facilitators

Work with a cohort for about 5 hours per week: lead discussions, bring current AI governance judgment, and help participants find their next step.

Apply

Fellow-Researchers

Teach while continuing governance research or field-building work, typically 20-30 hours per week.

Apply

FAQ

Does "AI governance" here mean regulating today's AI products?

No. We mean frontier AI and AGI - the policy, coordination, and institutional decisions that shape whether advanced AI goes well.

Do I need a technical background?

No - but you need to engage with the technology seriously enough to assess capability claims. AGI Strategy or equivalent is the bar.

Is the course US-focused?

In part. The US is where we think much of the current leverage is. We also cover the EU, UK, China, and international coordination to a lesser extent.

How does this compare to a fellowship?

Lower commitment (~30 hours vs months full-time). We're often upstream - the course helps you decide which fellowships to pursue, and we help you get there.