TECHNICAL AI SAFETY PROJECT

Make a technical contribution to AI safety in 30 hours

Work with an AI safety expert to make a contribution to AI safety research or engineering. All in 30 hours.


Our 6,000+ alumni work at

  • OpenAI
  • Anthropic
  • Google DeepMind
  • AI Security Institute
  • United Nations
  • Amnesty International
  • Time
  • NATO
  • OECD
  • Stanford HAI
  • Apple
  • Harvard Kennedy School

Who this course is for

For software engineers who want to apply their technical skills to build tools for, or help scale, AI safety research.

For early-career researchers who want to build their AI safety research portfolio.

For Technical AI Safety course graduates who want to build their portfolio.

Don't fit these perfectly? Apply anyway. Some of our most impactful participants have included teachers, policymakers, engineers, and community leaders. We bet on drive and ambition, not CVs.

Apply now

Curriculum Overview

How this course will benefit you

Publish a project in 30 hours

Go from extending a paper or improving research code to a published write-up.

Past participants have reproduced findings from METR, fixed TransformerLens issues, and replicated evals using Inspect. You'll publish a blog post and an X thread showcasing your work.

Find collaborators and opportunities

We'll feature the best projects on our website and socials. Past graduates have found co-founders, collaborators, roles, and funding through their projects. Your write-up becomes a public signal of your skills.

Get mentorship from an AI safety expert

You'll have regular check-ins with an AI safety expert who can debug your approach, validate extension ideas, and give rapid feedback. No more spinning your wheels alone. Get answers in hours, not days.

Course information

Commitment

You will spend 30 hours working on your project. Each week you will:
• Provide regular updates on your progress
• Join ~8 peers and an AI safety expert in a 1-hour check-in to discuss your progress and get feedback

Facilitator

All discussions will be facilitated by an AI safety expert.

Price

This course is freely available and operates on a "pay-what-you-want" model.


Frequently Asked Questions

You should be comfortable coding—either through professional experience or a few completed projects. Our mentorship focuses on scoping and refining your project ideas, not teaching you to code. If you're not comfortable coding, you could instead work on a written project, like a blog post reflecting on a past BlueDot course or exploring different ways you could contribute to AI safety.

We designed this course as a follow-up to the Technical AI Safety course. We'll prioritise graduates of that course, but we'll also consider applicants who can demonstrate equivalent knowledge of technical AI safety concepts.

We're a London-based startup. Since 2022, we've trained 7,000+ people, with hundreds now working on making AI go well.

Our courses are the main entry point into the AI safety field.

We're an intense 4-person team. We've raised $35M in total, including $25M in 2025.