2023 Impact Report Summary

By Dewi Erwan (Published on March 19, 2024)

Who we are

We’re a nonprofit startup founded in August 2022 in Cambridge, UK, now based in London. We’re trying to build a world where decision-makers are equipped to ensure powerful new technologies benefit all of humanity. We do this by designing courses focused on the most consequential emerging technologies.

Since 2022, over 2,000 people have graduated from our AI safety courses, including individuals at major AI companies, top universities, and relevant governments. In 2023, we also ran a pilot Biosecurity Fundamentals: Pandemics course with 100 medical, public health and synbio professionals from across the world.

Our theory of change

Find great people

We want our brands (BlueDot Impact, AI Safety Fundamentals, Biosecurity Fundamentals) to attract the most motivated, thoughtful and action-driven people in the world. We succeed in our mission by identifying people who could have a massive impact and supporting them in achieving it. Our courses are all part-time (<5 hours/week for 2-3 months), as we expect the world's most ambitious and skilled people to already be in time-intensive roles.

Motivate and equip

The courses create a space for students to evaluate arguments for why a given topic matters, equip them with frameworks to assess risks and interventions, and provide an overview of the current landscape. They’re designed by subject matter experts and are updated regularly. Students are grouped into small cohorts and work through the curriculum together, guided by a facilitator. The cohort discussions are a space to clarify uncertainties, debate opinions and develop relationships. They also serve as an accountability mechanism, encouraging students to engage with the resources and exercises in the curriculum.

Connect to opportunities

The courses’ impact depends on alumni taking action in the world to mitigate catastrophic risks and create a better future for humanity. At the end of each course, students are encouraged to consider what actions they could take to contribute to the field. We also work with recruiters at partner organisations to identify promising leads from our alumni for their open roles.

Before and after

I’d like to share a before-and-after snapshot of 2023, to demonstrate our progress.

At the start of 2023, our courses were hosted on a static webpage. Updating a course was a clunky process, and students couldn’t interact with the resources or exercises in a way we could track. We had no idea whether students were completing the pre-session resources, and we weren’t tracking attendance at the sessions (self-reporting was patchy). Every time a student wanted to move to a different cohort, it took us 5-10 minutes to do manually. The project sprints didn’t exist, and running a course involved significant operational overhead. We didn’t do any marketing or promotion, and relied entirely on word of mouth.

By the end of the year, we’d redesigned the AI Alignment course and designed new AI Governance and Pandemics courses, all using findings from the science of learning. We developed a cohesive design language for our separate brands and built new websites for each of them. Our new Course Hub hosts the courses, and students can tick off resources when they complete them. We built a custom video conferencing solution to track session attendance and provide each cohort with a unique link and meeting room. Our cohort switching tool fully automated the process, updating the records on our back-end and adding people to new Slack channels and calendar events. The project sprints went from totally unstructured and unhelpful during the Feb 2023 AI Alignment course to facilitating significant collaboration and high-quality project work during the Oct 2023 Pandemics course. We can now launch a new course every month instead of every 4-5 months, and each course is a better experience for students, all with the same number of team members.

In addition, we estimate >100 local groups worldwide use our curricula to run their own courses (without us actively encouraging this, though here’s some guidance we created recently), and we’re providing training materials on AI safety for UK policymakers.


Here’s a collection of testimonials from students, illustrating the value they received from the courses. You can see more in our public testimonials.

“I have found myself being able to hold my own in AI governance conversations with people who are experts by all measures. I think I have a much better mapping of potential risks now than I did going in.”

“I made a particularly useful connection during the project weeks as we decided to work on a project together. We weren't in the same cohort, but connected through the Slack channel. I plan to work with them in the future as we are both interested in AI Governance research.”

“I had one-on-ones with some of the other participants, and people externally reached out to me about me taking the course, and I feel that we are in the fight together and will in the future overlap again in significant ways. I met some of them in-person at conferences!”

Plan for 2024

In 2024, we’re doubling down on our online courses, ramping up our organisational cadence to launch a course every month. During the courses, our priority is supporting top students to find impactful opportunities after they graduate. We’ve started to hire people on short-term contracts as Teaching Fellows, responsible for facilitating 5-10 cohorts each, to help us scale the courses and improve discussion quality.
