Advertising to technical people: LinkedIn, Twitter, Reddit and others compared

By Adam Jones (Published on August 7, 2024)

At BlueDot Impact, we run educational courses that help people develop knowledge, skills and connections to pursue high-impact careers. To find great people for these courses, we’ve been experimenting with paid advertising.

This article explores what paid advertising platforms worked well for our AI Alignment (June 2024) course. The course focuses on helping people get into technical AI safety research to reduce risk from advanced AI.

While we accept people from a range of backgrounds, our most typical participant:

  • is a working professional, usually with several years of experience
  • has a background in computer science, machine learning or software engineering
  • lives in a developed English-speaking country, often the US or UK

Given this is our most identifiable audience, we focused most of our efforts here. We also advertised a little to ML academics, and some specific AI organisations.[1] We previously advertised to university students but chose not to this round.[2]

Platforms

We advertised on 5 different platforms:

Platform      | Cost per 1000 impressions | Cost per click | Cost per application | Cost per accepted application
LinkedIn      | £7.54                     | £1.67          | £75.83               | £227.59
Twitter       | £0.33                     | £0.20          | £136.82              | £345.30
Reddit        | £0.48                     | £0.17          | £1,574.55            | -[3]
Blind         | £5.63                     | £0.58          | £804.89              | -[3]
DEV Community | £7.77                     | £9.73          | -[3]                 | -[3]

Impression and click data are from the platforms themselves. Application data is based on people who submitted an application, weighted by an attribution factor.[4] Because the course is still in progress, we don't yet have data on completion rates or post-course outcomes.
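The figures in the table are simple ratios of spend to outcomes (with applications allowed to be fractional because of the attribution weighting). As a sketch with invented numbers, not our real spend data:

```python
# Illustrative only: the spend and counts below are made-up figures,
# not BlueDot's actual campaign data.
def cost_metrics(spend, impressions, clicks, applications, accepted):
    """Return the four cost ratios shown in the table (None if the count is zero)."""
    def per(count, scale=1):
        return round(spend * scale / count, 2) if count else None
    return {
        "cost_per_1000_impressions": per(impressions, 1000),
        "cost_per_click": per(clicks),
        "cost_per_application": per(applications),           # may be fractional (attribution-weighted)
        "cost_per_accepted_application": per(accepted),
    }

# Hypothetical example: £500 spend, 66,000 impressions, 300 clicks,
# 6.6 attributed applications, 2.2 attributed acceptances.
metrics = cost_metrics(spend=500.0, impressions=66000, clicks=300,
                       applications=6.6, accepted=2.2)
```

Returning `None` when a count is zero mirrors the "-" cells in the table, where a platform produced no matching applications.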

In general, this was a much better performance compared to previous rounds of the course. Paid marketing performed poorly for our March 2024 course, and I’m glad we’ve come a long way in just a few months (the data above is for our June 2024 course).

This is still a lot more expensive than our advertising for our Biosecurity Fundamentals course. We haven’t done detailed analysis here, but think it’s at least double the cost per click and application. We think this is because machine learning and software professionals are more heavily targeted by advertisers (compared to bioscience professionals), which pushes bid prices up significantly.

Platforms’ conversion tracking was poor. We mitigated much of this by asking applicants where they heard about us, using UTM parameters in links, and using our own standardised analytics tools. The conversion data inside platforms should not be trusted:

  • LinkedIn didn't track conversions for EEA users. This is not mentioned in their documentation, nor in the interfaces when viewing the data, which caused a lot of confusion when we tried to match their data with ours. We suspect this is due to data privacy compliance concerns, but that doesn't quite add up, given that they will track UK users (who fall under effectively the same legislation, as the UK GDPR is retained EU law after Brexit).
  • Twitter’s conversion tracking didn’t work at all. Additionally, their dashboards frequently show incorrect figures that don’t line up with other parts of the interface.
  • Reddit’s conversion tracking somewhat worked. However, the conversion figures were much lower than our own analytics. We think this might be because the follow-through tracking is blocked by many common ad blockers, which we think Reddit users are likely to have installed.

LinkedIn applicants were slightly higher quality than other sources. The acceptance rate for people from LinkedIn was higher compared to other sources, and the proportion of people we were particularly excited about (that we marked internally as ‘strong yes’) was higher. We think this might be because:

  • people on LinkedIn might be more serious about their careers, compared to other more ‘casual’ social media platforms; or
  • LinkedIn’s targeting features are much more powerful than other platforms, allowing us to filter for better talent.

Anecdotally, people working on AI safety told us they saw the Twitter and Reddit ads. This suggests we targeted the right audience: tech people who might be able to contribute to AI safety. (Although it could also be argued this is the wrong audience: we want people who are not already working on AI safety, so that we can help them get into it.)

On Reddit, we should improve the text copy before advertising there again. Reddit is much more focused on the text copy than the image, and we hadn't optimised ours well. A slightly positive sign was that impressions and clicks were cheap, though the weak copy may have attracted poorly-matched clicks. It may also be that Reddit always has poor click-to-apply conversion rates even with good copy, in which case we shouldn't advertise there again.

We won’t advertise on Blind or DEV again at the same prices. Both were quite expensive, and didn’t generate good applications. We think it was still a useful learning experience and experiment to do, especially given we allocated only a small amount of our budget to these platforms. We’ve published a separate blog about our learnings advertising on DEV.

Other areas for improvement

Although the niche communities we tried (Blind and DEV) didn't work as effectively as we had hoped, we think trying more platforms may still be worthwhile. Other people have found Facebook and Instagram useful for attracting people to AI safety.

And while this article has focused on paid ads, we also did other outreach: for example through connections we have in the field, university group mailing lists, and exploring partnerships with different organisations. However, we did this quite late. Starting earlier would give the message more time to get to people, and allow people more time to apply.

Footnotes

  1. All the platforms' targeting abilities are far from perfect, so these implementations are fuzzy. LinkedIn was the most specific, where we created 6 audiences:

    • Machine learning professionals
    • Software engineers, with some machine learning skills
    • Tech professionals (software engineers, data scientists, technical product managers etc.) interested in AI, and open to switching jobs
    • Machine learning academics
    • Specific AI organisations, for example people already in frontier AI companies
    • Retargeting, i.e. people who had already engaged with us on LinkedIn previously
  2. University students are much cheaper to market to, but tend to be much lower quality participants than working professionals.

    Undergraduates have much worse attendance, submit poorer quality projects, and are less likely to go on to work in the field. They're also more likely to have alternative opportunities to learn about our topic areas, for example through university groups and societies.

    Of course there are exceptions, and we are keen for great students to apply! But previously we’ve seen that the low quality of student applications is more significant than the reduced cost of marketing to them, so decided not to focus on them this round.

  3. No matching applications came from this source.

  4. Our website tracks the journey from advert to application using UTM parameters. The application form also asks people where they heard about our course.

    We use this information to attribute different marketing sources. For example, if someone has a UTM parameter suggesting they came from LinkedIn, and in the form they explain ‘I clicked a LinkedIn ad, but I heard of your course from a friend before’, we’ll consider LinkedIn to have helped us gain 0.7 applicants.

    This isn’t perfect: we have to make assumptions about how much to attribute to different statements. We do also try to test this empirically sometimes – see our blog about attributing people who generically say ‘LinkedIn’.

    Even if sources were clear-cut, there are also technical problems with tracking. This can result in errors that:

    • underestimate conversions: people click an ad, remember it, and apply later via a different link
    • overestimate conversions: people copy a link with UTM parameters to share with others

    (asking people in the application form helps mitigate this somewhat, but people can still sometimes be vague)
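The attribution logic in this footnote can be sketched as a small function. Only the 0.7 figure comes from the article; the other weights and the category names are illustrative assumptions, not our actual model:

```python
def attribute(utm_source, self_report):
    """Fractional credit for a paid source, combining the UTM parameter on the
    application link with the applicant's self-reported answer.

    The 0.7 weight for 'clicked the ad but heard of us elsewhere first' is
    the example from the article; the other weights are assumptions."""
    if utm_source is None:
        return 0.0  # no UTM evidence that this source was involved
    if self_report == "only_this_ad":
        return 1.0  # UTM and self-report agree: full credit
    if self_report == "heard_elsewhere_first":
        return 0.7  # the ad helped, but wasn't the first touch
    return 0.5      # vague or missing self-report: split the difference (assumed)

# Total attributed applicants for a source is the sum over its applicants:
total = sum(attribute(utm, report) for utm, report in [
    ("linkedin", "only_this_ad"),
    ("linkedin", "heard_elsewhere_first"),
    (None, "only_this_ad"),
])
```

Summing these fractional credits per platform is what produces the non-integer "applications" denominators behind the cost-per-application figures in the table above.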
