
TL;DR: Help the AGI Safety Fundamentals, Alternative Protein Fundamentals, and other programs by automating our manual work to support larger cohorts of course participants, more frequently.

Register interest here [5 mins, CV not required if you don’t have one].

Job description

We are looking for a low-code contractor who can help us scale and run our seminar programmes. We have worked with an engineer to build our systems so far, but they are moving on.

Here is a list of tools and apps we use to make our programmes run:

  • The backbone of the programme is run with no-code tools (an Airtable database, plus Make automations to interface with Slack and Google Calendar).
  • One Python script which clusters participants with similar backgrounds based on some numerical metrics (not currently integrated with Airtable, but we’d like it to be). A rough sketch of this kind of clustering follows this list.
  • One Vercel web app to collect participants’ and facilitators’ time availability.
  • One JavaScript algorithm which groups ~50 previously-clustered participants & facilitators into cohorts of ~6 participants + 1 facilitator.
  • We would like you to introduce more extensions using similar tools in the future.
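For illustration only, here is a minimal sketch of the kind of clustering the Python script performs, assuming each participant is scored on a few numerical background metrics and that a standard k-means approach (via scikit-learn) is acceptable. The metric names and data model below are our own illustrative assumptions, not the actual implementation:

```python
# Minimal illustrative sketch -- the real script may work quite differently.
from dataclasses import dataclass

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler


@dataclass
class Participant:
    name: str
    ml_experience: float     # hypothetical numerical metric
    maths_background: float  # hypothetical numerical metric


def cluster_participants(
    participants: list[Participant], n_clusters: int = 8
) -> dict[int, list[Participant]]:
    """Group participants with similar backgrounds using k-means."""
    features = np.array([[p.ml_experience, p.maths_background] for p in participants])
    # Standardise each metric so no single one dominates the distance measure.
    features = StandardScaler().fit_transform(features)
    labels = KMeans(n_clusters=n_clusters, n_init="auto", random_state=0).fit_predict(features)
    clusters: dict[int, list[Participant]] = {}
    for participant, label in zip(participants, labels):
        clusters.setdefault(int(label), []).append(participant)
    return clusters
```

Each resulting cluster could then be fed into the cohort-grouping step, and the output written back into Airtable (e.g. via its REST API or a Make scenario) so the whole pipeline lives in one place.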

We’re offering part-time, contract work

We’re looking for part-time, contract support for now. We would offer a retainer[1], and would pay an hourly rate for additional work.

Further logistical details:

  • We need more responsiveness when kicking off programmes, in case there are bugs (4 weeks, ~3 times a year).
  • There will be development work to do in between programmes.
  • We are open to (and slightly prefer) full-time contract work for the first couple of months to get us up and running.
  • You can let us know what would work for you in the form.

Salary

We are offering $50 / hour. Let us know in the form if this rate would prevent you from taking the role.

Were you hoping for a full-time opportunity?

We're also likely to want to build out an entirely new software platform for these programmes over the next 6 months, and we'll be looking for excellent software engineers who can help us build a team to achieve this.

We currently use Slack, Zoom, Vercel, Airtable forms, etc., and tie these together using low-code tools like Airtable and Zapier. We are interested in bringing this under one platform, to enhance user experience and retention. We expect to have other software needs too, such as managing our community and websites that host opportunities downstream of our programmes.

You can register interest for the full-time Head of Software role here, though please note that this role is not yet as fleshed-out as the part-time role described in this post.

Why should you work on this project?

Scale

These are the first scalable, EA-originating onboarding programmes for the fields of AI safety and alternative proteins. The growth trajectory for both programmes has been excitingly steep.

The AGI Safety Fundamentals programme went from 15 → 230 → 520 participants. We now have 650 registrations of interest for the next round, before promoting the programme through the usual channels (as many people as ultimately applied to round 2).

We also run the Alternative Protein Fundamentals programme, which had 400 participants in its 2nd round, and we plan to build out many more programmes using the same infrastructure in the future.

Altogether, we expect the infrastructure you build in this role to support 5,000+ people in the next 1-5 years. That follows from the conservative estimate that we run 2 programmes, 2.5 times per year, with 200 participants each time: roughly 1,000 participants per year, or 5,000 over 5 years. Additionally, we expect to add ~10 more programmes in the coming years.

Building quality infrastructure will increase our teams’ capacity to keep iterating and improving, so that we can continue to deliver a quality programme.

Impact on participants

For the full details, check our retrospective on the 2nd iteration of the AGI Safety Fundamentals programme. Pulling out some highlights:

  • The programme helps people figure out their career priorities. In general, programme graduates reported being more interested in working on AI safety after our programme; on the individual level, some people became less likely to work on safety after learning more about it during the programme. (Self-reported in the final feedback form.)
  • Participants overwhelmingly thought that the programme was at least 2 times more time-efficient for learning about and getting involved in AI safety than what they would have done otherwise. (Self-reported in the final feedback form.)

Alumni

These are some anecdotal examples of what alumni from the last iteration of the programme have gone on to do:

  • At least 1 Machine Learning Engineer at 3 top software/ML companies.
  • Member of technical staff at Redwood Research.
  • Independent researcher.
  • SERI MATS and MLAB participants, furthering their careers in AI safety research.
  • Access to our database of participants who gave their permission to share their information led to:
    • Research fellow at Rethink Priorities (1 offer made sourced from our database, 'major contribution' attributed to AGISF)
    • Open Philanthropy's longtermist movement building team (2 counterfactual offers made sourced from our database, resulting in 1 hire)
  • Likely more that we haven’t tracked.

Evidence of future impact

  • We’re getting more participants interested (growth has been exponential so far, though we expect it to level off at some point, and we won’t necessarily scale the programmes exponentially).
  • We want to produce more content and more programmes, including
    • More in-depth programmes on AI safety
    • Programmes in different fields
  • We want to package our infrastructure up so that local groups can use it to easily run a high-quality local version. (We expect local versions are better for making connections, where possible)

What would be your impact?

Our infrastructure has been a major factor preventing us from running the programmes more frequently and at scale. There are aspects of it which do not scale, and these use a lot of organiser time.

We want to be able to process 500+ people at a time without much additional organiser overhead. This will require an overhaul of our processes for, e.g., allowing people to join a different cohort if they can’t make their regular slot (5-10 hours of organiser time per week for 500 people), grouping cohorts (80 hours per week for 2-3 weeks at the start of the programme), etc.

We are hiring for this role to professionalise and scale our seminar programme infrastructure. With your help, we’ll be able to:

  1. Run the programme more frequently (at least 1 more time per year, per programme).
  2. Offer a higher-quality experience for participants. For example, the first thing we’d focus on is putting on more events for programme participants, e.g. networking opportunities between PhD participants, and targeted advice & networking sessions for software engineers, run by alignment research engineers.

Your role and responsibilities

The role would involve improving or rewriting our current systems so that we can run the seminar programmes as we’ve run them before. See the summary of the current system at the top of the post.

In the future, we’d like to add new features which you would develop. We’d work together to explain our requirements, and discuss potential solutions with you.

Scroll to the bottom of the post if you have more questions about the nature of the role.

Example tasks you might do

  • Add a feature that lets participants choose and join another cohort (from a list of cohorts specific to that participant) for a week when they’re unavailable at their usual time. (See the sketch after this list for one possible approach.)
  • Add a tool that groups participants based on some measures of their background knowledge.
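As a rough illustration of the first task, here is a sketch of the matching logic for offering a participant alternative cohorts in a week they miss their usual session. The data model (cohort session slots, capacities, and availability expressed as sets of time slots) is our own assumption about how this could be represented, not the existing Airtable schema:

```python
# Illustrative sketch only -- field names and the data model are hypothetical.
from dataclasses import dataclass


@dataclass
class Cohort:
    cohort_id: str
    session_slot: str      # e.g. "Tue 18:00 UTC"
    participant_count: int
    max_size: int = 7      # ~6 participants + 1 facilitator


def alternative_cohorts(
    free_slots: set[str], own_cohort_id: str, cohorts: list[Cohort]
) -> list[Cohort]:
    """Return cohorts a participant could visit in a week they miss their usual session.

    A cohort qualifies if it is not the participant's own, meets at a time the
    participant is free, and still has room for one extra attendee.
    """
    return [
        c for c in cohorts
        if c.cohort_id != own_cohort_id
        and c.session_slot in free_slots
        and c.participant_count < c.max_size
    ]
```

In practice the resulting list would be surfaced to the participant (e.g. via an Airtable interface or a small web form), and their one-off attendance recorded so facilitators know who to expect.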

You’ll be the tech lead

We will know what we need from the product perspective, but you’ll have ownership over how to implement it.

We can’t offer technical mentorship

Since you’re the tech lead, we won’t be able to provide technical mentorship. We are otherwise willing to, and expect to, work closely with you, with regular check-ins to help you stay on task.

If you can find your own mentorship or training, we are open to providing a budget for it.

What’s the application process?

Click the form to find out fastest!

We want to keep this light-touch at this stage. We’ll ask for one of: a LinkedIn profile, a CV, or a paragraph about yourself. We’ll also ask for a brief couple of sentences about why you’re interested, and your availability.

We will follow up with respondents to find out which arrangements would suit which people, and will offer a work trial to those with whom we’re mutually excited to proceed.

AMA

Please ask us any questions in the comments, by emailing jamie@thisdomain, or submit an anonymous question / feedback here.


Special thanks to Yonatan Cale for his help and advice in creating this post.

  1. ^

    What we'd expect to offer for a retainer contract: we agree on a fixed number of hours per month (we're guessing 20-40 hours/month), for a fixed number of months (about 6 months), which we will pay for regardless of how much you work. We pay you in addition to the retainer if you work more hours. We expect there will be plenty of work to do, but are happy to offer this as security for you if you were to join us as a contractor.

Comments (4)



Cool opportunity! I assume that EA knowledge is not a requirement at all?

EA knowledge is not required. Thanks for asking!

This seems like a really high impact opportunity, I hope that some talented people apply to this.

Sounds nice, except that um, well, AI safety is a myth.   
