
Could coding bootcamps for high-potential, under-resourced young students in low-income countries, explicitly targeting remote employment in software engineering, offer the most impactful return per dollar donated?

This cost-effectiveness analysis suggests that the DIRECT benefits over the lifetime of programme participants amount to $130 per $1 donated.

If we also add spillover effects (earnings from remote work constitute exports, so money is injected into the local economy), then the returns are in the range of $300 of economic value per $1 donated.
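
For intuition, here is a minimal sketch of the structure behind a benefit-per-dollar figure like this. Every input below is a placeholder I have made up purely for illustration (the actual figures and assumptions are in the linked report), and with these placeholders the ratio comes out much lower than $130:

```python
# Sketch of the benefit-cost structure: NPV of lifetime earnings uplift per $1 spent.
# ALL numbers are illustrative placeholders, not the report's figures.

def npv(annual_amounts, discount_rate):
    """Net present value of a stream of annual amounts (year 0 undiscounted)."""
    return sum(a / (1 + discount_rate) ** t for t, a in enumerate(annual_amounts))

cost_per_student = 2_000   # assumed all-in bootcamp cost per student, USD
monthly_uplift = 400       # assumed earnings gain vs. counterfactual, USD/month
placement_rate = 0.7       # assumed share of graduates who land remote jobs
working_years = 40         # assumed working life after the programme
discount_rate = 0.05       # assumed annual discount rate

stream = [12 * monthly_uplift * placement_rate] * working_years
ratio = npv(stream, discount_rate) / cost_per_student
print(f"direct benefit per $1 donated: ~${ratio:.0f}")  # ~$30 with these guesses
```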

With more money, many other issues become much easier to solve, ranging from malnutrition, lack of medical supplies and basic education to investments in agricultural productivity improvements or insurance.

Please let me know what you think and leave comments in the doc! Here is a link to the report. NB: it is merely a first draft and just a starting point for the discussion!

https://docs.google.com/file/d/1UZUVpOZ3RO02IytSugJ9iSpHqb3KaF0I/edit?usp=docslist_api&filetype=msword

This cost-benefit analysis seems to assume that the program can take credit for 100% of students' increased income after completing the program. This seems wrong to me? Also, it is based on "predicted" salaries, which may or may not be realized in practice.

Beyond these concerns, two main considerations that I think could turn out to be critical in evaluating a program like this are the students' counterfactuals and the leverage involved.

Regarding the counterfactual, it's possible that in the absence of this program, at least some of the students would have (a) found some other way to pay for the bootcamp or (b) gone on to do some other high-earning career besides software development. The only high-confidence way that I have seen to overcome this would be with a randomized trial, where some students get the subsidy and others don't (and then we could compare them retrospectively). The document you linked does mention a control group but it seems like the analysis is only prospective and it's not clear that the treatment group was randomized.

Regarding the leverage, it seems like this program has a mix of funders. It would be necessary to do some analysis to figure out how much credit for the results an additional funder could expect, which could be more or less than 100% depending on the circumstances (but usually less).
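
To make that concrete, here is a minimal sketch (all inputs hypothetical) of how these two adjustments would discount the headline figure:

```python
# How counterfactual success and shared funder credit discount a headline ratio.
# Both adjustment factors below are hypothetical guesses for illustration.

headline_ratio = 130     # claimed direct benefit per $1 donated
p_counterfactual = 0.3   # assumed chance a student would have succeeded anyway
funder_credit = 0.8      # assumed share of credit for the marginal funder

adjusted = headline_ratio * (1 - p_counterfactual) * funder_credit
print(f"adjusted benefit per $1 donated: ~${adjusted:.0f}")  # ~$73 here
```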

Also, it's not clear what the room for more funding looks like for this program. To start with, is there an adequate supply of both students and teachers such that this program could be scaled up?

Here are some documents about these topics that you might find of interest:

@chris, I’m not sure I’m following. This is a coding bootcamp for students in Africa to further their internal capacity for economic development.

@Ian Turner,

This is indeed just based on our best guesses. The idea is, as we roll out the program in practice, to update the hypothesised numbers with actuals (this is set up already), i.e. update our priors.

This is an RCT, hence the 100% attribution.

Yes, this is merely the cost-effectiveness analysis document, not a full research proposal! Right now it’s in the piloting stage either way, but next year our hope is to run this as a small-scale pilot RCT!:) If you are interested, I can share a proposal with the specifics of the RCT, including the two-stage saturated randomisation for spillover estimation!
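
For readers unfamiliar with the design, here is a minimal sketch of how a two-stage randomised saturation assignment can work; the school names, cohort sizes and saturation arms below are made up for illustration and are not our actual protocol:

```python
import random

random.seed(42)  # reproducible assignment

# Hypothetical clusters: 10 partner schools with 20 eligible students each.
clusters = {f"school_{i}": [f"student_{i}_{j}" for j in range(20)] for i in range(10)}
saturations = [0.0, 0.5, 1.0]  # assumed arms: share of each cluster treated

assignment = {}
for school, students in clusters.items():
    s = random.choice(saturations)                      # stage 1: cluster saturation
    treated = set(random.sample(students, round(s * len(students))))  # stage 2
    for student in students:
        assignment[student] = ("treatment" if student in treated else "control", s)

# Spillovers are then estimated by comparing untreated students in partially
# treated schools against students in pure-control (saturation 0.0) schools.
```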

The scale question is valid. The short answer is that the number of students interested and able to succeed, say out of a cohort of 400 graduates from our (well-performing) partner high schools, will be in the single digits, possibly tens.

Teacher cost is very low because we use existing online course material; we only require some tutors to assist. Also, ChatGPT is incredible.

I really appreciate that you took the time to read what we’ve worked on and I look forward to more questions:)

Again, happy to share more documents if you are interested!

You can find a link to the whitepaper (old version, to be updated soon) for the entire programme here:

https://linktr.ee/directeddevelopment

I guess you might find it confusing if you're not familiar with the arguments for AI safety? Not saying you aren't, but just in case, I decided to link you to an introduction.

I am somewhat familiar with it, yes:) Perhaps I have not fully appreciated the dangers of it yet (admittedly I should, given that I hang out in Oxford with the people who do this research). I will watch the video.

With a different discounting rate, different preferences and a particular skillset (my skillset), I see this focus as the most impactful thing for me right now.

I would like to hear from you whether/how, on the margin, you believe that the intervention we are working on would make any meaningful difference in making AI more or less of a threat to humanity!

If there are concerns, I have the ability to steer the training content, so if anything, these will be the least dangerous software engineers out there. Maybe they will replace software engineers who are less AI-safety-aware, even as they take jobs from Western engineers who do not receive any such training whatsoever?

With a different discounting rate, different preferences and a particular skillset (my skillset), I see this focus as the most impactful thing for me right now.


I would be really surprised to see this intervention come out ahead purely based on impact, even with a very different discounting rate and skillset, but I think it's okay to make decisions that take your personal preferences into account.

I would like to hear from you whether/how, on the margin, you believe that the intervention we are working on would make any meaningful difference in making AI more or less of a threat to humanity!

I don't expect it to have much in the way of capability externalities unless you specifically include the cutting edge of AI such as transformers. This seems easy to avoid as there's so much else in AI for students to explore.

If there are concerns, I have the ability to steer the training content, so if anything, these will be the least dangerous software engineers out there. Maybe they will replace software engineers who are less AI-safety-aware, even as they take jobs from Western engineers who do not receive any such training whatsoever?

One option would be to have an introductory talk on AI Safety and then give people the option to work through the AGI safety fundamentals course if they're interested in it (I wouldn't waste the time of people who aren't keen on it).

One framing that might be useful is that Western companies are making a very big decision for humanity by choosing to bring smarter-than-human intelligence into existence with the vast majority of humanity having no say in it.

As someone not too familiar with AI safety discourse… is there a $-value estimate benchmark one can use to compare “apples to apples”?

The number we have at the moment is at about $130 in DIRECT increased lifetime earnings (relative to counterfactual) per $1 donated.

If we also include spillover effects, then this increases to about $300 per $1 donated.

Again, there are many detailed assumptions made to arrive at these numbers, and you are very welcome to point out which ones you believe are unreasonable!

And the purpose of our pilots and proposed RCT (later) is of course to test this in practice.

I like the idea of having an introductory AI safety lecture. We’re actually planning out the guest lectures for the introductory bootcamp right now. Would you be interested in doing 1 hour on this topic? Or if not, could you refer me to someone?

Right now we do have one lecturer talking about how one can use AI as a tool in one’s work as a web developer. I think it would be great to also have something on the safety/dangers in conjunction.

Best,
Simon

I think there are other people in the community who could give a talk much better than I could, but I suppose I could give it a go if there weren’t any better options and it was during my waking hours.

Maybe ask in the AI alignment Slack? https://join.slack.com/t/ai-alignment/shared_invite/zt-1ug7qhc4h-OKXxhilyH9CQoK113L4IWg

Hi Simon,

If a study hasn't actually been conducted yet, I think it would be more accurate to call it a "proposed RCT" than an "RCT". The cost effectiveness analysis is not based on an RCT, as I understand those terms.

Have you looked in the academic literature to see if there has been some research on similar programs? Perhaps someone has already run a trial. In my opinion, high-quality evidence about a similar program is more reliable than hypotheticals about a particular program.

Also, may I ask, what caused you to focus on this particular program vs. something else?

Ian

Hi again Ian! 

Yes, you are correct; I could have clarified in my reply to you that there is no RCT yet.

A CEA doesn't have to be based on RCT data. As long as the assumptions being made are clear, it is up to the reader to accept or reject them and evaluate the merit of the CEA on that basis. I think you may be confusing a (theory-based) estimate with the actual evaluation, which is our fault, as I also see that this distinction is not entirely clear.

In other words, even if we did not have it in the roadmap to conduct an RCT to validate the assumptions used in the CEA, the CEA itself can use the terminology "control" to make it analytically clear that the "control" refers to a group which differs from the "treatment" group only in that it did not attend the programme.


Regarding existing studies: there are not really any that I have found. I am in talks with some of my professors here in Oxford who were looking into doing something somewhat similar, but still quite different (simple IT gig work, of the Amazon Mechanical Turk type)...

More generally, the devil is in the details for programmes like ours. Just because someone, somewhere, has evaluated an X-week-long IT training programme in country Y doesn't mean that this generalises in any meaningful way to what we are doing. Analogously, say you have a social media website in the year 2005 where users can create profiles and connect with each other. Will this company be valued at $1B or $10B, or be bankrupt in 10 years? It all boils down to the people, the execution and many small details.

We target students right after high school in Kenya, have (high-performing) partner high schools, make use of 2-week milestone assessments and conditional cash transfers, have direct contact with Western tech companies in an end-to-end pipeline from zero to internship to job, and focus on state-of-the-art MERN-stack JavaScript development. All of these matter for the end impact.

For example, Educate! runs vocational training programmes and says that its programmes have measurably impacted 250,000 youth (Educate! | Preparing youth in Africa with the skills to succeed in today’s economy, experienceeducate.org):
"Educate! tackles youth unemployment by partnering with youth, schools, and governments to design and deliver education solutions that equip young people in Africa with the skills to attain further education, overcome gender inequities, start businesses, get jobs, and drive development in their communities." 

Now, their focus is very different from ours. They target a population that is unemployed. We target a population that is not unemployed (they are between high school and university).

Here is an example of an intervention that was implemented by the same NGO, in the same country, yet had zero measured impact, despite strong prior RCT evidence suggesting otherwise: The Comparative Impact of Cash Transfers and a Psychotherapy Program on Psychological and Economic Well-being (nber.org). Here is a podcast episode where I interviewed the author: https://open.spotify.com/episode/6PNL8nJ5acgAWhIuVThym0?si=d780a64f12644f9a

Regarding why I focus on this: I could write a lot about the personal journey and the entire long-form argument for it, but in short, I think it has the potential to be the most cost-effective (development) intervention there is. Why? Because the greatest alleviation of large-scale suffering has historically always been grounded in a strong economy, and specifically in a flourishing export industry. Remote work is an export industry, and we are now in a position to help upskill this industry through knowledge transfer at a very low cost (since all the info/content is out there already, we just need to structure it and match the talent with the opportunity).

Happy to elaborate if you are at EAG London. 

Hi Simon,

I agree that an RCT is not needed to create a cost-effectiveness estimate. To me, when you wrote "This is an RCT", that implied that it was indeed based on an RCT. That is why I suggested language such as "proposed RCT" to make it clear that the RCT is hypothetical at this point.

I agree that when it comes to programs such as educational interventions, context can be very important. But isn't it worth asking whether there is good evidence that these sorts of programs are generally effective across contexts? Or to put it differently, without evidence about a specific context, what is the base rate of effectiveness?

A quick Google reveals Subsidizing Vocational Training for Disadvantaged Youth in Colombia: Evidence from a Randomized Trial by Attanasio et al., and The Labor Market Impact of Youth Training in the Dominican Republic: Evidence from a Randomized Evaluation by Card et al. The former study found a modest effect comparable to GiveDirectly and the latter found a small to null effect. Have you looked at these studies?

I agree that growth of export industries has frequently been a strong source of economic development; but that doesn't mean that we should assume that it would be easy to speed up through philanthropic activity. Effective altruism means being skeptical and humble about what is or is not realistic to achieve.

Thank you for pointing it out. I was sloppy in my wording!:)

As I mentioned earlier, I was not able to find any relevant studies with transferable insights, unfortunately. There is ample literature on primary school or secondary school interventions, or general vocational training programmes. But there are none that:

  1. Target digital remote employment in low-income countries (simply because that wasn’t feasible from an infrastructure point of view until recently), and
  2. Do NOT target those who are already unemployed.

To be more specific: the studies you cite are simply not relevant. It’s as if I suggested medical intervention X to combat a disease, and you found studies A, B and C from 10 years back that used intervention Y to also combat diseases. On a superficial level they may seem similar, in the same way that studying at Harvard University is similar to studying at Södertörn University. But they are hardly useful as proxies for the cost-effectiveness of our particular intervention.

First, they are NOT digital skills programs. And as an aside, this type of intervention wasn’t even possible just 4-5 years ago, because the necessary internet infrastructure just didn’t exist in Kenya or Ethiopia back then.

Second, they target currently unemployed or less-educated youth. We do not target this group. Our intervention targets the upper segment of highly talented individuals, most of whom won’t have the resources or access to the top-quality training needed to succeed.

Third, the amount of resources those programs invest per individual is low relative to ours. Our intervention is definitely not as scalable as some of these programs (which are intended to scale); we instead focus on targeting efficiency. That means we invest a lot more per individual and much more in the selection of whom we support. This is inspired by existing research on the heterogeneity of the effectiveness of microcredit by Banerjee et al. (2018), the "gung-ho entrepreneurs" paper.

Instead, I propose that in our case it is much more informative to look at current market data on (a) how much remotely employed software engineers earn, and (b) what they need to learn in order to get those jobs.

To give you some numbers, a comprehensive Stack Overflow survey from 2018, with 100,000 respondents, reveals that among the 55% who enrolled in a coding bootcamp without already having jobs as developers, 71.5% found jobs as professional developers within 6 months (n = 6,652).
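
As a quick sanity check on the precision of that 71.5% figure (though not on the self-selection in who answers such surveys), a normal-approximation confidence interval is straightforward to compute:

```python
import math

p, n = 0.715, 6652
se = math.sqrt(p * (1 - p) / n)          # standard error of the proportion
lo, hi = p - 1.96 * se, p + 1.96 * se    # 95% confidence interval
print(f"95% CI: {lo:.3f} to {hi:.3f}")   # roughly 0.704 to 0.726
```

So the sampling noise is small; the bigger question is how representative the respondents are.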

With that said, we make several assumptions in the CEA, and I’d love to get informed critiques of those assumptions so we can adjust them to be more realistic. We’ve tried our best to find good data, but that itself takes time and a lot of effort. We are in the process of rolling out surveys of former students and of working professionals, to figure out both the counterfactual earnings of comparable students from earlier years at the same schools and the earnings of people who work as remote engineers. Our current estimates are in the range of $300-$600/month and are based on informal surveying in both countries. Anecdotally, the ones who do get jobs have often learnt the frameworks and languages themselves using pirated versions of Udemy courses or similar (even if they have CS degrees).

However, given that US software engineering entry salaries are at $12,500/month, there’s clearly ample room for potential.
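
To put the gap in rough numbers (the 10% capture rate below is my own hypothetical, not an estimate we have validated):

```python
local_counterfactual = (300 + 600) / 2  # midpoint of our informal survey range, USD/month
us_entry_salary = 12_500                # quoted US entry-level figure, USD/month
capture_rate = 0.10                     # hypothetical share of the US rate a graduate earns

uplift = capture_rate * us_entry_salary - local_counterfactual
print(f"monthly uplift at a 10% capture rate: ~${uplift:.0f}")  # ~$800/month
```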

It’s not a matter of whether some of the brightest talents in Africa can compete for these six-figure jobs. It’s a matter of asking “what does it take” for them to get there.

Thank you for keeping the conversation going! It’s very helpful, as I’m forced to flesh out my arguments. This will help me prepare a long-form post at some point later on!:)

Best,
Simon

I guess at this late stage of AI, where we're far, far behind on safety, it would seem pretty strange to create large-scale coding bootcamps to benefit individuals economically when we don't even have this for AI safety people or EAs looking to work on EA projects or earn to give.

Internship / board of trustees!

My name is Simon Sällström. After graduating with a master’s in economics from Oxford in July 2022, I decided against going the traditional 9-5 route in the City of London, moving money around to make more money for people who already have plenty of money… Instead, I launched a charity.

DirectEd Development Foundation is a charitable organisation whose mission is to propel economic growth by developing and delivering evidence-based, highly scalable and cost-effective bootcamps to under-resourced, high-potential students in Africa. We prepare them for remote employment by equipping them with the most sought-after digital and soft skills on the market, thereby helping them realise their potential as leaders of Africa’s digital transformation.

I'm looking for passionate people in the EA community to join me and my team!

We are mainly looking to fill two unpaid positions right now: interns and trustees. The latter is quite an important role.

I am not entirely sure how best to go about this, which is why I am writing this short comment here. Any advice?

Here's what I have done so far in terms of information about the internship position and application form: https://directed.notion.site/Job-board-3a6585f2175a456bb4f3d1149cfddba2 

Here is what we have for the trustees (work in progress):  https://directed.notion.site/Trustee-Role-and-Responsibilities-115d46fd04d94bd1a7061ca5d00f8f71 

Happy to take any and all advice:)

https://directed.dev/ 
