Hey everyone! I'm Ben, and I will be doing an AMA for Effective Giving Spotlight week. Some of my relevant background:

  1. In 2014 I cofounded a company for earning to give (EtG) reasons (largely inspired by 80k), which was later successfully acquired.
  2. Since late 2018 I have been doing direct work, currently as Interim Managing Director of CEA.
    1. (With a brief side project of founding a TikTok-related company which was similarly acquired, albeit for way less money.)
  3. I've had some other EtG-ish work experience (eight years as a software developer/middle manager, a couple of months at Alameda Research) as well.
  4. Additionally, I’ve talked to some people deciding between EtG and direct work because of my standing offer to talk to such folks, so I might have cached thoughts on some questions.

You might want to ask me about: 

  1. Entrepreneurship
  2. Trade-offs between earning to give and “direct work” 
  3. Cosmetics and skincare for those who (want to) look masculine
  4. TikTok
  5. Functional programming (particularly Haskell)
  6. Or one of my less useful projects
  7. Anything else (I might skip some questions)

I will plan to answer questions Thursday, November 9th. Post them as comments on this thread. 

See also Jeff’s AMA, which is on a similar topic.





How do you time your donations? E.g. do you donate as soon as possible or do you do "patient philanthropy" and invest the money to donate later?

How do (did?) you decide how to allocate your donations across cause areas?

Given the current funding situation and talks about funding diversification, do you think that EtG is ~2x more valuable compared to when you wrote this, or is it closer to ~10x more valuable?

Thanks for the questions!

  1. I put donations into a DAF immediately after the acquisition, for tax reasons, but I think I have distributed only ~20% so far. I also calculated that it was more tax-effective to cut my salary than to donate to CEA and have that money go to my salary, so I did that (and didn't put some money into the DAF as a result).
  2. I think my cause-area-level strategy is basically "donate to the things where I disagree with OpenPhil" which more or less cashes out to "animals + downside-focused". (Additionally there are some things that I think OP finds hard to fund for non-impact reasons, e.g. I gave a small donation to a political campaign.)
  3. I want to be annoying and say it really depends? E.g. if you are a lobbyist, my guess is that EtG looks worse this year than it did last year (because the opportunities to have direct impact through lobbying seem to have increased more than the funding has decreased). Some low confidence takes that admit a bunch of exceptions:
    1. Farmed animal welfare: EtG maybe 2-5x more valuable, although a lot of FAW organizations that I'm excited about are for-profit, which makes this a little weird.
    2. AI Safety: EtG 0.5-2x more valuable? My impression is that the influx of new AI safety donors has come close to offsetting FTX leaving, and opportunities for direct work seem stronger than a year ago. I'm not well calibrated on whether there's more stuff to fund, and would guess that your answer to that question depends a lot on how you evaluate the well-funded labs.
    3. Meta EA: EtG 2-10x more valuable? My impression is that the aforementioned AI safety donors think EA is weird/too indirect, so funding for meta stuff has been substantially hit. Hopefully as they gain more familiarity with the community they will change their mind, but right now it feels like things are not that well-funded.
    4. GH&WB: 1.5-3x more valuable? It seems like OP might be scaling down their funding, though I don't have a lot of insight here.


3. is such a great answer, thank you!

Some people doing entrepreneurship to give will invest significant time, energy, and resources into their En2G project, yet fail to generate any meaningful amount of donations. (If this weren't true, it would signal that the En2G crowd wasn't taking on enough risk.)

How can the community make this group of people feel validated and esteemed? They did something with high expected value, rather than taking a "safer" but probably lower-EV salary-to-give position that would also have assured them a cushy financial life. I can imagine that people in this position who invested several years in an En2G project might feel their work and sacrifices weren't appreciated because they didn't result in actual impact.

My experience is that there are a bunch of startup metrics which correlate better (though not perfectly) with the founders' skill/effort than exit value does:

  1. Peak revenue run rate (and related metrics like EBITDA)
  2. Prestigiousness of investors
  3. Prestigiousness of incubator
  4. Amount of money raised
  5. Number of employees

And most of these metrics are publicly available.

I actually don't know a ton of people in the category of "founded something that was ex-ante plausible, put multiple years into it, but it didn't work out", so I'm mostly speculating. But my somewhat limited experience is that people will usually put on their resume stuff like "founded and grew my startup to $10M/year ARR with 30 employees, backed by Sequoia", and this is impressive despite them not exiting successfully.[1]

  1. ^

    Though obviously ~100% of these founders would happily exchange that line on their resume for a fat check from having sold their company.

What do you think are the best existing writings on startup earning-to-give? Other than this Huffington Post interview with you, of course.

I am obviously biased, but I do think that this post of mine is the most accurate thing I know of for predicting founder income.

80k's article is one of the few other things I know about; I think there might just not be that much written about this (although in fairness "startup earning to give advice" probably looks a lot like "startup advice", so maybe not that much actually needs to be written).

Have your thoughts on earning to give vs direct work changed (including the specific numbers) since the linked post from summer 2022?

Thanks for the question! I tried to give my quick thoughts on this here.

One related thing I'm wondering about is how important widening the set of donors is. I.e. new donors help directly because they're donating and funding useful work, and also extra donors mean that (probably?) there's a bit more stability. I'm not sure how strongly to value the second factor. 

Some folks explicitly prefer a world in which a lower proportion of money spent on EA-ish projects was from Open Philanthropy even if overall donations were the same. That seems like a sensible preference. 

I think I would value non-OP donations to a largely-OP-backed organization at 1-20% more than OP donations to them, roughly. Or heuristically: I think funding diversity is some icing on the cake if you are considering EtG, but the primary motivation I expect should be funding things which would otherwise not have been funded at all.

That seems like a very weak claim. I think for these views to have much action-guiding consequence, you have to prefer $1 of non-OP money over significantly more than $1 of OP money.

Generally speaking, should tech people start startups and EtG?

For those that have done so: would you advise going for broke and trying to make those startups as big as possible? Or optimise for something more sustainable that can be exited to generate cash, and then start something else higher-risk?

The numbers in this article seem higher to me than the value I would place on most tech people doing direct work, so a naïve answer is "yes, if you can get into YCombinator you should probably do that." However, YC is extremely competitive and "being able to make a lot of money" is often correlated with "being valuable in direct work" so it's hard to make a general statement. "Spend six months starting a company and then shut down if you don't get into a top incubator" doesn't seem like crazy advice to me.

Regarding risk: returns to entrepreneurship are very fat tailed, and there are theoretical as well as empirical arguments about why we should expect this (e.g. entrepreneurs take on nondiversifiable risk, and you would expect them to need substantial additional compensation to offset this). That being said: I think the data set people use can skew these results, e.g. YCombinator intentionally invests in high risk companies, so it's unsurprising that YC founders have fat tailed results.
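To make the fat-tail point concrete, here is a minimal simulation sketch (not from the comment above; the tail index, sample size, and distribution choice are all hypothetical) showing how a Pareto-shaped outcome distribution concentrates much of the total value in a small fraction of founders:

```python
# Illustrative sketch only: model founder outcomes as Pareto-distributed.
# The tail index and sample size below are made-up placeholders.
import random

random.seed(0)
alpha = 1.5           # hypothetical tail index; lower = fatter tail
n = 100_000           # number of simulated founders
outcomes = [random.paretovariate(alpha) for _ in range(n)]

# What share of the total value comes from the top 1% of outcomes?
outcomes.sort(reverse=True)
top_1pct = outcomes[: n // 100]
share = sum(top_1pct) / sum(outcomes)
print(f"Share of total value from the top 1% of outcomes: {share:.0%}")
```

With a fat tail like this, a large minority of all value accrues to the top 1% of outcomes, which is the usual argument for why "go for broke" strategies can be reasonable in expectation even though the median founder does poorly.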

I don't understand the strategy of creating a lower risk business in order to fund a higher risk business though: if you are aligned with your investors (and if your goal is "make money" then you probably are aligned), then it seems strictly better to use their money instead of your own?

>I don't understand the strategy of creating a lower risk business in order to fund a higher risk business though: if you are aligned with your investors (and if your goal is "make money" then you probably are aligned), then it seems strictly better to use their money instead of your own?

Second-time founders (at least in my experience, and in the UK/Europe) have a much easier time getting funding for their businesses. Certainly as a first-time founder our experience of getting funding has been like pulling teeth, despite decent traction and ARR. With greater access to capital I'd expect a higher chance of building a very large company. So in essence, the goal of one's first venture might be to just get to some form of exit to provide the cachet for then starting the next thing, rather than going for as big an exit as possible.

Ah yeah, certainly proving yourself in some way will make it easier for you to get funding.

Dumb question: have you considered immigrating to the US? The US has substantially more VC funding available than any other country.

I understand if you can't answer this or can't provide a specific answer, but could you share:

1) approximately how much your companies were acquired for, and

2) how you've donated/managed the money since then (e.g. did you donate most of it ASAP and no longer donate much, or did you invest and now donate a steady amount each year, etc.)?


Unfortunately I can't disclose the acquisition price. I put donations into a DAF immediately after the acquisition, for tax reasons, but I think I have distributed only ~20% so far. I also calculated that it was more tax-effective to cut my salary than to donate to CEA and have that money go to my salary, so I did that (and didn't put some money into the DAF as a result).
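For intuition, here is a toy version of that salary-vs-donation arithmetic. All the rates are made-up placeholders (the real treatment depends on payroll taxes, deduction limits, employer-side taxes, state taxes, and so on), but the basic mechanism is that a donation is made from income that already incurred payroll tax, while forgone salary is never taxed at all:

```python
# Toy sketch with made-up rates; assumes the donation is fully deductible,
# so income tax cancels out and only payroll tax differs between routes.
amount = 10_000        # dollars we want the org to end up with
payroll_tax = 0.0765   # hypothetical employee-side payroll tax rate

# Route 1: take the salary, donate it back, and deduct the gift.
# The deduction offsets income tax, but payroll tax is still paid.
leak_donate = amount * payroll_tax

# Route 2: simply take a lower salary. The forgone amount is never
# taxed (and the org also avoids employer-side payroll tax, ignored here).
leak_cut = 0.0

print(f"Extra tax paid by donating rather than cutting salary: "
      f"${leak_donate - leak_cut:,.2f}")
```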

When founding something new, how do you balance money and impact?

I’m (attempting to be) an EA entrepreneur, but I find it difficult to balance finding product-market-fit+scaling with actual impact. At different times, I’ve found myself focusing too much on building something “big” that doesn’t have any actual object-level impact, or being too perfectionist about optimising impact versus simply doubling down on something that works and is influential/makes lots of money.

I haven’t figured it out and don’t expect this to be an easy answer, so just curious what your thoughts are on the resource/impact tradeoffs in decision making.

My sense is that the companies which have managed to be both impactful and profitable create products which require substantial capital investment, are profitable, and still have massive positive externalities. Importantly, "we will create X, but after having created X we will go on to create Y, and Y will be good for the world" seems to have a pretty dubious track record.

(I guess this is an elaborate way of saying "impact + profit" companies work if and only if there aren't market failures.)

I have a general bias in favor of focus and simplicity, which makes me think that people should usually focus either on EtG or impact but not both, but it's really hard to give universal advice here. I think if I were you I would just BOTEC out the value of the two products you are considering, and then do the one which has the higher number.
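As a concrete (entirely made-up) example of the kind of BOTEC I mean, here is a sketch comparing two hypothetical product directions on expected donations plus direct impact. Every number and the scoring function itself are illustrative assumptions, not a recommendation:

```python
# Hypothetical back-of-the-envelope comparison of two product directions.
# All numbers are invented for illustration.
def botec_value(p_success, annual_profit, donate_frac, direct_impact_value):
    """Expected annual value, in dollar-equivalents:
    chance of success * (donations generated + direct impact)."""
    return p_success * (annual_profit * donate_frac + direct_impact_value)

# Option A: bigger market, no object-level impact.
a = botec_value(p_success=0.10, annual_profit=5_000_000,
                donate_frac=0.5, direct_impact_value=0)

# Option B: smaller market, meaningful direct impact.
b = botec_value(p_success=0.25, annual_profit=500_000,
                donate_frac=0.5, direct_impact_value=1_000_000)

print(f"Option A: ${a:,.0f}/yr   Option B: ${b:,.0f}/yr")
```

The point is not the specific numbers but the discipline: once both options are reduced to a single dollar-equivalent figure, you just pick the larger one rather than agonizing over the profit/impact framing.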

What's your favorite less useful project? Any other projects of the mostly useless variety you're working on? :P

I think HairToHelp.us is one of my favorites because it's actually been a slightly useful project – multiple people have changed their haircut as a result!

I've recently been interested in augmented reality effects, and hope to have more of them soon :)

What should an undergrad looking to do earn to give or direct work learn in terms of skills that would be most useful in your opinion?

This is a very broad question! I guess the major debate right now is whether one should focus on a "path" (a la 80k) versus a set of "aptitudes" (a la Holden). I slightly think that aptitudes is a more useful framework if you are very early in your career, but it feels pretty close to me.

Since you offered: 0.015% tretinoin + niacinamide 4% + urea 5% moisturiser at night, SPF50 in the morning. What else should I be doing?

Seems pretty good! My mental model is something like: 80% of the value of skincare comes from sunscreen + moisturizer, 15% from vitamin A, and then everything else is either speculative or has small improvements. Some things you might want to consider to further optimize the remaining 5%, if you want:

  1. A relatively gentle exfoliant, 1-2x/week (used instead of the products you would normally apply). Alpha hydroxy acids are pretty popular and usually don't cause a terribly strong reaction (though you probably will be red for a day).
  2. Vitamin C (in the morning), particularly if you have hyperpigmentation.
  3. Microneedling has results that are roughly comparable to tretinoin, and additionally aids in product delivery (for the obvious reason that the needles poke through your epidermis). So you could consider this, although it's substantially more invasive than the other things on this list (while still being considered "minimally invasive").
  4. The three products you mentioned have slightly different pHs; possibly you are already doing this, but consider spacing out their application (e.g. tretinoin at night, niacinamide in the morning).

Do you think there could realistically be effective for-profit EA orgs, and why do you think we haven't seen any yet?

I feel like there are a bunch! (Send)Wave is maybe the most central example, but Just Foods, UPSIDE Foods, and Mission Barns also qualify. Beyond Meat and Impossible Foods were also, I think, founded for EA-ish reasons (and e.g. Impossible received an investment from OP), though I think their founding teams are less involved with EA.

Anthropic, DeepMind, and OpenAI also arguably qualify.

I do think that "making money + having an impact" is strictly harder than "making money" though, which does present a challenge to founders.

Oops, forgot about clean meat startups! A good EA friend of mine even co-founded Gourmey 😅

I had this question cached as "outside of the cultured meat space" but forgot about the qualifier

I know this is a side issue, but calling OpenAI an "EA org" seems a bit absurd at this stage, at least to me. Can you explain what you mean by that?

What aspect of it do you think is absurd? My impression is that most EAs would rather not take credit for OpenAI, and many OAI staff are similarly skeptical about EA, but it nonetheless is the case that EA seems to have been relatively influential in the founding of OAI, development of key technologies like RLHF, and some OAI staff are to this day motivated by EA principles. Their roles are listed on the 80k job board as one (bad) metric.

Interesting take on what is considered an "EA org"; I think I was talking at cross purposes a little. I agree "absurd" might be too strong a word.

I completely agree EA has been influential in founding OAI, and developing key technologies, and that some of the staff are motivated by EA principles, and that their roles are listed on the 80k job board (which personally I don't agree with but that's a whole nother discussion).

My (relatively uninformed) take is that EAs were at least a significant part of OpenAI's foundation and growth, but now the vast majority of EAs (I would guess 60-80%) would consider OpenAI's existence quite a significant net negative to the world and so it would no longer be considered an "EA org", even if it was at some stage.

Would be an interesting thing to do a poll on actually, @Nathan Young:
"Do you consider OpenAI to be an 'EA Organisation'?"

Also a bunch of forecasting orgs — Manifold, Metaculus, Good Judgement Project, Hypermind, etc.

>Cosmetics and skincare for those who (want to) look masculine

Any good resources in general, for the obvious requirement that I don't want to look like I'm wearing makeup?

Currently all I do is moisturise 

By far the most important suggestion (unless you are optimizing for a very short timeline) is to wear sunscreen. Vitamin A (retinol, tretinoin, etc.) is the next thing I would add if you are willing to do something every night in addition to every morning.

Regarding cosmetics: Unfortunately I don't have a great general resource; I do have a playlist of some of the videos which I made that seemed most useful to people. (Part of why I started making videos is that I couldn't find good resources for myself, but possibly that's changed in the past couple years.)

Did you sense something was amiss at Alameda?

EDIT -- Ben addresses this at length in a previous post. 

However, I think the question is germane, because if you go for the money, it has practical consequences, like daily association with people who operate under very different ethical frameworks than our own. The risk is that you'll get socialized into their worldview, as (seemingly) happened to folks at Alameda. I am wondering how Ben thinks (or thought) about that risk.

Have you already seen Ben's notes here?

Thanks! I actually had but had forgotten that Ben was its author.

I edited my top-level comment to reflect this, and to better hone the question I was really getting at.

I don't recall any discussion of things like "is it okay to steal money if that goes to good causes?" and if you were to visit the AR office during the brief time I was there you would find a huge range of dysfunctions, but not people endorsing theft. (Note that I was there before any of the things people are charged with were alleged to have occurred.)

I hear a broader version of this concern sometimes from people who believe that finance or tech more generally are bad for society. Regardless of whether that's true, my experience is that the rank-and-file people who work in those sectors are basically pretty average people who happen to like math or programming or whatever, and I wouldn't expect more value drift from working with them than from working with the average person.

(I think there's a stronger concern like "the value drift you experience from working with the average person is too strong; I want to be surrounded by people who give 90% of their income, are vegan, etc.", and if that's your desire then I do suspect that earning to give is probably not right for you.)