Jonas Vollmer

Bio

I’m co-founding the Atlas Fellowship, a program that researches talent search and scholarships for exceptional students.

Previously, I ran EA Funds and the Center on Long-Term Risk. My background is in medicine (BMed) and economics (MSc). See my LinkedIn.

You can best reach me at jonas@atlasfellowship.org.

I appreciate honest and direct feedback: https://admonymous.co/vollmer

Unless explicitly stated otherwise, opinions are my own, not my employer's. (I think this is generally how everyone uses the EA Forum; others who don't have such a disclaimer likely think about it similarly.)

Comments

I read his comment differently, but I'll stop engaging now as I don't really have time for this many follow-ups, sorry!

What if the investor decided to invest knowing there was an X% chance of being defrauded, and thought it was a good deal because there's still at least a (100-X)% chance of it being a legitimate and profitable business? Below what value of X do you think it's acceptable for EAs to accept the money?

Fraud base rates are 1-2%, and some companies end up highly profitable for their investors despite having committed fraud. Should EA accept money from YC startups? What if a startup, e.g., lied to its investors?
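To make that concrete, here's a rough expected-value sketch of the acceptance decision (the variables and the threshold arithmetic are my own illustrative framing, not a worked-out policy):

$$\text{accept iff } (1 - X) \cdot V_{\text{legit}} > X \cdot C_{\text{fraud}}, \quad \text{i.e., } \frac{C_{\text{fraud}}}{V_{\text{legit}}} < \frac{1 - X}{X}$$

where $V_{\text{legit}}$ is the value of the money if the donor turns out to be legitimate and $C_{\text{fraud}}$ is the total harm (reputational, moral, clawbacks) if fraud later surfaces. At base rates of 1-2%, accepting is positive in expectation as long as the harm is less than roughly 49-99 times the value of the donation.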

I think defrauding unsuspecting customers at scale (who don't share in the upside of any risky gambles) is vastly worse than defrauding professional investors, who are generally well aware of the risks (and can profit from FTX's risky gambles).

(I'm genuinely confused about this question; the main thing I'm confident in is that it's not a black-and-white matter, so I don't want my bet to hinge on it.)

I think that's false; I think the FTX bankruptcy was hard to anticipate or prevent (despite warning flags), and accepting FTX money was the right judgment call ex ante.

I expect a 3-person board with a deep understanding of and commitment to the mission to do a better job selecting new board members than a 9-person board with people less committed to the mission. I also expect the 9-person board members to be less engaged on average.

(I avoid the term "value-alignment" because different people interpret it very differently.)

That was an example; I'd want it to exclude any type of fraud except for the large-scale theft from retail customers that is the primary concern with FTX.

I think 9-member boards are often a bad idea: they tend to comprise many shallowly engaged people rather than a few deeply engaged ones, suffer more diffusion of responsibility, and have much less productive meetings than smaller groups. While subcommittees and specialization can mitigate this somewhat, I think the optimal board size for most EA orgs is 3-6.

no lawyers/accountants/governance experts

I have a fair amount of accounting/legal/governance knowledge, and from my board commitments I think it's much less relevant than a deep understanding of the organization's mission and strategy (along with other, more relevant generalist skills like management, HR, etc.). Edit: Though if you're tied up in the decade's biggest bankruptcy, legal knowledge is actually really useful; that seems more like a one-off weird situation, though.

I would be willing to take the other side of this bet if the definition of "fraud" is restricted to "potentially stealing customer funds" and excludes things like lying to investors.

You seem to imply that it's fine if some board members are not value-aligned as long as the median board member is. I strongly disagree: this seems like a brittle setup, because the median board member could easily become non-value-aligned if some of the more aligned board members get busy and step down, have to recuse due to a COI (which happens frequently), or similar.

TL;DR: You're incorrectly assuming I favor Nick mainly because of value alignment. That's a relevant factor, but the main one is his unusually deep understanding of EA/x-risk work, which competent EA-adjacent professionals typically lack.

I might write a longer response. For now, I'll say the following:

  • I think a lot of EA work is pretty high-context, and most people don't understand it very well. E.g., when I ran EA Funds work tests for potential grantmakers (a role I think is somewhat similar to being a board member), I observed that highly skilled professionals consistently failed to identify many important considerations for deciding on a grant. After engaging with EA content at an unusual level of depth for 1-2 years, though, some of them improved their grantmaking skills a lot. Most such people never attain this level of engagement, so they never reach the level of competence I think is required.
  • I agree with you that too much of a focus on high status core EAs seems problematic.
  • I think value-alignment in a broader sense (not tracking status, but actual altruistic commitment) matters a great deal. E.g., given the choice between personal prestige and impact, would the person reliably choose the latter? I think some high-status core EAs who were on EA boards were not value-aligned in this sense, and this seems bad.

 

EDIT: Relevant quote—I think this is where Nick shines as a board member:

For example, if a nonprofit's mission is "Help animals everywhere," does this mean "Help as many animals as possible" (which might indicate a move toward focusing on farm animals) or "Help animals in the same way the nonprofit traditionally has" or something else? How does it imply the nonprofit should make tradeoffs between helping e.g. dogs, cats, elephants, chickens, fish or even insects? How a board member answers questions like this seems central to how their presence on the board is going to affect the nonprofit.
