
MichaelDickens

5388 karma

Bio

I do independent research on EA topics. I write about whatever seems important, tractable, and interesting (to me). Lately, I mainly write about EA investing strategy, but my attention span is too short to pick just one topic.

I have a website: https://mdickens.me/. Most of the content on my website gets cross-posted to the EA Forum.

My favorite things that I've written: https://mdickens.me/favorite-posts/

I used to work as a software developer at Affirm.

Sequences (1)

Quantitative Models for Cause Selection

Comments (775)

I can't understand ~anything this post is trying to say.

  • It uses many terms that I've never heard before, and doesn't define them.
  • It makes references to concepts and seems to be trying to imply something with them, but I don't know what. For example, it references two historical case studies, but I don't get what I'm supposed to be learning from those case studies.

Caring about existential risk does not require longtermism, but making existential risk the top EA priority probably requires longtermism or something like it. Factory farming interventions look much more cost-effective in the near term than x-risk interventions, and GiveWell top charities probably do too.

Over the years I've written some posts that are relevant to this week's debate topic. I collected and summarized some of them below:

"Disappointing Futures" Might Be As Important As Existential Risks

The best possible future is much better than a "normal" future. Even if we avert extinction, we might still miss out on >99% of the potential of the future.

Is Preventing Human Extinction Good?

A list of reasons why a human-controlled future might be net positive or negative. Overall I expect it to be net positive.

On Values Spreading

Hard to summarize, but this post discusses spreading good values as a way of positively influencing the far future, some reasons why it might be a top intervention, and some problems with it.

I just had a brief look at Endaoment; I'll take a more careful look later and update my post. Here's what I noticed:

  • Endaoment appears to do some fancy stuff that other DAF providers don't do and it's very crypto-focused.
  • It looks like there are no AUM fees, but it charges 0.5% on deposits and 1% on donations, which is more expensive than a normal DAF if your DAF has high turnover, and cheaper than a normal DAF if you only donate once every few years or so.
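To make the turnover point concrete, here's a rough sketch of the two fee structures. The 0.5% deposit and 1% donation fees come from the comment above; the 0.6% annual AUM rate for a "normal" DAF is an illustrative assumption, and the function names are my own:

```python
def transaction_style_fees(deposit, donation_total,
                           deposit_fee=0.005, donation_fee=0.01):
    """Fees for a provider charging on deposits and donations, with no AUM fee."""
    return deposit * deposit_fee + donation_total * donation_fee

def aum_style_fees(balance, years, annual_rate=0.006):
    """Approximate fees for a provider charging a flat annual AUM fee on a
    constant balance (ignores investment growth and drawdown for simplicity)."""
    return balance * annual_rate * years

# High turnover: $10,000 deposited and fully donated within a year.
# Transaction fees: $50 + $100 = $150, vs. roughly $60 in AUM fees.
print(transaction_style_fees(10_000, 10_000))  # 150.0
print(aum_style_fees(10_000, 1))

# Low turnover: $10,000 held for five years, then donated once.
# Transaction fees are still $150, but AUM fees grow to about $300.
print(aum_style_fees(10_000, 5))
```

Under these assumed numbers, the per-transaction structure is worse if you churn money through the DAF quickly and better if you let balances sit for years, which matches the comparison above.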

I was not aware of Endaoment, I will look into it!

I don't know if any DAF providers support direct giving. But any provider should let you give stock to your DAF and then donate it to charity a few days later.

In terms of fees, if you only use your DAF as a convenient way to donate stock and you mostly maintain a $0 balance, then you'll just have to pay the minimum fee. I listed the minimum fees here—I think your best bet is Schwab because it has no minimum fee. Charityvest has no minimum for cash-only accounts, and I think you can still contribute stock to a cash-only account (they'll just liquidate the stock once they get it), but I'm not sure about that so you might want to ask them.

Another important consideration in favor of giving now—if you earn a steady income—is that your donations this year only represent a small % of your lifetime giving.
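A quick illustration of how small that percentage can be. The figures here are made up for the example (they are not from the comment above): a steady earner giving the same amount each year over a 40-year career.

```python
# Illustrative assumptions: $5,000 donated per year over a 40-year career.
annual_donation = 5_000
career_years = 40
lifetime_giving = annual_donation * career_years  # $200,000

this_years_share = annual_donation / lifetime_giving
print(f"{this_years_share:.1%}")  # 2.5%
```

So even if the give-now-vs-later question is decided "wrongly" for this year's donation, it affects only a small slice of lifetime giving.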

In fact, if you think the giving-now arguments strongly outweigh giving-later but you expect to earn most of your income in the future, then it might make sense to borrow money to donate and repay the loans out of future income. But that's difficult in practice.

I think the tendency to write unconstructive criticisms (at least for me) comes from the combination of:

  1. I have a strong urge to comment on anything that looks incorrect
  2. Writing substantive criticisms of a post (often) requires grokking the whole post and thinking deeply about it, which is hard. Criticizing some specific sentence is easy because my brain instantly surfaces the criticism when I read the sentence

I would like to publicly set a goal not to comment on other people's posts with criticisms of minor side points that don't matter. I have a habit of doing that, but I think it's usually more annoying than helpful, so I would like to stop. If you see me doing it, feel free to call me out.

(I reserve the right to make substantive criticisms of a post's central arguments)

Another person on this forum named Michael emailed me saying he was personally willing to fund the silo alone

For the record, I did not say this. The most relevant quotes from my email are:

I saw your recent posts on the EA Forum and I am thinking about the cost-effectiveness of your plan to build a grain drying / processing plant.

and

Given the considerable expense of building a processing facility, why don't you want to start with a storage silo, which would be much cheaper, and expand from there?

You may have interpreted the 2nd quote as an offer to fund the silo, which it wasn't. If that's what happened then I should have communicated better.

This is an excellent UI: it eliminated the trivial inconvenience and significantly increased the probability that I would actually write a letter (which I just did).
