
You're massively underestimating your ROI, probably by an order of magnitude. $10 billion in charitable contributions per year, even with a very steep discount rate of 20%, would yield not an 18-fold ROI but closer to 90-fold (with a net present value of $50 billion). With a more reasonable discount rate of 10% (would have said 5%, but then the Fed happened), you're talking about 180-fold returns.
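The arithmetic above is a perpetuity valuation: a constant annual payment discounted at rate r has a net present value of C / r. A minimal sketch, with the caveat that the implied upfront cost (~$0.56B, i.e. $10B / 18) is my inference from the naive 18-fold figure, not a number stated in the original:

```python
# Perpetuity valuation behind the ROI figures.
# Assumption: the 18-fold figure was a single-year ROI, which implies
# an upfront cost of roughly $10B / 18 (~$0.56B). That cost is inferred,
# not taken from the original post.

def perpetuity_npv(annual_payment, discount_rate):
    """Net present value of a constant annual payment in perpetuity: C / r."""
    return annual_payment / discount_rate

annual = 10e9               # $10B/year in charitable contributions
implied_cost = annual / 18  # cost consistent with a naive single-year 18x ROI

for rate in (0.20, 0.10):
    npv = perpetuity_npv(annual, rate)
    print(f"discount {rate:.0%}: NPV ${npv / 1e9:.0f}B, ROI ~{npv / implied_cost:.0f}x")
```

Halving the discount rate doubles both the NPV and the ROI multiple, which is why the 10% case lands at roughly 180-fold.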

Of course, this falls apart under sufficiently short timelines.

I don't think that any of those justify not sending either your questions or a writeup of the post to the org in advance. They have a public email address. It's at the bottom of their home page. I don't think it's a particularly excessive burden to send a copy once you're done and give them a week. Perhaps two if they apologize and ask for a bit more time. I understand why people might be suspicious at the moment, but forcing people to scramble while on vacation is not a good norm. As you say, this post clearly wasn't that time-sensitive. I don't think that the Forum should have taken your post down, but that's a much higher bar.

For comparison, when I posted a piece that was somewhat critical of CEA's admissions and transparency policies, it was after I had asked in a more private Slack channel and gotten an answer I was not satisfied with. You can see that they clarified that they did inform people, and that others chimed in to thank me for informing them with the post.

I am not speaking for the DoD, the US government, or any of my employers.

I think that your case against technological inevitability is premised on states wanting to regulate key technologies, sometimes mediated by public pressure. All of the examples listed were blocked for decades by regulation, sometimes supplemented with public fear, soft regulation, etc. That works only so long as governments don't consider advancement in the field a core national interest. The US and China do, and often in an explicitly securitized form.

Quoting CNAS:

China’s leadership – including President Xi Jinping – believes that being at the forefront in AI technology is critical to the future of global military and economic power competition.

English-language coverage of the US tends to avoid such sweeping statements, because readers have more local context, because political disagreement is more public, and because readers expect it.

But the DoD, in the most recent National Defense Strategy, identified AI as a secondary priority. Trump and Biden both identified it as an area in which to maintain and advance national leadership. And, of course, with the US in the lead, they don't need to do as much in the way of directing people, since the existing system is delivering adequate results.

Convincing the two global superpowers not to develop a militarily useful technology while tensions are rising would be a first in history.

That's not to say that we can't slow it down. But AI very much is inevitable if it is useful, and it seems like it will be very useful.

Someone might be out about being bi at an after-party with friends, but not want to see that detail being confirmed by a fact-checker for a national paper. This doesn't seem particularly unusual.

This isn't the only thing that could go wrong, but it's a straightforward example. Perhaps they don't want their full name blatantly linked to their online account. There are lots of reasons that people might want privacy. Unless your life is at risk, I would not assume that you have privacy from a journalist who isn't a personal friend unless they have an explicit commitment. I trust journalists who are also community members to not take harmful advantage of access.

Something that is sometimes not obvious to people not used to dealing with journalists is that off-the-record sometimes means "I can't officially tell you this, so please find another source who can corroborate it". It's not remotely the same thing as an expectation of privacy and good sense that one would have with a friend.

Without getting too far into the specifics, I think that this is a good attitude to have across a wide range of policy concerns, and that similar issues apply to other policy areas EAs are interested in.

Bay Area 2023. Will edit.

Some post-EAG thoughts on journalists

For context, CEA admitted to EAG Bay Area 2023 a journalist who has at times written critically of EA and of individual EAs, and who is very much not a community member. I am deliberately not naming the journalist, because they haven't done anything wrong and I'm still trying to work out my own thoughts.

On one hand, "journalists who write nice things get to go to the events, journalists who write mean things get excluded" is at best ethically problematic. It's very very very normal: political campaigns do it, industry events do it, individuals do it. "Access journalism" is the norm more than it is the exception. But that doesn't mean that we should do it. One solution is to be very careful to draw the line at "community member or not" rather than at "critical or not". Dylan Matthews is straightforwardly an EA and has reported critically on a past EAG: if he were excluded for this I would be deeply concerned.

On the other hand, I think that, when hosting an EA event, an EA organization has certain obligations to the people at that event. One of them is protecting their safety and privacy. EAs who are journalists can, I think, generally be relied upon to be fair and to respect the privacy of individuals. That is not a trust I extend to journalists who are not community members: the linked example is particularly egregious, but tabloid reporting happens.

EAG is a gathering of community members. People go to advance their goals: see friends, network, be networked at, give advice, get advice, learn interesting things, and more. In a healthy movement, I think that EAGs should be a professional obligation, good for the individual, or fun for the individual. It doesn't have to be all of them, but it shouldn't harm them on any axis.

Someone might be out about being bi at an after-party with friends, but not want to see that detail being confirmed by a fact-checker for a national paper. This doesn't seem particularly unusual. They would be right to trust community members, but might not realize that there could be journalists at the after-party. Non-community journalists will not necessarily share norms about privacy or have particularly strong incentives to follow any norms that do exist.

On the gripping hand, it feels more than a little hypocritical to complain about the low quality of criticism of EA and also complain when a journalist wants to attend an EA event to get to know the movement better.

One thing I'm confident of is that I wish that this had been more clearly disclosed. "This year we are excited to welcome X, who will be providing a critical view on EA" is good enough to at least warn people that someone whose bio says that they are interested in 

how the wealthiest people in society spend their money or live their lives

(emphasis mine)

is attending.

I'm still trying to sort out the rest of my views here. Happy to take feedback. It's very possible that I'm missing some information about this.


I have been told by someone at CEA that all attending journalists have agreed that everything at EAG is off the record by default. I don't consider this an adequate mitigating factor for accepting non-community journalists without mentioning it to attendees or speakers.


And no, I'm not using a pseudonym for this. I think that that is a bad and damaging trend on the Forum, and I don't, actually, believe that anyone at CEA will retaliate against me for posting this.

That seems almost aggressively misleading. "Some of this category of debt may have been held by these descendants, therefore it should have been invalidated", as you seem to be implying, proves far too much.

Bad Things Are Bad: A Short List of Common Views Among EAs

  1. No, we should not sterilize people against their will.
  2. No, we should not murder AI researchers. Murder is generally bad. Martyrs are generally effective. Executing complicated plans is generally more difficult than you think, particularly if failure means getting arrested and massive amounts of bad publicity.
  3. Sex and power are very complicated. If you have a power relationship, consider whether you should also have a sexual one. Consider very carefully whether you have a power relationship: many forms of power relationship are invisible, or at least transparent, to the person with power. Common forms of power include age, money, social connections, professional connections, and almost anything that correlates with money (race, gender, etc). Some of these will be more important than others. If you're concerned about something, talk to a friend who's on the other side of that divide from you. If you don't have any, maybe just don't.
  4. And yes, also, don't assault people.
  5. Sometimes deregulation is harmful. "More capitalism" is not the solution to every problem.
  6. Very few people working on wild animal suffering think that we should go and deliberately destroy the biosphere today.
  7. Racism continues to be an incredibly negative force in the world. Anti-black racism seems pretty clearly the most harmful form of racism for the minority of the world that lives outside Asia.[1]
  8. Much of the world is inadequate and in need of fixing. That EAs have not prioritized something does not mean that it is fine: it means we're busy.
  9. The enumeration in the list, of certain bad things, being construed to deny or disparage other things also being bad, would be bad.

Hope that clears everything up. I expect with 90% confidence that over 90% of EAs would agree with every item on this list.

  1. ^

    Inside Asia, I don't know enough to say with confidence. Could be caste discrimination, could be ongoing oppression of non-Han, could be something I'm not thinking of. I'm not making a claim about the globe as a whole because I haven't run the numbers, and different EAs will have different values and approaches to how to weight history, cultures, etc. I just refuse to fall into the standard America/Euro-centric framework.

Thank you. It's hard for me (and, I think, for many people) to remember to say what feels obvious.
