Quick takes

Bill Gates: "My new deadline: 20 years to give away virtually all my wealth" - https://www.gatesnotes.com/home/home-page-topic/reader/n20-years-to-give-away-virtually-all-my-wealth
I wonder what can be done to make people more comfortable praising powerful people in EA without feeling like sycophants. A while ago I saw Dustin Moskovitz commenting on the EA Forum. I thought about expressing my positive impressions of his presence and how incredible it was that he even engaged, but I didn't, because I feared it would look like sycophancy. The next day he deleted his account. I don't think my comment would have changed anything in that instance, but I still regretted not commenting.

In general, writing criticism feels more virtuous than writing praise. I used to avoid praising people who had power over me, but that attitude now seems misguided to me. While I'm glad that EA provided an environment where I could feel comfortable criticising the leadership, I'm unhappy that we've ended up in a situation where occupying a leadership position in EA feels like a curse to potential candidates. Many community members agree that there is a leadership vacuum in EA. If so, people in leadership positions should probably be rewarded more than they currently are. Part of that reward could be encouragement, and I am personally committing to commenting more often on things I like about EA.
My Google alert for EA flagged an interesting article: 'Visionaries and Crackpots, Maniacs and Saints: Existential Risk and the Politics of Longtermism', recently published in Ratio, a peer-reviewed academic journal of analytic philosophy. Since the author is a Research Affiliate and former Research Associate at CSER, I thought they might have a reasonable and well-informed critique, and I was excited to read it! Unfortunately, a couple of claims in the article made me question the entire piece. For example, here is how the author characterises Ord's Long Reflection, first in the abstract:

[quotation not shown]

And then later in the body of the article:

[quotation not shown]

And here is the relevant section of Ord's The Precipice:

[quotation not shown]
Here are my rules of thumb for improving communication on the EA Forum and in similar spaces online:

* Say what you mean, as plainly as possible.
* Try to use words and expressions that a general audience would understand.
* Be more casual and less formal if you think that means more people are likely to understand what you're trying to say.
* To illustrate abstract concepts, give examples.
* Where possible, try to let go of minor details that aren't important to the main point someone is trying to make. Everyone slightly misspeaks (or mis... writes?) all the time. Attempts to correct minor details often turn into time-consuming debates that ultimately have little importance. If you really want to correct a minor detail, do so politely, and acknowledge that you're engaging in nitpicking.
* When you don't understand what someone is trying to say, just say that. (And be polite.)
* Don't engage in passive-aggressiveness or code insults in jargon or formal language. If someone's behaviour is annoying you, tell them it's annoying you. (If you don't want to do that, then you probably shouldn't try to communicate the same idea in a coded or passive-aggressive way, either.)
* If you're using an uncommon word, or using a word that also has a more common definition in an unusual way (such as "truthseeking"), please define that word as you're using it and, if applicable, distinguish it from the more common way the word is used.
* Err on the side of spelling out acronyms, abbreviations, and initialisms. You don't have to spell out "AI" as "artificial intelligence", but an obscure term like "full automation of labour" or "FAOL" that was made up for one paper should definitely be spelled out.
* When referencing specific people or organizations, err on the side of giving a little more context, so that someone who isn't already in the know can more easily understand who or what you're talking about. For example, instead of just saying "MacAskill" or "Will", say "Will MacAskill".
NickLaing:
Rutger Bregman is taking the world by storm at the moment, promoting his book and concept "Moral Ambition". Yesterday he was on The Daily Show! It might be the biggest wave of publicity for largely EA ideas since FTX. Most of what he says is an attractive repackaging, or perhaps an evolution, of largely EA ideas. He's striking a chord with the mainstream media in a way that I'm not sure effective altruism ever really has (but I wasn't there in the early days). I would also hazard a guess that his approach might resonate especially well with left-leaning people.

I was wondering if there's anything EAs could be DOING at the moment to take advantage of, or leverage, this unexpected wave of EA-adjacent publicity. Things like...

1. Helping with funding, advertising, or anything else he might need to ride the wave; these opportunities don't come often. He may well not need money, though...
2. Using his videos and ideas as "ins" or advertising for university EA groups or other outreach. I know he's going to speak at Harvard soon; how is the EA group there responding?
3. Incorporating some of his language and ideas into how EA presents itself. Phrases like "moral ambition" and the "Bermuda triangle of talent" seem like great phrases to adopt into our "lexicon", as it were.

Thoughts?