# All Posts

Sorted by Magic (New & Upvoted)

# Monday, April 19th 2021

Personal Blogposts
Shortform
2 · Nathan_Barnard · 2d — I think empirical claims can be discriminatory. I was struggling with how to think about this for a while, but I've come to two conclusions. The first way empirical claims can be discriminatory is when they express discriminatory claims with no evidence, and people refuse to change their beliefs based on evidence. The other way they can be discriminatory is when talking about the definitions of socially constructed concepts, where we can, in some sense and in some contexts, decide what is true.

# Sunday, April 18th 2021

Personal Blogposts
Shortform
11 · Khorton · 3d — I regularly see people write arguments like "One day, we'll colonize the galaxy - this shows why working on the far future is so exciting!" I know the intuition this is trying to trigger is bigger = more impact = exciting opportunity. The intuition it actually triggers for me is expansion and colonization = trying to build an empire = I should be suspicious of these people and their plans.
7 · anonysaurus30k · 3d — NB: I have my own little archive of EA content, and I got an alert that several links popped up as dead. Typically I would just add it to a task list and move on, but I was surprised to see that Joe Rogan's (full) 2017 interview with Will Macaskill was no longer available on YouTube. I investigated and found out that Rogan recently sold his entire catalog [https://www.digitalmusicnews.com/2021/04/06/joe-rogan-spotify-removing-shows/] and future episodes to Spotify (for $100 million!). Spotify is currently removing episodes from other platforms like Apple, YouTube, and Vimeo. They've also decided not to transfer certain episodes [https://www.digitalmusicnews.com/2021/03/30/spotify-joe-rogan-episodes-removed/] that violate their platform's rules on content (i.e., content that is controversial or offensive). I was a little alarmed that Will's interview might be on the cut list, but it still exists on Spotify [https://open.spotify.com/episode/7KGozS19cvAfpv80ermY5q]; you now have to make a (free) account to access it.

# Saturday, April 17th 2021

Shortform

5 · Harrison D · 5d — EA (forum/community) and Kialo?

TL;DR: I'm curious why there is so little mention of Kialo [https://www.kialo.com/] as a potential tool for hashing out disagreements in the EA forum/community; I think it would at least be worth experimenting with. I'm considering writing a post on this topic, but want to get initial thoughts first (e.g., have people already considered it and decided it wouldn't be effective? initial impressions/concerns? better alternatives to Kialo?).

The forum and broader EA community have lots of competing ideas and even some direct disagreements. Will Bradshaw's recent comment [https://forum.effectivealtruism.org/posts/5iCsbrSqLyrfP55ry/concerns-with-ace-s-recent-behavior-1?commentId=iGD3xNKSyLK7Z8RHu] about discussing cancel culture on the EA forum is just the latest example I've seen.

I've often felt that a platform like Kialo [https://www.kialo.com/] would be a much more efficient way of recording these disagreements, since it helps separate out individual points of contention and allows for deep back-and-forth, among many other reasons. However, when I search for "Kialo" in the search bar on the forum, I only find a few minor comments mentioning it (as opposed to posts), and they are all at least two years old. I think I once saw a LessWrong post downplaying the platform, and I was wondering if people here have developed similar impressions. More to the point: would it be worthwhile to write an article introducing Kialo and highlighting how it could be used to hash out disagreements here/in the community? If so, do you have any initial objections/concerns I should address? Do you know of any other alternatives that would be better options (keeping in mind that one of Kialo's major benefits is its accessibility)?

3 · RogerAckroyd · 4d — Sometimes the concern is raised that caring about wild animal welfare is unintuitive and will bring conflict with the environmental movement. I do not think large-scale efforts to help wild animals should be an EA cause at the moment, but in the long term I don't think environmentalist concerns will be a limiting factor. Rather, I think environmentalist concerns are taken as seriously as they are partly because people see them as helping wild animals as well (in some perhaps not fully thought-out way). I do not think it is a coincidence that the extinction of animals gets more press than the extinction of plants. I also note that bird-feeding is common and attracts little criticism from environmental groups; indeed, during a cold spell this winter I saw recommendations from environmental groups to do it.
# Thursday, April 15th 2021

Shortform

11 · evelynciara · 6d — On the difference between x-risks and x-risk factors

I suspect there isn't much of a meaningful difference between "x-risks [https://forum.effectivealtruism.org/tag/existential-risk]" and "x-risk factors [https://forum.effectivealtruism.org/tag/existential-risk-factor]," for two reasons:

1. We can treat them the same in terms of probability theory. For example, if X is an "x-risk" and Y is a "risk factor" for X, then Pr(X∣Y) > Pr(X). But we can also say that Pr(Y∣X) > Pr(Y), because both statements are equivalent to Pr(X, Y) > Pr(X) Pr(Y). We can similarly speak of the total probability of an x-risk factor because of the law of total probability [https://en.wikipedia.org/wiki/Law_of_total_probability] (e.g. Pr(Y) = Pr(Y∣X1) Pr(X1) + Pr(Y∣X2) Pr(X2) + …), just as we can with an x-risk.
2. Concretely, something can be both an x-risk and a risk factor. Climate change is often cited as an example: it could cause an existential catastrophe directly by making all of Earth unable to support complex societies, or indirectly by increasing humanity's vulnerability to other risks. Pandemics might also be an example, as a pandemic could either directly cause the collapse of civilization or expose humanity to other risks.

I think the difference is that x-risks are events that directly cause an existential catastrophe, such as extinction or civilizational collapse, whereas x-risk factors are events that don't have a direct causal pathway to x-catastrophe. But it's possible that pretty much all x-risks are risk factors and vice versa. For example, suppose that humanity is already decimated by a global pandemic, and then a war causes the permanent collapse of civilization. We usually think of pandemics as x-risks and wars as risk factors, but in this scenario the war is the x-risk because it happened last... right?
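The symmetry claim in point 1 above can be checked numerically with a toy joint distribution over two binary events (the probabilities below are made up purely for illustration, not taken from the post):

```python
# Toy check: for events X ("x-risk occurs") and Y ("risk factor present"),
# Pr(X|Y) > Pr(X) holds exactly when Pr(Y|X) > Pr(Y), since both are
# equivalent to Pr(X, Y) > Pr(X) * Pr(Y).

p_xy = 0.10       # Pr(X=1, Y=1)
p_x_only = 0.02   # Pr(X=1, Y=0)
p_y_only = 0.30   # Pr(X=0, Y=1)
p_neither = 0.58  # Pr(X=0, Y=0)

p_x = p_xy + p_x_only     # Pr(X) = 0.12
p_y = p_xy + p_y_only     # Pr(Y) = 0.40
p_x_given_y = p_xy / p_y  # Pr(X|Y) = 0.25
p_y_given_x = p_xy / p_x  # Pr(Y|X) ≈ 0.833

# All three inequalities agree, in either direction.
assert (p_x_given_y > p_x) == (p_y_given_x > p_y) == (p_xy > p_x * p_y)
print(p_x_given_y > p_x)  # True: Y raises the probability of X, and vice versa
```

Any other joint distribution would satisfy the same three-way equivalence; the particular numbers only determine which direction the inequalities point.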
One way to think about x-risks that avoids this problem is to say that x-risks can have both direct and indirect causal pathways to x-catastrophe.

# Wednesday, April 14th 2021

Shortform

20 · Pablo · 7d — Scott Aaronson just published a post [https://www.scottaaronson.com/blog/?p=5448] announcing that he has won the ACM Prize in Computing and the $250k that comes with it, and is asking for donation recommendations. He is particularly interested "in weird [charities] that I wouldn't have heard of otherwise. If I support their values, I'll make a small donation from my prize winnings. Or a larger donation, especially if you donate yourself and challenge me to match." An extremely rough and oversimplified back-of-the-envelope calculation [https://www.getguesstimate.com/models/18118] suggests that a charity recommendation will cause, in expectation, ~$500 in donations to the recommended charity (~$70–$2,800, 90% CI).
14 · evelynciara · 7d — "Quality-adjusted civilization years"

We should be able to compare global catastrophic risks in terms of the amount of time they make global civilization significantly worse and how much worse it gets. We might call this measure "quality-adjusted civilization years" (QACYs), or the quality-adjusted amount of civilization time that is lost. For example, let's say that the COVID-19 pandemic reduces the quality of civilization by 50% for 2 years. Then the QACY burden of COVID-19 is 0.5 × 2 = 1 QACY. Another example: suppose climate change will reduce the quality of civilization by 80% for 200 years, and then things will return to normal. Then the total QACY burden of climate change over the long term will be 0.8 × 200 = 160 QACYs. In the limit, an existential catastrophe would have a near-infinite QACY burden.
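The QACY arithmetic in the examples above reduces to a single multiplication; a minimal sketch (the function name is hypothetical, not from the post):

```python
def qacy_burden(quality_reduction, years):
    """QACYs lost: fractional reduction in civilization quality,
    multiplied by the number of years that reduction persists."""
    return quality_reduction * years

# The two illustrations from the post:
print(qacy_burden(0.5, 2))    # COVID-19 example -> 1.0
print(qacy_burden(0.8, 200))  # climate-change example -> 160.0
```

A fuller version would integrate a time-varying quality reduction over the duration of the catastrophe rather than assuming a constant level.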