bullfinch076

Your average avian community member
49 karma · Joined

Comments (3)

Karma is not a straightforward signal of the value of contributions

"We already have the solution to bad contributions. It's called 'downvoting'."

This statement, and the idea of karma as the decentralized solution to the problems the OP describes, feels overconfident to me. To reference this comment, I would also push back on the claim that karma isn't subject to social desirability bias (e.g., someone sees that a post already has relatively high karma, so they’re more inclined to upvote it knowing that others on the Forum or in the EA community have, even if they haven't read the whole post).

I would argue that karma isn’t a straightforward or infallible signal of “bad” or “good” contributions. As those working on the Forum have discussed in the past, karma can overrate certain topics: it can signal interest from a large fraction of the community, or reward “lowest-common-denominator” posts, rather than reflect the value or quality of a contribution. As a current Forum staff member put it, “the karma system is designed to show people posts which the Forum community judges as valuable for Forum readers.”

I would note, though, that karma also doesn’t straightforwardly represent the opinions of the Forum community as a whole about what’s valuable. Recent data from the 2023 EA Forum user survey shows that an estimated 46.5% of respondents (70.9% when weighted) upvoted or downvoted a post or comment. Of 13.7k distinct users in a year, 4.4k are distinct commenters and only 171 are distinct post authors. Engagement across users is “quite unequal,” and a small number of users account for an outsized share of comments, posts, and karma. Weighted upvotes and downvotes also mean that certain users have more influence on karma than others.

I appreciate the karma system and its virtues (of which there are several!), and maybe your argument is that more people should vote and contribute to the karma system. I just wanted to point out how karma currently seems to function and the ways it might not directly correlate with value, which brings me to my next point…
 

Karma seems unlikely to address the concerns the OP describes

Without making a claim for or against the OP’s proposed solutions, I’m unsurprised that they proposed a centralized approach. One argument against relying on a mechanism like karma, particularly for discussions of race on the Forum, is that it hasn’t so far upheld the values or conditions I think the OP is referencing and advocating for (like not losing the potential involvement of people who are alienated by race science, engaging in broader intellectual diversity, and balancing the implications of truth-seeking with other values).

To give an example: I heard from six separate people involved in the EA community that they felt alienated by the discussions around Manifest on the Forum and chose not to engage or participate (and, for a few of them, that this was close to a last straw for remaining involved in EA at all). The costs and personal toll of engaging felt too high, so they didn't add their votes or voices to the discussion. I've heard of this dynamic happening in different race-related discussions on the Forum over the past few years, and I suspect it leads to some perspectives being more represented on the Forum than others (even if they might be more balanced in the EA community or movement as a whole). In these situations, the high karma of some topically related comments or posts in fact seemed to further some of the problems the OP describes.

I respect and agree with wanting to maintain a community that values epistemic integrity. Maybe you think the costs incurred by race science discussions on the Forum aren't high enough to justify banning the topic, which is an argument that can be made. I would be curious what other ideas or proposals you have for addressing some of the dynamics the OP describes, or your thoughts on the tradeoffs between allowing/encouraging discussions of race science in EA-funded spaces and the effects that can have on the community or the movement.

The post you linked to from Will MacAskill ("The history of the term 'effective altruism'" from 2014) doesn't reference the Rationality community, and the other links you included are to posts or pages that aren't from Will or Toby, but are by Jacy Reese Anthis or are wiki-style pages.

Do you have examples or links to talks or posts on EA history from Toby and Will that do discuss the Rationality community? (I'd be curious to read them. Thanks!)

  1. Why is the escrow deposit still sitting somewhere? Some quick online research (so take it with a grain of salt) suggests the escrow process usually takes 4 to 8 weeks in California, so this seems significantly longer by comparison.
  2. Can you clarify when you received these grants and the escrow money? The complaint filed by FTX (documents here, for anyone interested) lists the dates of the transfers as March 3, July 8, July 13, August 18, September 20, and October 3, all in 2022, so well within the timeframe that might be subject to clawbacks and well within the bankruptcy lookback period. (For a comparison point, EV US and EV UK paid the FTX estate an amount equal to all the funds the entities received in 2022.)
  3. Why would you not proactively return this money or settle with the FTX estate, given that the money came from FTX and could have been originally obtained in fraudulent ways? My understanding is that you (Oliver Habryka) have written multiple times on the Forum about the harm EA may have caused related to FTX and your wish that it could have been prevented, so it seems strange to me that you wouldn't take the opportunity to return money that came from FTX, especially when it could have been obtained in harmful, unethical ways.
  4. Did you in fact ignore FTX's attempts to contact you in 2023, as the complaint says? And if so, why?

I also think it's worth pointing out that in bankruptcy cases, especially regarding clawbacks, whether you have a legal obligation to return the money doesn't turn on whether you currently have the $5M of FTX money sitting around or have already allocated or used it. Demonstrating that you've spent the funds on legitimate charitable activities might strengthen your case, but it doesn't guarantee protection from clawback attempts.