EdoArad's Comments

AMA Patrick Stadler, Director of Communications, Charity Entrepreneurship: Starting Charities from Scratch

How much do you think CE can effectively grow? Are the limits to growth in promising applicants, outreach, seed funding, charity ideas, diminishing returns to training, or something else entirely?

The case for building more and better epistemic institutions in the effective altruism community

This is great! I find this extremely important, and I agree that we have a lot of room to improve. Thank you for the clear explanation and the great suggestions.

Further ideas:

  1. A global research agenda / roadmap.
  2. Bounties for specific requests. 
    1. Perhaps someone could set up (capped) 1:1 matching for individual requesters. 
    2. Better yet, give established researchers or organizations credit to spend on their requests. 
  3. A peer review mechanism in the forum. A concrete suggestion (see the code sketch after this list):
    1. Users submitting a "research post" can request peer review, which is displayed in the post [a big blue "waiting for reviewers"].
    2. Reviewers volunteer to review and present their qualifications (and a statement of no conflict of interest) to a dedicated board of "EA experts", which can approve them as reviewers. 
    3. There are strict-ish guidelines on what is expected from a good post, and a guide for reviewers. 
    4. The reviewers submit their review anonymously and publicly. 
    5. They can accept the post [a big green "peer reviewed"].
    6. They can also ask to fix some errors and improve clarity [a big yellow "in revision"].
    7. They can decide that it is just not good enough or irrelevant [a big red "rejected"].
  4. (The above is problematic in several ways: reviewers are not randomized, so there is inherent bias; the incentive to review is unclear; and it can be tough to be rejected.)
  5. Better norms for linking to previous research and for asking for it. Better norms for suitable exposition. These norms don't have to be strict for "non-research" posts. 
  6. The forum itself could incorporate many further innovations (Good luck, JP!):
    1. Polls and embedded prediction tools. 
    2. Community editable wiki posts. 
    3. Suggested templates. 
    4. Automated suggestions for related posts while editing (as on Stack Exchange). 
    5. An EA tag on LessWrong/the Alignment Forum (or vice versa) through which posts can be displayed on both sites (like the LW/AF cross-posting workflow). 
    6. A mechanism for highlighting and commenting, like on Medium. (I'm not sure I like this one.)
    7. Suggestions that appear (only) to the editor, like in Google Docs. 
    8. There is some great stuff already on the way, too :) 
  7. Regarding a wiki, Viktor Petukhov wrote a post about it, with some discussion following on the post and in private communication.  
  8. More research mentorships. Better support for researchers at the start of their path.
  9. Better expository and introductory materials, and guides to the literature. 
  10. Better norms and infrastructure for partnering.
  11. Supportive infrastructure for coordinating projects globally, across communities. This would make it easier to set up large-scale, volunteer-led projects for better epistemic institutions. Local communities are important here as a vetting mechanism.
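
To make suggestion 3 above concrete, here is a minimal sketch of the review-status state machine it implies, written in TypeScript. All of the names here (ReviewStatus, applyReviewDecision, etc.) are hypothetical and purely illustrative, not actual forum code:

```typescript
// Hypothetical sketch of the peer-review workflow in suggestion 3.
// None of these names exist in the forum codebase; they only
// illustrate the proposed state machine.

type ReviewStatus =
  | "waiting_for_reviewers" // big blue badge: review requested, no decision yet
  | "in_revision"           // big yellow badge: fixes or clarity improvements requested
  | "peer_reviewed"         // big green badge: accepted
  | "rejected";             // big red badge: not good enough or irrelevant

type ReviewDecision = "accept" | "request_revision" | "reject";

interface ResearchPost {
  title: string;
  status: ReviewStatus;
}

// A reviewer approved by the "EA experts" board submits a decision,
// which moves the post to the corresponding status.
function applyReviewDecision(post: ResearchPost, decision: ReviewDecision): ResearchPost {
  switch (decision) {
    case "accept":
      return { ...post, status: "peer_reviewed" };
    case "request_revision":
      return { ...post, status: "in_revision" };
    case "reject":
      return { ...post, status: "rejected" };
  }
}

// Example: a post that requested review and was sent back for revisions.
const post: ResearchPost = { title: "Example research post", status: "waiting_for_reviewers" };
const revised = applyReviewDecision(post, "request_revision");
console.log(revised.status); // "in_revision"
```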
What posts do you want someone to write?

Do you mind expanding a bit on CNS Imaging, Entropy for Intentional content, and Graph Traversal?

What posts do you want someone to write?

No, the analysis does not seem to contain what I was going for. 

Curious what you think is weird about the framing?

What posts do you want someone to write?

This is not quite what I was going for, even though it is relevant. This problem profile focuses on existing institutions and on methods for collective decision making. I was thinking more in the spirit of market design, where the goal is to generate new institutions with new structures and rules so that people are selfishly incentivised to act in a way that maximizes welfare (or something else).

What posts do you want someone to write?

Governance innovation as a cause area

Many people are working on new governance mechanisms from an altruistic perspective. There are many sub-categories, such as charter cities, space governance, decentralized governance, and the RadicalXChange agenda.

I'm uncertain about the marginal value of such projects, and I'd like to see a broad analysis that can serve as a good prior and an analytical framework for specific projects.

What posts do you want someone to write?

An analysis of how knowledge is constructed in the EA community, and how much weight we should assign to ideas "supported by EA". 

The recent question on reviews by non-EA researchers is an example of this. There might be great opportunities to improve EA intellectual progress.

Insomnia with an EA lens: Bigger than malaria?

Really appreciate the work you are putting into this app, and this write-up. I'm excited about your app and hope it will help a lot of people solve their sleeping problems! John Halstead also wrote a post on CBT-i a while ago; while I assume you arrived at it independently, it's great to see attempts at real-world solutions and impact assessments.

There are two points that I think are missing from your analysis. First, regarding Tractability, I'm curious what would cause people with insomnia to seek help and find the CBT app. That is, even if CBT is very effective, it might still be very hard to reach people and put the treatment into practice.

Second, I'd like to see an assessment of Slumber's marginal contribution over existing efforts. There seem to be other apps for CBT-i. 

Thanks again! I've suggested Slumber to a friend to try out :)

Cortés, Pizarro, and Afonso as Precedents for Takeover

Thanks for this post, fascinating read!

Considering the hypothesis given here, I'm curious why we don't see more takeovers today. There are countries and small corporations embroiled in internal conflicts that I would expect (following this post) a small but powerful organisation (or a large nation) to be able to take over. Some reasons we may not see this:

  1. International laws, or one of the world's leading nations, might punish such a takeover attempt.
  2. People in positions of power may not want to take that kind of risk.
  3. There is not that much economic value to gain.
  4. Takeovers may be quiet (say, via blackmail).
  5. Conspiratorially, the relevant opportunities are already being picked up by the more powerful nations/corporations.