If you're new to the EA Forum, consider using this thread to introduce yourself! 

You could talk about how you found effective altruism, what causes you work on and care about, or personal details that aren't EA-related at all. 

(You can also put this info into your Forum bio.)


If you have something to share that doesn't feel like a full post, add it here! 

(You can also create a Shortform post.)


Open threads are also a place to share good news, big or small. See this post for ideas.


I've been having some mixed feelings about some recent initiatives in the Forum.

These include things like the creative fiction contest, humorous top-level posts, and asking people to share memes.

I am having trouble articulating exactly what is causing my uneasiness. I think it's something along the lines of: "I use the EA Forum to stay up to date on research, projects, and considerations about effective altruism. Fun content distracts from that experience, and makes it harder for the work I publish on the Forum to be taken seriously."

On the other hand, I do see the value of having friendly content around. It makes the community more approachable. And the last thing I would want is to gatekeep people out for wanting to have fun together. I love hanging out with EAs too!

I trust the leadership of the Forum to have thought about these and other considerations. But I am voicing my opinion in case there are more who also share this uneasiness, to see if we can pinpoint it and figure out what to do about it.

Things that I think would help mitigate my uneasiness:

  • Create a peer-reviewed forum on top of the EA Forum, which curates research/thoughtful content. An interface like the Alignment Forum / LessWrong would work well for this.
  • Create a separate place of discourse (a Facebook group?) for fun content, perhaps linked somehow from the EA Forum.
  • Have the fun content be hidden by default, like personal posts, so people need to opt into it.

What do other people think? Does anyone else feel this way?

Thanks for voicing these concerns! You've articulated a not-uncommon point of view on how the Forum ought to be used, and one that we try to incorporate into our work alongside many other points of view.

I've heard some people express a desire for the Forum to look more like a peer-reviewed journal. I've heard even more express concerns in the opposite direction — that the site feels like it has a very high bar for engagement, and any content other than serious research seems suitable only for Facebook (many of those people are trying to use Facebook less or not at all).

Other people have told me that they just really enjoy creative writing, art, jokes, etc., and want the Forum to represent that side of EA culture. Sometimes, the creative work is a big part of what drew them to the movement in the first place.

I think that examples like "The Fable of the Dragon-Tyrant" show that more "creative" EA content clearly has a place in the movement, and that we'd be better off with more stories of that quality. Hence, the writing contest. 

Just as not all research on the Forum is as strong as e.g. that of Rethink Priorities, not all stories will be cultural touchstones that stand the test of time. Still, I think the gems are worth having a lot of rougher content show up alongside them.

The encouragement for people to share their work in public (rather than quietly submitting it through a form) is partly in response to feedback about the Forum's "high bar", and partly to encourage more representation for that side of EA culture. I want to encourage people to share their work and not worry as much about whether something "qualifies" to be here.

(Some of the Forum's best posts have started with an author emailing me to say something like "I don't know if this is a good fit, but I figured I would check". I don't know how many additional posts we miss because people give up without asking me.)

***

As for the Clickhole and meme posts — both of these happened as a result of my thinking about "EA art" a bunch as I worked on the contest, but I understand that having everything appear back-to-back could create a sense of unease!

I don't expect this to be a rising trend past the time of the creative writing contest — this was just a chance for me to share a couple of things I'd drafted or thought of long ago.

I think that humorous/creative posts have a place on the front page, which is explicitly about "relatedness to EA" rather than metrics of seriousness or "quality".  That said, the meme post now has negative karma notwithstanding my default vote, so people seem to agree that it's not a good fit; I've moved it to personal blog given that feedback.

I still stand by the Clickhole post being a genuinely good piece on the importance of cause prioritization, and most people seemed to like it.

*****

Thanks for suggesting concrete actions! Here are my thoughts:

Create a peer-reviewed forum on top of the EA Forum, which curates research/thoughtful content. An interface like the Alignment Forum / LessWrong would work well for this.

Are the Alignment Forum and LessWrong "peer-reviewed" in any sense that the Forum isn't? The former has limits on who can post in the first place, but that doesn't seem like the same thing. (I may be unaware of some peer-review policy on one or both sites, though.)

We've had some internal discussions about what the EA Forum's equivalent(s) of the Alignment Forum might be, and it's very possible that we'll eventually produce a space for curated research content. We're reaching the end of our current "Forum year" (September 2020 - September 2021) and considering new initiatives we may launch for 2022; this is on the list of possibilities.

Create a separate place of discourse (a Facebook group?) for fun content, perhaps linked somehow from the EA Forum.

I like Facebook, but a lot of people really don't, and that site continues to be difficult to search, filter, etc. I think the Forum has useful features that people who like "fun content" should also be able to use.

More generally, we think of the Forum's purpose as "the center of EA discussion online". Not just research, but also community building, events, announcements, AMAs, short stories, and April Fool's jokes. All of these things seem like they help communities grow and flourish.

That said, I understand the concern about whether it makes sense to have everything presented in a single feed. That's why we've been building up our tag infrastructure and encouraging people to use filters — rather than present everything to everyone, we think it's better to let people choose what they want to see. But I don't think that has to mean separate websites.

Have the fun content be hidden by default, like personal posts, so people need to opt into it.

Is this a better option than "show this by default, and let people opt out of it"?

Personal posts can literally be about anything, as long as they don't violate our rules. Humorous posts that aren't EA-related, or posts authors would prefer be less visible, are hidden by the default personal filter. Filtering out a subset of EA-related posts based on our assumption that most people don't want to see them seems like a bigger step.

For some context, here’s a sample of what I’ve been working on recently for the Forum, outside of the creative writing contest:

  • Working to set up five new AMAs (uncertain how many will end up coming together) with serious thinkers
  • Presenting to the Stanford Existential Risk Initiative on the topic of “taking your summer research project and posting it on the Forum”, then helping lots of individual researchers (~10 so far) prepare to do so
  • Helping Holden Karnofsky crosspost his Cold Takes content to the Forum so that it’s available as soon as he publishes the blog versions
  • Continuing to help other people with content they submit for feedback (steady stream of 1-2 people per week)
  • Continuing to refine the Forum’s version of the EA Handbook (and serving as a facilitator for three live Virtual Program cohorts to get more input on how people experience the “official” version of this material, which has helped me improve the Forum version)
  • Creating a PR FAQ for a new metrics feature that should inspire more sharing of “serious work” by its authors, hopefully drawing more attention to it
  • Adding a lot more recent Forum content to CEA’s social media feeds, so that the best material (almost always serious work) reaches more readers. We’d been on a social media hiatus until ~two months ago, but engagement since we returned has been great!

While the creative writing contest is very visible, the vast majority of my time (as the main person trying to solicit more content for the Forum) goes towards helping people with serious work, and promoting said work.

I realize this may not speak to your point about uneasiness — I’m just sharing it for some context on the Forum’s overall trajectory and what CEA is trying to do with it.

I thought the Clickhole post was both funny and a good illustration of how cause prioritization can be perceived by many people.

I agree with you; I generally come to the Forum looking for more thoughtful content, and there are already several EA Facebook groups where at least the meme post would have been more appropriate. I think the writing contest is probably fine, though.

I think someone should offer a prize for thoughtful responses to the "Most Important Century" series. I think it's important for someone to point out flaws in the arguments since many people will be relying on it as their gateway to longtermist EA.

Our Mayo Clinic team featured Charity Entrepreneurship alumna and Giving What We Can member Dr. Lucia Coulter and the Lead Exposure Elimination Project on our blog several weeks ago. 

Links here, for those interested!

On another global health note, eligible U.S. residents may be interested in these 10+ newly posted roles on USAID's COVID-19 Task Force. The Task Force has some fantastic people on it — and a friend at USAID has enjoyed the pace at the agency under Samantha Power.

Roles and application instructions here: https://www.usajobs.gov/GetJob/ViewDetails/614447700

Hey oh! Long time lurker, first time poster.

Finally got those nervous jitters out of the way today and actually published my first post (awaiting "first-post approval"), along with my bio/this. (It is indeed daunting to try to insert oneself into the EA community and make a good first impression.)

I first discovered EA through 80000 Hours back when I graduated uni and felt existential bewilderment (5 years ago and still feelin' it of course). Then I came across ClearerThinking.org and eventually landed a gig with Spark Wave last year. I didn't get involved until EAGxVirtual2020 where I met amazing people through the "matchmaking" 1-on-1 video chats. (Big time game changer!)

Anyways, I'm just a happy human who loves conversation, connection, creation, music, psychology, philosophy, and cereal. (As you can tell, I ask that you don't take me too seriously half the time. It'll be a time and a half!)

I'm here to discuss "flourishing" (post on that coming in hot later this month) and to just meet you! (So stop on by, come thru, say hey, grab a slice of zza, kick off your shoes, let your hair down, and stay awhile. Stay stupendous.)

Hey David! Congratulations on publishing your first post :)

Hi! I am Jen Wilson. I am working on a non-profit to help animals in another country and create a small sanctuary for them. I am here to see if anyone can guide me on laws or guidelines about the differences between running a non-profit in the US and in another country.

Welcome Jen!

Who is its main target? That is the point I haven't clearly understood about EA.

We have a problem: the extremely bad treatment of farm animals.
We have three variables, and one of them is Importance.

And there's a catch: it isn't equally important for people and animals.
Most people on Earth do not actually care and do not experience any bad feelings about this problem; many of them do not even know about it. In fact, I think something like 2-3% of the population are empathetic and knowledgeable enough to be hurt by this problem.

Thus, this problem appears not to be even among the first thousand most important ones. The bad feelings of a fairly small percentage of the population are not comparable to the ton of other hell that the world experiences.

The last two paragraphs represent a fair evaluation of this problem's Importance if EA is a concept focused on making people's lives better.
This problem is, obviously, much more important if we care about animals' feelings.

So, the final question is: what is the actual priority of EA? People, or animals and nature? How can we solve and prioritize either side of this kind of conflict between human and wildlife interests and needs?
Is it not an egoistic act to draw everyone's attention to a problem that hurts just 2-3% of the population?

extremely bad treatment of farm animals... Is it not an egoistic act to draw everyone's attention to a problem that hurts just 2-3% of the population?

Thus, this problem appears not to be even among the first thousand most important ones. The bad feelings of a fairly small percentage of the population are not comparable to the ton of other hell that the world experiences.

 

I think people who work in animal welfare directly value the suffering or experiences of the animals themselves.

I never thought about it before, but I guess there is value in reducing the emotional toll on people; however, reducing animal suffering is the main motivation.

I don't know anyone who would really put a lot of weight or effort on reducing the human emotional cost of seeing animals suffer (it's really sort of the opposite, even).

So, the final question is: what is the actual priority of EA? People, or animals and nature? How can we solve and prioritize either side of this kind of conflict between human and wildlife interests and needs?

The short answer is that EA has multiple "cause areas" where it can do work. Many "cause areas" are being funded and worked on at the same time. 

Cause areas are things like global poverty, helping with diseases, animal welfare, or improving technology and the future of civilization. 

In theory, it's not clear there's really a conflict between any cause areas, and in practice people are open to and discuss new cause areas all the time.

If you are open to one cause area, you can work on that. If you care about multiple, you can weigh which ones you want to work on, but this is a personal decision.

 

Long answer

The long answer is that this is a valid question and many people have tried to answer this. 

There are three answers below, which are "higher resolution" and are probably all approximately true at the same time:

 

Resources are allocated with thoughtful heuristics

People have spent a lot of time using numbers and "science" to try to answer this question.

Basically, if you tried to use reasoning and numbers, it's not hard to come to the conclusion that one issue  or cause area (animals, humans, or something more esoteric but potentially highly valuable) takes all of the attention and money. 

Few EAs, even those who value just one "cause area", would find this outcome acceptable. Indeed, it's unusual to see any such effort to argue for/against cause areas (well, besides a single one). It probably doesn't even make sense to try to convince others that they are "wrong".

So, to allocate resources, the answer is basically to use heuristics to allocate a portion of funds to each cause area, generally to be spent slowly over many years. (This has implications, such as the value a marginal dollar must bring, that can be examined to see whether the allocation feels "right".)

This is more reasonable than it sounds, once you examine how basically any major decision is made, anywhere.

Also, the amount of money and the number of potential projects can be overwhelming compared to the current activity in a space. This makes this heuristic process, with slow spending, tenable (as opposed to planning a giant project to use all the money in one go).

 

High quality institutional control

EA is currently driven by the donations of two people through one organization/institution. 

This institution drives cause areas and attention, in direct and indirect ways. 

This particular institution appears to be extremely high quality, so I think its involvement is probably a good thing, maybe overwhelmingly so. By "high quality", I mean to include "virtue" in general and literally every other way it's good to foster a movement, including being open and explaining itself, admitting changes in direction, and accepting new opinions.

I think there is a large supply of "would-be leaders" and "meta thought" that produce communities of much lower quality and effectiveness. Because operational details and culture are deceptively difficult, I think that without this institution EA would resemble these other communities.

 

EA is a social movement and depends on historical factors/initial conditions

Something that I think confuses even people who spend a lot of time engaging with EA material is that EA is not really a method for finding cause areas and interventions. It's a social movement that has found several cause areas and interventions.

I think one key difference this perspective brings is that cause areas and new kinds of interventions are limited by the supply of high-quality leadership, management, and judgement, and somewhat less by the fact that they haven't yet been "discovered" or "researched", in the sense that we could just write about them.

Another key difference is that the existing cause areas were often shaped by historical reasons, so they aren't an absolute guide to what should be done.

Profile pictures for EA Forum?

One of the reasons I like posting on Facebook is that it gives me plenty of opportunities to display my personality (e.g. profile picture, banner, friends, etc.). You might argue that it distracts from the content of the post, but... I don't think that's much of an effect? I'd be more incentivised to interact with the Forum if I could show more of myself here.

Some discussion about profile pictures for the Forum here

Hi. I'm Helaman Aorangi Wilson, recently moved to New Zealand, still living with my (excellent) family, and I've been having trouble persuading the LessWrong community that the Church of Jesus Christ of Latter-day Saints is not, in fact, the Robot Antichrist.

In other news, I'm a transhumanist who views his current body as a pilgrimage.

Those bits of eccentricity frontloaded, does anyone else have an interest in languages, hypertext fiction, and neural networks? I'm hoping I can make an AI child that will love everyone and take care of us in our old age.