All of Ulrik Horn's Comments + Replies

I want to gently push back on this being more EA than not undergoing surgery and removing important parts of one's body. I think you might be influenced a lot by your feelings (I know I am, despite trying to be rational - I am rational about how irrational I am!). Therefore, I could imagine myself in your situation feeling a strong pull to be an organ donor because you feel like you are not giving enough otherwise. I therefore strongly advise waiting a few years, until after you have graduated and have a comfortable, stable income. This way you will have compa... (read more)

9
Bob Jacobs
7d
I mean, it's not an important body part; you can live perfectly well with only one kidney, which is why I'm giving it away. If by some cruel twist of fate I do end up needing another kidney, I'll be at the top of the recipient list thanks to my donation. Of course I am influenced by my feelings - empathy is a feeling, after all. I don't see why this is a reason not to do it. I will not do the procedure during the school year, and will take as long as I need to recover afterwards. I'd prefer to do it sooner rather than later, since earlier interventions are almost always better than later interventions due to the higher amount of knock-on effects (e.g. if I convince someone to be vegan now, that's better than years in the future, since I'm saving the animals in the intervening years). Also, I study ethics, so a "comfortable stable income" is probably not happening anyways :)

I am curious what you think of a first-principles approach to resiliency/preparedness? I wrote a blog post on this on LessWrong. I still have a feeling that from an individual's, and perhaps from a nation state's, perspective, one will arrive at quite different resiliency measures if one carefully starts with the likelihood of different disasters affecting one's loved ones, then considers the likelihood and cost of different interventions mitigating these disasters, and ends up with a prioritized list of the most cost-effective preparedness actions. It would, for examp... (read more)

1
SimonKS
7d
If we look at Moon & Mars colonization, radiation is a large risk. Earth's magnetic field means we don't need to care about solar radiation so much, but there's no reason that's permanent - Mars also used to have a magnetic field. I think there's something to be said for physical isolation: the more physical material you can put between yourself and other environments, the smaller the chance that bad stuff gets through your barrier and gets to you. Cost-effectiveness is another interesting question - if the cost of subterranean building decreases rapidly, it may be the most cost-effective solution; certainly on paper it will probably cost less than space colonization, as the materials are readily available along with an advanced value chain of goods and services.

Am I correct in interpreting your comment as something like "Rebecca says it's costly to say more which might imply she is sitting on not yet disclosed information that might put powerful EAs in a bad light"? I did not really pick up on this when reading the OP but your comment got me worried that maybe there is some information that should be made public?

6
David Mathers
14d
'Am I correct in interpreting your comment as something like "Rebecca says it's costly to say more which might imply she is sitting on not yet disclosed information that might put powerful EAs in a bad light"?' Yes, that's what I meant. Maybe not "not already disclosed" though. It might just be confirmation that the portrait painted here is indeed fair and accurate: https://time.com/6262810/sam-bankman-fried-effective-altruism-alameda-ftx/  EDIT: I don't doubt that the article is broadly literally accurate, but there's always a big gap between what claims a piece of journalism like this is making if you take it absolutely 100% literally, line-by-line, and the general impression you'd get about what happened if you fill in the blanks from those facts in the way the piece encourages you to. It's the latter whose accuracy I think is currently unclear, though after Rebecca's post I am leaning heavily towards the view that the broad impression painted by the article is indeed accurate.

I loved this episode as it clearly laid out the challenges with nuclear weapons and looked at possible interventions. I am a bit curious why it was "demoted" to after hours - it felt perhaps more relevant than some recent "main show" episodes on evolutionary X (evolutionary history, evolutionary psychology, etc.). Or maybe you are trying to draw in a wider audience by covering a wider array of topics, including topics starting to fall outside of priority causes.

Thanks! That does seem super relevant. Is my understanding below roughly correct?

"A post on a non-community topic that receives ~500 karma is roughly equivalent in impact to a high quality research paper in a peer reviewed journal. And such a post receiving about 100 karma or a bit below is about 1-10% of the impact of such a journal article."

2
Vasco Grilo
21d
You are welcome! This makes sense assuming i) impact increases logarithmically with karma, ii) Nuño's impact estimates of the EA Forum Prize winners are correct, and iii) the relationship between karma and impact among these posts generalises to other posts. However, I have so little confidence about these assumptions that I would not use karma as the only proxy for impact. At best, I would use the logarithm of karma as one component of a weighted factor model (WFM). I note in the post that:
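(To make the karma-impact relationship above concrete, here is a hypothetical sketch of a weighted factor model that uses log(karma) as just one input; the factor names and weights are made up for illustration and are not Vasco's actual model.)

```python
import math

def weighted_factor_score(karma, other_factors, weights):
    """Toy weighted factor model (WFM) score for a Forum post.

    Illustrative only: treats log(karma) as one factor among several,
    with made-up factor names and weights.
    """
    factors = {"log_karma": math.log(max(karma, 1)), **other_factors}
    return sum(weights[name] * value for name, value in factors.items())

# Example: a 100-karma post scored alongside two hypothetical factors,
# each on a 0-1 scale.
score = weighted_factor_score(
    karma=100,
    other_factors={"topic_relevance": 0.8, "author_track_record": 0.5},
    weights={"log_karma": 0.4, "topic_relevance": 0.4, "author_track_record": 0.2},
)
print(round(score, 2))  # higher score = higher (very rough) expected impact
```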

I was wondering if our continued conversation would be better as a new post using the conversation format, if that's still in use? That said, I only want to do that if people find it helpful - I got pretty downvoted, including by people of the marginalized groups that I feel "bad because of".

Fantastic work. Would you be able, if you think it is advisable, to produce some sort of "adjusted JEID" score? I am thinking that since EA is mostly white and male, if the community, in its current form, had been more "equally distributed across gender, race, etc.", the JEID concerns would have loomed even larger?

Very simplified: something like, if 20% of respondents identified as POC, and JEID issues were raised by 15% of respondents, then we could say something like "if EA was 50% POC, the JEID issues would be raised by 37.5% of respondents". Thi... (read more)
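(A minimal sketch of the proportional scaling described in the comment above - purely illustrative, and, as the reply below points out, this naive extrapolation has real problems.)

```python
def naive_adjusted_rate(issue_rate, group_share, target_share):
    """Naively scale the rate at which an issue is raised in proportion to a
    group's share of respondents. This is the simplification from the comment
    above; the reply below explains why it is questionable."""
    return issue_rate * (target_share / group_share)

# 15% of respondents raised JEID issues while 20% identified as POC;
# naively extrapolated to a community that is 50% POC:
print(naive_adjusted_rate(0.15, 0.20, 0.50))  # 0.375, i.e. 37.5%
```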

Thanks Ulrik!

We can provide the percentages broken down by different groups. I would advise against thinking about this in terms of 'what would the results be if weighted to match non-actual equal demographics' though: (i) if the demographics were different (equal) then presumably concern about demographics would be different [fewer people would be worried about demographic diversity if we had perfect demographic diversity], and (ii) if the demographics were different (equal) then the composition of the different demographic groups within the community wou... (read more)

I will answer as briefly as I can so please double click on anything you would like me to respond to in more detail. And I think the value of me responding here is to let others know better how I feel and the mechanisms at work in making me feel alienated from EA.

Is Heightism even a thing?

2 responses:

  1. One can assume that a person of shorter stature has heard comments on their height many times before. And I think few such people find it flattering. I am interested here in describing my emotions and how EA differs from other spaces I am used to navigating, thu
... (read more)
10
TJ2
1mo

(I am the same TJ that wrote the original comment. I wasn't able to log back into the original account so I created a new one.)

I'm not sure what you mean when you say "whiteness is true". I feel like you just told me "happiness is true". It just doesn't parse. "Happy" and "white" are adjectives; they don't have truth values on their own. If you tell me that a particular person is happy or white, then it becomes a claim about the world that is true or false.

That said, we probably do have some kind of genuine disagreement about how common racism/sexism i... (read more)

Thanks for spending time to respond. Currently I will respectfully decline to engage further and I think I can sum it up briefly:

You offer alternative framings but unfortunately they do not sway how I feel. I am not sure I can control my emotions, although I sometimes wish I could. And there is the more complicated issue that people close to me also get hurt by EAs - I definitely think it would be a long shot to make them see EA differently. In fact, and unfortunately, I think what I feel to be a slightly infantilizing vibe in your reply just makes me feel more aw... (read more)

Case in point on outreach beyond EA. I'm sure 80k hrs and/or CE has thought about this, but it might be a "missing" skillset in EA. I also remember seeing this note on salespeople perhaps having a hard time finding work in EA. My comment here is not thought through at all, but I have a hunch that people who are good at getting others' attention and know how to network/find their way in corporations and/or governments can bring a valuable skill set to EA, especially paired with someone technical/a subject matter expert.

Hi Rebecca and thanks for taking the time to patiently engage with this topic - I think that is important.

I agree 100% that people should push back if they feel like it. And I absolutely see the perspective of those that feel like they have to censor themselves in EA settings and that this also causes alienation. I kind of feel EA has 3 choices here:

  1. Continue trying to find a middle ground, alienating people on "both sides", leadership/prominent figures awkwardly silent on the topics
  2. Embrace the "all discussion is good" and do little in the way of DEI, alien
... (read more)

This was such a good article. It changed my mind about conservationists - I had no idea they were so into the value of evolution in and of itself, even at the expense of the welfare of both animals and humans. Your post really opened up my mind to a new perspective on this. And I really like how you, social-science style, talked to all these different people to really push them on their beliefs and motivations. And it feels EA-related and shows the complexities involved in wild animal suffering. Great job and thanks!

Edit: For people downvoting this, please don't do so if your reason is that the data is already available. I made this comment before Lorenzo's excellent data was published (see his comment, above).

I think having at least rough data on the percentage of each would help me. If it's 95% and 5%, it feels like we lack diversity in experience. If it's 30% and 70%, I think I would be less worried.

I appreciate your comment, you are right. I should have dropped the "white" word in the example of your post. I think you are correct, it does not matter, I would have been upset regardless. And I do apologize if you are not in fact white - like I said I make mistakes on a daily basis. I know this is perhaps a cop-out but... I seek draft amnesty? Still, I feel worse if such posts are made by white people. Like I said I am explaining how I feel, and this cannot be argued about. I made this post due to popular demand and I meant it mostly so that people inte... (read more)

7
Rebecca
1mo
I think it’s reasonable to focus on expressing an experienced sentiment, but I think it’s also fair for people to push back on the sentiment. There are, after all, people who have felt alienated from and pushed out of EA as a result of the active shaping of forum content to be more agreeable. I think it would be quite bad if forum mods began to remove posts on the basis that something existing on the forum constitutes an endorsement by CEA. I’m not even sure it’s a coherent implication - there are many topics on which posts have been written that disagree with each other, including where someone says a stance is actively harmful. Which position should CEA be taken to endorse? This seems straightforwardly false. Maybe you are using a very specific definition of politics, but surely many areas cut across most identity categories? For instance, it seems quite coherent to possess a stance on climate, and coordinate a movement around it, in a way that is agnostic to identity. I’m not sure what this means - could you give an example of an area of white, het-cis or ableist politics?

Super interesting. Based on my understanding of CE's thinking, would doing something 5x more cost-effective than the second-best option, in a cause area that falls a bit below the cost-effectiveness that EA is interested in, not still be EA-aligned? I mean, the counterfactual impact can still be huge if you are affecting thousands or millions of lives, moving large amounts of money/resources, and doing this much more cost-effectively than the best existing player. A proof point would be a CE-incubated charity that is now absorbing significant funding from something like USAID, the Bill and Melinda Gates Foundation, or similar.

I think this is key. I get the impression (and others do as well) that EA is all-or-nothing. Either you give 100% to AMF or you are not EA. 

A private foundation that is focused on the state of New York can use EA principles in trying to identify the biggest impact they can have, within their constraints, and that is still EA. I think even the cause area constraints that are the least EA (say, the arts), can still find ways to improve their impact using EA principles. Though of course that would be more difficult.

Love this episode! I really appreciate all the dissemination of work on prioritizing animals - it really helps me understand the case for animals. I just wished someone would do something similar for x-risk. I feel like I understand really well why we should work on animal welfare, but I feel like I understand much less about why we should work on x-risk.

I am not saying I think x-risk is unimportant. I am just saying I feel like I defer a lot more on x-risk while for animal welfare I could to some degree explain why it is important to a non-EA friend of mine.

I liked this post because I think it is helpful, especially for newcomers and outsiders, to give a feeling that women, PoC, etc. also belong in EA and can reach positions of influence. At the same time, I do also agree with other commenters about not focusing too much on people but rather ideas. Still, I think Peter, Will, Toby, Nick etc. still get quite a lot of attention (even though I know at least some of them are trying to step away from the limelight!) and therefore that until that stops, it is net good to also highlight people other than white, male-presenting people.

I should add that I would not have liked the post if I did not think Katja sounds like a fantastic person - it is not some pure "affirmative action"!

Good comment, and good point on how in general "one bad apple can spoil the barrel". I am really not sure how much that is at play, or if it is just lack of awareness. I hope the latter! In fact, in one of my examples I know it was the latter but it still leaves a bitter taste in my mouth.

will it have that much of an impact

You mean if any DEI initiative will have that much impact? Just a bit unsure what you are referring to with "that". Once clarified I will try to respond. 

Hi Rebecca, I am realizing after posting, and after your insightful comment, that perhaps my feelings about DEI are at least to some degree some sort of male/white guilt and that I am overcompensating. And it is a good point that I might be projecting my biases too strongly onto others who share my privileges - I did spend the first 18 years of my life in a very white environment, for example, so am probably wired quite differently from someone who grew up somewhere more diverse. Your comment is definitely well taken and makes me update towards being e... (read more)

Thanks, I do not know what to think of that. I guess I am updating ever so slightly to treading more and more carefully as at least some people are indicating that my experiences might just be bad luck/me being a bit too sensitive around this topic.

And I am not sure what to make of the two agree votes on your comment - does it mean that other people also suspect others of knee-jerk downvoting, or are the votes from people who admit to such knee-jerk downvoting? I am guessing the former, as if they don't read much they probably do not dig into comment fields.

I wouldn't update toward "some people are indicating that my experiences might just be bad luck/me being a bit too sensitive around this topic" based on a few unexplained downvotes.

As I write this, your post is sitting on +19 with 15 votes; the number of downvotes is unknown but 4-7 might be a reasonable estimate. Based on past comments and voting patterns, there are way more than 4-7 "anti-woke" people (to quote @titotal) on the Forum, and some of them have a decent chunk of karma. 

There are also other plausible reasons to downvote the post, so it's ... (read more)

I'm in one of the "marginalized" groups the post highlights but my experience with EA is good/you do not really represent how I feel

I'm a PoC/other marginalized group and I find this wrong/offensive

Something else, think harder and propose other reactions and I might vote on them if you identify the right one

Something else but please stop making all these comments

I do not think we have any issues that need resolving/the worries about DEI are exaggerated/etc.

I want solutions and optimism/no need to bash EA more/etc.

I am reacting to the emotional appeal/lack of rationality/lack of data or something like this.

If you downvote (no offense taken!), can you please indicate why by voting on my sub-comments to this comment? I am asking as I know there is cancel culture, and I can definitely understand if people do not want to publicly say something strongly negative about this post. I think having a general idea of why people react negatively is helpful for the larger community.

6
Jason
1mo
Didn't vote, but I can imagine that some people see "DEI" and reflexively downvote (or upvote!) the post without actually reading any significant portion thereof
1
Ulrik Horn
1mo
I'm in one of the "marginalized" groups the post highlights but my experience with EA is good/you do not really represent how I feel
1
Ulrik Horn
1mo
I'm a PoC/other marginalized group and I find this wrong/offensive
1
Ulrik Horn
1mo
Something else, think harder and propose other reactions and I might vote on them if you identify the right one
1
Ulrik Horn
1mo
Something else but please stop making all these comments
1
Ulrik Horn
1mo
I do not think we have any issues that need resolving/the worries about DEI are exaggerated/etc.
1
Ulrik Horn
1mo
I want solutions and optimism/no need to bash EA more/etc.
2
Ulrik Horn
1mo
I am reacting to the emotional appeal/lack of rationality/lack of data or something like this.

Thanks Milena! I am not sure what aspects of either environment might drive this difference. I am not sure what other PoC or women had in terms of experience at the company in Bristol. What I do know is that they did not really do anything on the DEI front. So maybe it was some sort of selection effect that hiring managers implicitly applied in their hiring?

I think in terms of EA, maybe I will walk a bit back on my suggestion to look at Deloitte or others - maybe we are unique enough in terms of not being a company, being pretty unusual overall etc. that t... (read more)

2
Milena Canzler
1mo
Maybe some of it comes down to differences in the broader environment. The UK has a larger (visible) proportion of People of Colour compared to, I guess, Sweden and, in my case, Germany. So while that doesn't mean that all people in the UK are anti-racist or so, having more interactions with a diverse range of people might make it more likely that you'll learn a thing or two about not offending. Plus, it might not be that interesting for someone to ask the "Where do you come from?" question if they've heard the same answer a hundred times: "From Bristol". I think it's a good idea not to move fast and break more things with this stuff. I've had that experience, and will likely have it a few times more. But trying small, collaborative experiments sounds good!

I am wondering if there are generally strong enough recommendations on building a substantial personal runway? I am thinking that one might actually want to do something like the following:

-Before applying for a grant/applying for non-permanent/project based work, perhaps even target a personal runway of 12 months, especially if you have dependents?

-Then when you are applying, calculate the salary/rate you think you need.

-Then assume you might burn up to 6 months of your runway on working on this grant (either before it starts and/or when waiting for follo... (read more)

Yeah, I tried to emphasize the ascetic part haha! Oh and the tofu I eat is straight out of the packaging, I just give it a rinse. So pretty bare-bones!

2
Milena Canzler
1mo
Hehe, it certainly is ascetic!  I actually eat smoked tofu the same way, no need to fry it, it's super tasty that way :)

FYI I just published a draft of this post. Thanks to everyone who encouraged me to write it by voting on this answer.

Second this. I think it could potentially be really fun to listen to, by focusing on the hardships of doing ops work, development, etc. - grunt work that is super important but not glorified. It could be a bit like listening to ultramarathon runners talk about their run: how hard it was, the major pains they encountered, and how they overcame them. Kind of like celebrating schlep in EA. This way, one can also get more people to be excited and feel rewarded for doing hard and boring stuff, which some senior EAs have seemed to indicate we need more of, and less of "galaxy brain fun and wild ideas".

Here is some more discussion on a very similar topic, if anyone wants more ideas. Brad and I seemed to have had this thought more or less at the same time! 

Yeah perhaps, but I have no idea even why the forum is considered cost-effective. Is it more because it advances work on causes? Or is it more because it builds and maintains the community? I think the answer to this question will go some way towards understanding the strength of your claim. In general, I just wish this information was out there - just something about why we think keeping the forum going is a good use of our time. Right now, I have not seen anything. I remember seeing more information about EAGs and their benefits and cost-effectiveness.

It would, for example, be super valuable to have something like "if you spend 5 minutes writing a comment that gets at least 10 karma, this is likely as cost-effective as earning $200/hour and donating it to AMF" or "if you spend 20 hours writing a post that gets at least 50 karma, it is equivalent to earning $300/hour and donating all of that to AMF".

4
Vasco Grilo
21d
Hi Ulrik, You may want to check my post on What is the relationship between impact and EA Forum karma?.

When combined with the difficulty of assessing impact of Forum posts and comments, I think the relationship between karma and impact is too murky to make impact-per-karma a meaningful measure. Posts/comments of a specialized/technical nature will likely have a significantly higher IpK value than meta commentary.

I have a couple of strategies, and maybe you already employ some of them. Before I list them, I think it is worthwhile to get some objective assessment of your productivity, given the perhaps not too infrequent slumps in productivity. I thought I would be much less productive than the average EA, being a hopefully ~equal parent with much less time to work, and was surprised people thought I was able to do a lot in a short amount of time! So I do not fret so much about productivity - that in and of itself might cause productivity loss.

Here are what I think are ... (read more)

Perhaps it sounds strange coming from someone who is so active on the forum: but what is the value of the forum, and therefore of contributing to it? Maybe the answer is lying around somewhere, but I have not seen anything. Don't get me wrong, I think the forum is perhaps super valuable - kind of on par with, or perhaps even more important than, EAGs - these venues, virtual and physical, are more or less the only, and therefore core, venues for the community to interact.

If there is some cost-effectiveness calc for the forum, I would be super keen to see it written up as a clear and easy-to-read post. Ideally also with some follow-on effectiveness calc or estimate for people volunteering their time to post here.

4
Ulrik Horn
1mo
It would, for example, be super valuable to have something like "if you spend 5 minutes writing a comment that gets at least 10 karma, this is likely as cost-effective as earning $200/hour and donating it to AMF" or "if you spend 20 hours writing a post that gets at least 50 karma, it is equivalent to earning $300/hour and donating all of that to AMF".

That is a good point. If the work is quick to do for someone super skilled at this, perhaps it is almost quicker to do the work than to try to anticipate its effect? I have some hope that if the results turn out to be shockingly bad (something like women getting 3 times fewer votes with 90% confidence), it might inspire this rationality-driven community to take action. Ideally it would just mean that people keep this in the back of their heads when reading and voting, and perhaps try to compensate for it - kind of like when you force yourself to read stuff yo... (read more)

Thanks Vasco, your cost effectiveness estimate is super helpful, thanks for putting that together (I and others have done some already but having more of them helps)!

And I had missed that post on intelligent life re-emerging - I gave your comment a strong upvote because that points to an idea I had not heard before: That one can use the existing evolutionary tree to make prob dists of the likelihood of some branch of that tree evolving brains that could harbor intelligence.

I have not polished much of my work up until now so I prefer to share directly with ... (read more)

A post on voting statistics on the EAF. I am (perhaps unsurprisingly by now!) interested especially in the gender breakdown. I would have liked to do this myself with the forum API, getting help from some AI code writer, and perhaps also using user names or descriptions to guess at gender. But I just think I do not have time. I would be super interested to see if there are indications that posts from users perceived to be female get fewer votes and/or more downvotes. I guess this is less about what I want someone to write about than what work I would love for someone to do. I think potentially the data is right there in front of us.

3
MvK
1mo
Interesting idea. Say we DO find that - what implications would this have? It seems to me that this data point alone wouldn't be sufficient to derive any actionable consequences from, in the absence of the even more interesting but even harder-to-get data on WHY this is the case. Or maybe you think that this is knowledge that is intrinsically rather than instrumentally valuable to have?

Risk-neutral grantmakers should, if they have not already, strongly consider modifying their position. If such a grantmaker has a choice between an intervention with 1000 utils of potential impact but only a 1% chance of working out (10 utils in expectation), and an intervention with 10 utils of potential impact but a 90% chance of working out (9 utils in expectation), I would suggest going with the latter at this point, where the x-risk community is hopefully still in its early days.

The reason is that having wins has value in and of itself. I think ... (read more)

7
niplav
1mo
Isn't the solution to this to quantify the value of a marginal win, and add it to the expected utility of the intervention?
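(A minimal sketch of this suggestion, using the numbers from the comment above; the 5-util "win value" is made up purely for illustration.)

```python
def expected_utility(impact, p_success, win_value=0.0):
    """Expected utility of a grant, optionally crediting a successful outcome
    with extra 'win value' (credibility, morale, easier future fundraising).
    The win_value figure used below is a made-up illustration."""
    return p_success * (impact + win_value)

risky = expected_utility(impact=1000, p_success=0.01)  # 10.0 utils
safe = expected_utility(impact=10, p_success=0.90)     # 9.0 utils

# If an early, visible win is worth (say) an extra 5 utils to a young
# x-risk community, the ranking flips:
risky_adj = expected_utility(1000, 0.01, win_value=5)  # 10.05 utils
safe_adj = expected_utility(10, 0.90, win_value=5)     # 13.5 utils
print(risky, safe, risky_adj, safe_adj)
```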

Thanks for the mention. ASB had previously estimated $100M-$300M if I remember correctly. After that, a diverse team specified an "ultimate bunker" and I then used reference class forecasting to arrive at a total cost (including ~20 years of operation) of $200M-$20bn. Yes that range is super wide, but so are uncertainties at this stage. Some examples of drivers of this uncertainty in cost:

  • Do we need some exotic SMR with complicated cooling systems (expensive) or can we locate the facility near a stable hydro resource (cheaper) and also what is the power ne
... (read more)
9
Vasco Grilo
1mo
Thanks for the context, Ulrik! Feel free to share links.

Your 2nd range suggests a cost of 398 M$[1] (= 10^9/2.51). If such a bunker could halve bio extinction risk from 2031 to 2050[2], and one sets this to 0.00269 % based on guesses from XPT's superforecasters[3], it would reduce extinction risk with a cost-effectiveness of 0.338 bp/G$ (= 0.5*2.69*10^-5/(398*10^6)). For reference, below are some cost-effectiveness bars I collected.

Answer | Cost-effectiveness bar (bp/G$)
Open Philanthropy (OP) | 0.05[11]
Anonymous Person | 1[12]
Oliver Habryka | 1
Linchuan Zhang | 3.33[13]
Simon Skade | 6
William Kiely | 10
Median | 2.17

My cost-effectiveness estimate for the bunker exceeds Open Philanthropy's conservative bar (i.e. my understanding is that their actual bar is higher; see footnote). However, I think the actual cost-effectiveness of bunkers is way lower than I estimated. I think XPT's superforecasters overestimated nuclear extinction risk by 6 orders of magnitude, so I guess they are overrating bio extinction risk too.

Fair point. On the other hand, I think bio extinction is very unlikely to be an existential risk, because I guess another intelligent sentient species would emerge with high probability (relatedly). I wrote that:

In contrast, AI causing human extinction would arguably prevent any future Earth-originating species from regaining control over the future.

As a counter point to this, AI causing human extinction can be good if the AI is benevolent, but I think this is unlikely if extinction is caused this century.

1. ^ Reciprocal of the mean of a lognormal distribution describing the reciprocal of the cost, with 10th and 90th percentiles equal to 1/20 and 1/0.2 (G$)^-1. I am using the reciprocal of the cost because the expected cost-effectiveness equals the product between it and expected benefits, not the ratio between expected benefits and cost (E(1/X) differs from 1/E(X)).
2. ^ If it was finished at the end of 2030, it would have 20 years of operation, as you mentioned.
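(A minimal sketch reproducing the numbers above and footnote 1; deriving the lognormal mean from its 10th/90th percentiles is an assumption chosen to match the stated ~2.51 (G$)^-1 figure.)

```python
import math

def lognormal_mean(p10, p90, z90=1.2816):
    """Mean of a lognormal distribution specified by its 10th and 90th
    percentiles; z90 is the 90th percentile of a standard normal."""
    mu = (math.log(p10) + math.log(p90)) / 2
    sigma = (math.log(p90) - math.log(p10)) / (2 * z90)
    return math.exp(mu + sigma ** 2 / 2)

# Footnote 1: expected reciprocal of the cost, with 10th/90th percentiles of
# 1/20 and 1/0.2 (G$)^-1 (i.e. the ~$0.2bn-$20bn cost range, inverted).
recip_cost = lognormal_mean(1 / 20, 1 / 0.2)  # ~2.51 (G$)^-1
cost_usd = 1e9 / recip_cost                   # ~398 M$

# Headline estimate: halving a 0.00269% bio extinction risk.
risk_reduction = 0.5 * 2.69e-5
bp_per_gusd = risk_reduction / cost_usd * 1e9 * 1e4  # ~0.338 bp/G$
print(round(cost_usd / 1e6), round(bp_per_gusd, 3))
```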

If anyone else wants to write this I would love for you to do that. I have some rough initial ideas if you want to DM me. If you do, I would love to know when it's published. I guess in general anyone can take any of these ideas and run with them, but I guess there is some unspoken agreement that the poster of the idea for the blog post should perhaps at least be informed that someone intends to write "their idea". Therefore I wanted to make it clear that I am super excited and would encourage someone else to write this up, as chances are I will never get around to doing so.

Re your final personal note - I feel a lot like you! Thanks for putting your thoughts out there.

1
blehrer
1mo
thanks ulrik 🤝

Hi Toby! I have a super lay perspective on this, so if anyone would like to collaborate on a post I would love that. Or for someone to just take the idea and run with it.

On not being able to do anything: I am imagining myself in various super powerful positions and thinking about whether I would then see e-waste stop being an issue. I then think the main reasons they won't do anything are:
  • Sam Altman - Can't do it because "AI has promised him glory and riches" - he basically seems interested in power/impact/money/fame
  • CEO of Microsoft - Like Sam Altman, but with
... (read more)

Good point on mentoring. Would love if you write this to also give tips about mentoring (or whether one can progress without mentors). 

Is this user now deactivated (in case someone reads this comment and knows)? It would be a shame if that person actually did not feel accepted and therefore left. One idea I had when reading this is that EAs might want to connect over things other than EA. For example, hobbies, sports, etc. might be a way for people to connect in EA across "status".

26
Answer by Ulrik Horn
Mar 04, 2024

I’m a normie, and I think I can be an EA too

I often feel that many EAs are a bit more extreme than me. In fact, I feel pretty normal compared to the vibe I get from e.g. reading the EA Forum. Perhaps just stating all the ways in which I am normal could help other people who feel like they do not belong because they are too normal, helping them feel like EA is a place for them too.


 

EA is not special, nor should we make it

I recently explained EA to a close friend. His response was "but does not everybody do that already?" I think he is not alone in thinking that, generally, EA is common sense. And if that is true, one could argue that we should make EA as mainstream as possible.


 

4
Ian Turner
1mo
This could be interesting as a counterpoint to (for example) this essay.