
AlexCuevas
15 karma · Joined Sep 2014

Comments (10)

I'd never seen that skillshare resource; thanks for the link.

I was hoping to find a platform that already existed and was in use by the general public, so we could benefit from people based on their skills and teaching ability rather than their EA orientation. There's also some advocacy potential: if we form bonds with our teachers and students, we get to explain why we want to learn or teach certain skills.

If no such platform exists, I think building one could be a very high-value proposition. The ladder structure has definite advantages over lecture-style teaching.

Does anyone know of any "Go teaching ladder"-type resources for skills other than Go? The concept was discussed at the 2014 EA Summit, and it sounds like a great approach for learning many disciplines effectively.

1) I think this is a good use of time: podcasts are an increasingly relevant medium, and I don't know of any that fill the niche you're describing.

2) I've recently been planning a podcast of my own in almost exactly the same vein, taking inspiration from EconTalk as well as The Tim Ferriss Show (which interviews "peak performers" and other highly successful people to see how they work).

I don't have podcasting experience, but I've listened to numerous podcasts for inspiration and, in classic EA fashion, I've been overthinking the heck out of it, so if you're interested I can share my thoughts with you directly. I've also invested in a microphone and basic editing software to record some initial conversations and test the waters. It's all very low-profile and preliminary right now, and you can probably find someone better, but I'm here and willing to help.

Any thoughts on the EA value of improving writing skills? I often find EA writing verbose and hard to stay engaged with. There's obviously a trade-off between rigor and readability, but I think EAs in general (myself especially) could do a better job of favoring readability. This would also help newcomers to EA.

Possible solutions:

  • Writing classes
  • Extra time editing before publishing
  • More non-text media in content creation (diagrams, vlogs, podcasts, etc.)
  • Bullet points, TL;DRs, and smarter paragraph structure in long-form writing

Personal role models of effective writing:

  • Ryan Carey
  • Sam Harris
  • Confucius

Thoughts on this?

"True, but a lot of people are also struggling just to find a job that would be both enjoyable and provide a sufficient wage to pay the bills."

Agreed, so in that context, how does it make more sense to tell somebody that they should care about helping other people as much as they possibly can? I don't see that train of thought getting through to many people in this situation.

I don't think it's a matter of reputation so much as a matter of socialization and network building. Humans are at their best when they're interacting with other humans (generally speaking). If your EA-motivated actions are hurting your personal relationships or your ability to socialize, either by constraining your living situation or by limiting your social interactions for cost or time reasons, then they may be doing more harm than good. I think building a strong network of friends and colleagues is one of the highest-leverage things you can do, and it shouldn't be discounted lightly for the sake of simply giving as much money as you can.

Similarly, while going overseas for vacation is expensive and bad for the environment, sometimes seeing another place or culture in real life can have a lot of altruism-related benefits.

I don't mean this to disparage your particular life choices, but rather to say that a "typical" EA shouldn't be expected to make the same choices, and it shouldn't be implied that making those choices makes you more effective or more altruistic than somebody else who focuses more on network building and travel, for instance.

I definitely agree, and as a result I wouldn't tailor my advice to only one sort of person. I think it's best to change the advice you give based on who you're talking to. Perhaps we should have some sort of portfolio of starting advice keyed to a few simple diagnostics. I'm sure 80,000 Hours does something like this, so it's not new ground. In any case, it's far better than saying "everybody should donate 10% of their income right now if you can afford it, or you're not a real EA." And yes, some people have said this. Personally, I find it a huge turn-off.

ruthie: "It seems likely to me that the current situation is more the result of network effects than that EA is not interesting to people outside of this cluster." I'm not sure I agree. I know surprisingly few people who are both actively altruistic and who actually think critically and examine evidence in their everyday lives. I wish this described everyone, but realistically it doesn't. I do believe there are a ton of people who would be interested in EA but haven't discovered it yet, but I think the people who will ultimately be drawn in won't be put off just because a lot of the material is catered to demographics not exactly like them. Especially since people span such a large range of socioeconomic status, and each one might have a totally different EA approach that works best for them (and I'm not even talking about cause selection yet).

What if somebody has no interest in donating, but is interested in career choice? Or in lifestyle change? Or in saving, researching, and donating later? Or in advocacy? Or in personal development? There are a lot of options, and I think the blanket advice "just start donating now to GiveWell's top charities and don't worry about the meta stuff" will turn off many people in the same way that "focus on yourself until you have more income leverage" might. I haven't seen real evidence either way, just armchair arguments and half-baked anecdotes, so I don't understand why everyone is so confident.

"I still am highly turned off by elitist EA conversations that assume that all the readers are high-potential-earners in their 20s with strong social safety nets."

Sure, but I think your use of the term "elitist" is a bit unfair here. I personally know friends who view my own identification as an EA as elitist, because (so the argument goes) by trying to alleviate global poverty through targeted donation I'm putting myself on a pedestal above people living in the developing world. To these friends, it's less elitist to focus on pursuing your own happiness than to think you can solve other people's problems better than they can. Maybe this is why I'm arguing for this angle of attack: my friends have different off-putting triggers than you do.

I agree that we shouldn't make broad generalizations about EA demographics, but we also shouldn't misrepresent them; I would wager that a large number, if not the majority, of prospective EAs fall into the high-potential-earners-in-their-20s demographic, and this is very relevant to discussions of how to advise people who are just getting into EA. I definitely agree that the same advice won't work equally well for every person, and sometimes it's correct to give two different people completely opposite advice. That said, if I had to give one piece of advice in a generalizing way, I would want to consider the demographic of the people I'm addressing rather than assume it's directed at, say, the median US citizen.

I think we can both agree that how you say things matters a great deal. Saying "come back when you make more money" is very different from saying "if you're interested in helping people as effectively as possible, it may be wise to look out for yourself first before turning your attention outward." A lot of people worry that their lives are too good compared to others' and that they therefore have a moral obligation to help; I think a lot of EAs have felt this way. When faced with this sentiment, I think devoting significant effort to donating now rather than to personal development can be a mistake with respect to actually being effective.

I also think you're framing the argument to make "making more money" sound like a bad thing most people don't want to do. A lot of people already want to make more money, and they feel a conflict between trying their best to become successful versus using the resources and leverage they already have to help others. My argument is that focusing on personal goals and development could kill two birds with one stone for a lot of people, and I don't think it's as off-putting as you make it sound.

Speaking from my own experience, I have a strong tendency to think about others before myself, and I think this can be a flaw that limits productivity in many ways. I would ultimately be a more effective altruist if I had spent more of my time asking "how can I become really good at something and develop valuable skills?" rather than "how can I do the most good?"

There was a great thread in the Facebook group on whether people making a modest wage (around or below $30k/yr in US terms) should be donating to effective charities or saving money. I'd like to weigh in here, since that thread is already crowded and unstructured.

The proposition here is: "People with average or below-average income should save money rather than donate to effective charities."

One thing almost nobody mentioned is the opportunity cost of worrying about other people before yourself, and how that affects one's effective altruistic output. From the Facebook thread, most EAs seemed to be against the proposition, arguing that most people in the developed world are still far better off than X% of the global population and should therefore still donate some percentage of their wealth.

I believe there is a strong case that focusing on building one's own career capital, not just making smart personal-finance decisions, can enable substantially higher income in the future and thus make one a more "E" EA. The intellectual effort required to engage with the EA argument (doing the relevant research, picking an EA organization or EA-recommended charity to donate to, and "stretching your EA muscles" by donating a small amount on a regular schedule) is a small investment of money but a large investment of attention, and I think dedicated EAs tend to discount that cost because they've already paid it. This is a CFAR-esque argument for focusing on personal development and improvement until one is in a position to reasonably maximize one's output, both in income and in the effectiveness of one's donations.

I'm still uncertain about my own position in this debate, but since most EAs (at least on Facebook) seemed strongly against the proposition, I'd like to see more discussion that takes the points above into consideration.
