Venkatesh

Data Analyst looking for opportunities in the social sector.

If you are considering reading something I have written on this forum, please see: Interpreting the Systemistas-Randomistas debate on development strategy

I got into the EA van thanks to 80k hours. And I got into the 80k hours van when I saw it on HackerNews. And I got into the HackerNews van since CGP Grey mentioned it on the Hello Internet podcast. And I got into Hello Internet (the first podcast I ever listened to) because my family was finally willing to get an internet connection. And my family was able to get an internet connection because of rising internet penetration in India.

So, pop quiz - is increasing internet penetration a good intervention for EA community building? Feel free to DM me your answer and say hi to me! :-)

Comments

How would you draw the Venn diagram of longtermism and neartermism?

I do not think it's about discount rates. I was recently corrected on this point here. It looks like conservatives and moderates who focus closer to the present have other, better reasons, such as population axiologies or tractability concerns, or something along those lines.

How would you draw the Venn diagram of longtermism and neartermism?

There is some ambiguity in the terminology here, so here is how I visualize it with my own terminology. It's not a Venn diagram, but this is how I see it.

[Diagram hosted on imgur]

The Many Faces of Effective Altruism

I thoroughly enjoyed this! The tone of the writing perfectly matched the idea being conveyed.

If I may add a category:

  1. Desi EA - Someone not from a developed country, kinda feeling out of place and totally inadequate to do anything about most mainstream EA cause areas. Mostly English-speaking, educated elites from developing countries who possibly watch a lot more Hollywood than their local genres. (Also has some inability to parse slang. I honestly didn't understand what the monikers "IDW" and "A-aesthetic" meant, although I think I understood the explanation.)

EA Forum feature suggestion thread

LessWrong recently created this feature. C'mon EA Forum!

EA Forum feature suggestion thread

Please let me search within my bookmarks.

In general, I read something and bookmark it if I like it. Later, that thing comes up in conversation, and I go into my bookmarks to share it with the other person quickly mid-convo, but then I can't retrieve it from the bookmarks list as fast as I thought I could! This happens to me in almost every session I facilitate for the EA Virtual Programs!

When did the EA Forum get so good?!

On the topic of saving posts - I personally use the bookmarks feature quite a bit. Just wanted to mention it in case someone wasn't aware. The one issue I have is that I can't search within my bookmarks.

You can bookmark a post by clicking on the three dots just below its title and then clicking Bookmark. Bookmarks can then be accessed from the dropdown menu that appears under your username.

EA is more than longtermism

  1. So EA isn’t “just longtermism,” but maybe it’s “a lot of longtermism”? And maybe it’s moving towards becoming “just longtermism”?

EA has definitely been moving towards "a lot of longtermism".

The OP has already provided some evidence of this with funding data. Another signal that this is happening is the way 80k hours has been changing their career guide. Their earlier career guide started by talking about Seligman's factors/Positive Psychology and made the very simple claim that if you want a satisfying career, positive psychology says your life must involve helping others in a meaningful way. Then one fine day they made the key ideas page, and now longtermism has somehow become the "foundation" of their advice. When the EA org that is most people's first contact with EA makes longtermism the "foundation" of its recommendations, it definitely suggests that EA wants to move towards "a lot of longtermism".

  1. What if EA was just longtermism? Would that be bad? Should EA just be longtermism?

Yes, it would be bad if EA was just longtermism.

I believe that striving to make the simplest, honest case for a cause area is not a choice but an intellectual obligation. It is irresponsible to put forth unnecessarily complicated ideas and chase away people who might otherwise have contributed to that cause. I think longtermism is currently an unnecessarily complicated way to make the case for most of the important EA cause areas. My thoughts come from this excellent post.

I am willing to revise my stance on this second question if you can argue convincingly that:

  1. striving to make the simplest, honest case for a cause area is not an intellectual obligation we need to hold.
  2. there are some cause areas where longtermism is actually the simplest argument one can make to convince people to work on them. Of course, that would mean EA can then only work on those cause areas where this is true, but that might not be so bad if it is still highly impactful.

Some hedging:

I still believe Longtermism is an important idea. If you are a philosopher, I would highly encourage you to work on it. But I just don't think it is as important for EA as EA orgs are currently making it seem. This is especially true of Strong Longtermism.

I also think that this could all be a case of confusing nomenclature. Here is a post talking about this confusion. Maybe those who work on Global Health & Development are also on the longtermism spectrum but are just not as 'gaga' about it as Strong Longtermists seem to be. After all, it's not as though expected value bets on the future (maybe not the far future) can't be made with a Global Health & Development intervention! If we clarify the nomenclature, then "Longtermism" (or whatever the clarified nomenclature calls it) could become a clearer and simpler explanation for convincing people to contribute to a cause area. Then I would still be fine with EA becoming "Longtermism" (in the clarified nomenclature).

Solving the replication crisis (FTX proposal)

I am really happy to see someone doing something about the replication crisis. Sorry that you didn't get funded. I know very little about FTX or grantmaking in general, so I can't comment on the nature of your proposal or how to improve it. But now that I see someone doing something about the replication crisis, I have updated on the Tractability of this cause area and I am excited to learn more!

This excitement led to some small actions on my end:

  1. I visited the Institute for Replication website and found it to be very helpful. I really appreciate the effort that went into making the Teaching tab on the website. I will try to make time in the near future (within a month or so) to go through the resources carefully.
  2. I subscribed to the BITSS YouTube Channel and skimmed through a couple of chapters of the open source textbook, Reproducible Data Science.
  3. I looked for material on the replication crisis elsewhere on this forum. I found this panel discussion from EA Global 2016 and... that's about it! Since, IMO, there isn't enough EA material on this cause area, I left a comment in the "What posts do you want someone to write?" thread in the hope that someone wading through it for ideas will decide to write more about it.

One thing is still unclear to me - are there career opportunities here or just volunteer opportunities? In the proposal, you mentioned "reproducibility analysts at selected journals" - I had no idea that was a thing people did! But it sounds like a very interesting role to me considering the Scale of the problem. How many people do it, and is there high demand for it? What sort of degree does someone need for it?

All the best with the project! I sincerely hope someone else will fund it and it will be successful.

What posts do you want someone to write?

Write about the replication crisis in the style of an 80k hours problem profile. Basically: describe the problem, apply the SNT (scale, neglectedness, tractability) framework to it, mention orgs currently working on it, mention potential career options for someone who wants to address this problem, etc.

This suggestion came after reading this post.

Can we agree on a better name than 'near-termist'? "Not-longermist"? "Not-full-longtermist"?

From reading this and other comments, I think we should rename longtermists "Temporal radicalists". The rest of the community can be "Temporal moderates" or even "Temporal conservatives" (aka "neartermists") if they are so inclined. I attempt to explain why below.

It looks like there is some agreement that longtermism is a fairly radical idea.

Many (but not all) of the so-called "neartermists" are simply not that radical, and that is why they perceive their moniker to be problematic. One side is radical, and many on the other side are just not that radical while still believing in the fundamental idea.

By "radical", I mean believing in one of the extreme ends of the argument. The rest of the community is not on the other extreme end which is what "neartermism" seems to imply. It looks like many of those not identifying as "longermists" are simply on neither of the extreme ends but somewhere in the spectrum between "longtermists" and "neartermists". I understand now that many who are currently termed "Neartermists" would be willing to make expectation value bets on the future even with fairly low discount rates. From the link to the Berger episode that JackM provided (thanks for that BTW!):

"It’s tied to a sense of not wanting to go all in on everything. So maybe being very happy making expected value bets in the same way that longtermists are very happy making expected value bets, but somehow wanting to pull back from that bet a little bit sooner than I think that the typical longtermist might."

So to overcome the naming issue, we need a way to recognize that there are extreme ends in this argument as well as a middle ground. With this in mind, I would rename current "longtermists" as "Temporal radicalists" while addressing the diversity of opinions in the rest of the community with two different labels, "Temporal moderates" and "Temporal conservatives" (the latter being synonymous with "neartermists"). You can even call yourself a "Temporal moderate leaning towards conservatism" to communicate your position with even more nuance.

PS: Sorry for the many edits. I wanted to write this down before forgetting it and later realized I had not communicated it properly.
