RyanCarey's Shortform

by RyanCarey · 27th Jan 2020 · 28 comments

Translating EA into Republican. There are dozens of EAs in US party politics, Vox, the Obama admin, Google, and Facebook. Hardly any in the Republican party, working for the WSJ, appointed by Trump, or working for Palantir. A dozen community groups in places like NYC, SF, Seattle, Berkeley, Stanford, Harvard, and Yale, but none in Dallas, Phoenix, Miami, the US Naval Laboratory, the West Point Military Academy, etc - the libertarian-leaning GMU economics department being the sole possible exception.

This is despite the fact that people passing through military academies would be disproportionately likely to work on technological dangers in the military and public service, and that these academies are less competitive to get into than more liberal colleges.

I'm coming to the view that, similarly to the serious effort to rework EA ideas to align with Chinese politics and culture, we need to translate EA into Republican, and that this should be a multi-year, multi-person project.

I've thought about this a few times since you wrote it, and I'd like to see what others think. Would you consider making it a top-level post (with or without any additional detail)?

Maybe shortform posts could graduate to being normal posts if they get some number of upvotes?

When someone writes a shortform post, they often intend for it to be less visible. I don't want an automated feature that will often go against the intentions of a post's author.

Do you think they intend for less visibility, or to signal that it's held to a lower standard?

Could be one, the other, neither, or both. But my point is that an automated feature that removes Shortform status erases those differences.

Affector & Effector Roles as Task Y?

Longtermist EA seems relatively strong at thinking about how to do good, and at raising funds for doing so, but relatively weak in affector organs that tell us what's going on in the world, and effector organs that influence the world. Three examples of ways that EAs can actually influence behaviour are:

- working in & advising US nat sec

- working in UK & EU governments, in regulation

- working in & advising AI companies

But I expect this is not enough, and our (a/e)ffector organs are bottlenecking our impact. To be clear, it's not that these roles aren't mentally stimulating - they are. It's just that their impact lies primarily in implementing ideas and uncovering practical considerations, rather than in an ivory tower's pure, deep thinking.

The world is quickly becoming polarised between the US and China, and this means that certain (a/e)ffector organs may be even more neglected than others. We may want to promote: i) work as a diplomat, ii) working at diplomat-adjacent think tanks, such as the Asia Society, iii) working at relevant UN bodies relating to disarmament and bioweapon control, and iv) working at UN-adjacent bodies that press for disarmament, etc. These roles often reside in large entities that can accept hundreds or thousands of new staff at a wide range of skill levels, and so perhaps many people who are currently “earning to give” should move into these “affector” or “effector” roles (as well as those mentioned above, in other relevant parts of national governments). I'm also curious whether 80,000 Hours has considered diplomatic roles - I couldn't find much on a cursory search.

There's a new center in the Department of State, dedicated to the diplomacy surrounding new and emerging tech. This seems like a great place for Americans to go and work, if they're interested in arms control in relation to AI and emerging technology.

Confusingly, it's called the "Bureau of Cyberspace Security and Emerging Technologies (CSET)". So we now have to distinguish the State CSET from the Georgetown one - the "Center for Security and Emerging Technology".

This framing is not quite right, because it implies that there's a clean division of labour between thinkers and doers. A better claim would be: "we have a bunch of thinkers, now we need a bunch of thinker-doers".

Thanks for this. 

I've also been thinking about similar things - e.g. about how there might be a lot of useful things EAs could do in diplomatic roles, and how an 80k career profile on diplomatic roles could be useful. This has partly been sparked by thinking about nuclear risk. 

Hopefully in the coming months I'll write up some relevant thoughts of my own on this and talk to some people. And this shortform post has given me a little extra boost of inclination to do so.

[Maybe a bit of a tangent]

A Brookings article argues that (among other things):

  1. A key priority for the Biden administration should be to rebuild the State Department's arms control workforce, as its current workforce is ageing and there have been struggles with recruiting and retaining younger talent
  2. Another key priority should be "responding to the growing anti-satellite threat to U.S. and allies’ space systems". This should be tackled by, among other things:
    • "tak[ing] steps to revitalize America’s space security diplomacy"
    • "consider[ing] ways to expand space security consultations with allies and partners, and promote norms of behavior that can advance the security and sustainability of the outer space environment"
    • (Note: It's not totally clear to me whether this part of the article is solely about anti-satellite threats or about a broader range of space-related issues.)

This updated me a little bit further towards thinking it might be useful: 

  • for more EAs to go into diplomacy and/or arms control
  • for EAs to do more to support other efforts to improve diplomacy and/or arms control (e.g., via directing funding to good existing work on these fronts)

Here's the part of the article which is most relevant to point 1:

The State Department’s arms control workforce has been under stress for some time due to problems associated with an aging staff and the inability to effectively recruit and retain younger talent. For example, a 2014 State Department Inspector General report on the Bureau of Arms Control, Verification, and Compliance states: “Forty-eight percent of the bureau’s Civil Service employees will be eligible to retire in the next 5 years, the second-highest percentage in the Department of State … Absent a plan to improve professional development and succession planning for the next generation of arms control experts, the bureau is at risk of losing national security expertise vital to its mission.”

Though many of the challenges associated with the arms control workforce pre-date the Trump administration, according to press reports, these trends have accelerated under its watch. As a result, the Biden administration will inherit an arms control workforce that has been hollowed out. A key priority for the incoming team must be to rebuild this workforce. Luckily, the Under Secretary of State for Arms Control and International Security has the authority under the Arms Control and Disarmament Act to hire technical arms control experts through an expedited process. In the near-term, the State Department should take advantage of this and other existing hiring authorities to help rebuild the arms control workforce. Over the longer term, it should work with Congress to determine whether new hiring authorities would help grow and maintain the arms control workforce.

Another relevant one in the US Dept of State.

Hacking Academia.

Certain academic opportunities are much more attractive to the impact-minded than to regular academics, and so may be attractive relative to how competitive they are.

  • The secure nature of EA funding means that tenure is less important (although of course it's still good).
  • Some centers, such as Oxford and GMU, do research on EA-related topics, and are therefore more attractive.
  • Universities in or near capital cities, such as Georgetown, UMD College Park, ANU, Ghent, and Tsinghua, or near other political centers such as NYC and Geneva, may offer a perch from which to provide policy input.
  • Those doing interdisciplinary work may want to apply for a department that's strong in a field other than their own. For example, people working in AI ethics may benefit from centers that are great at AI, even if they're weak in philosophy.
  • Certain universities may be more attractive due to being in an EA hub, such as Berkeley, Oxford, UCL, UMD College Park, etc.

Thinking about an academic career in this way makes me think that more people should pursue tenure at UMD, Georgetown, and Johns Hopkins (good for both biosecurity and causal models of AI) than I thought beforehand.

Which longtermist hubs do we most need? (see also: Hacking Academia)

Suppose longtermism already has some presence in SF, Oxford, DC, London, Toronto, Melbourne, Boston, New York, and is already trying to boost its presence in the EU (especially Brussels, Paris, Berlin), UN (NYC, Geneva), and China (Beijing, ...). Which other cities are important?

I think there's a case for New Delhi, as the capital of India. It's the third-largest country by GDP (PPP), soon to be the most populous country, high-growth, and a neighbour of China. Perhaps we're neglecting it due to founder effects, because it has lower average wealth, because its universities aren't thriving, and/or because it currently has a nationalist government.

I also see a case for Singapore - its government and universities could be a place from which to work on de-escalating US-China tensions. It's physically and culturally not far from China. As a city-state, it benefits a lot from peace and global trade. It's by far the most-developed member of ASEAN, which is also large, mostly neutral, and benefits from peace. It's generally very technocratic with high historical growth, and is also the HQ of APEC.

I feel Indonesia / Jakarta is perhaps overlooked / neglected sometimes, despite it being expected to be the world's 4th largest economy by 2050:

Jakarta - yep, it's also ASEAN's HQ. Worth noting, though, that Indonesia is moving its capital out of Jakarta.

Yes, good point! My idle speculations have also made me wonder about Indonesia at least once.

I'd be curious to discuss whether there's a case for Moscow. 80,000 Hours lists being a Russia or India specialist under "Other paths we're excited about". The case would probably revolve around Russia's huge nuclear arsenal and efforts to build AI. If climate change were to become really bad (say 4+ degrees of warming), Russia (along with Canada and New Zealand) would become the new hub for immigration given its geography - and this alone could make it one of the most influential countries in the world.

EAs have reason to favour Top-5 postdocs over Top-100 tenure?

Related to Hacking Academia.

A bunch of people face a choice between being a postdoc at one of the top 5 universities, and being a professor at one of the top 100 universities. For the purpose of this post, let's set aside the possibilities of working in industry, grantmaking and nonprofits. Some of the relative strengths (+) of the top-5 postdoc route are accentuated for EAs, while some of the weaknesses (-) are attenuated:

+greater access to elite talent (extra-important for EAs)

+larger university-based EA communities, many of which are at top-5 universities

-less secure research funding (less of an issue in longtermist research)

-less career security (less important for high levels of altruism)

-can't be the sole supervisor of a PhD student (less important if one works with a full professor who can supervise, e.g. at Berkeley or Oxford).

-harder to set up a centre (this one does seem bad for EAs, and hard to escape)

There are also considerations relating to EAs' ability to secure tenure. Sometimes, this is decreased a bit due to the research running against prevailing trends.

Overall, I think that some EAs should still pursue professorships, especially to set up research centres or to establish a presence in an influential location, but that we will want more postdocs than usual.

A quite obvious point that may still be worth making is that the balance of the considerations will look very different for different people. E.g. if you're able to have a connection with a top university while being a professor elsewhere, that could change the calculus. There could be numerous idiosyncratic considerations worth taking into account.

I once got the advice from highly successful academics (tenured Ivy League profs) that if you want to become an academic, you should "resist the temptation of the tenure track for as long as possible" and rather do another postdoc.

Once you enter the tenure track, the clock starts ticking, and at the end of it your tenure case will be judged by your total publication record. If you do (another) postdoc before entering the tenure track, you'll have more publications in the pipeline, which will give you a competitive edge. This might also increase your chances of getting a more competitive professorship.

By the same token, it perhaps pays to do pre-doctoral fellowships and master's degrees. This is also important for picking a Euro vs. US PhD: the 3-year Euro PhD might be better for people who do not want to go into academia, whereas the 5-year+ US PhD might be better for academia.

This is probably overstated - at most major US research universities, tenure outcomes are fairly predictable, and tenure is granted in 80-95% of cases. This obviously depends on your field and your sense of your fit with a potential tenure-track job, though.

https://dynamicecology.wordpress.com/2014/07/21/dont-worry-too-much-about-whether-youll-get-tenure-because-you-probably-will/

That said, it is much easier to do research when you're at an institution that is widely considered to be competitive/credible in your field and subfield, and the set of institutions that gets that distinction can be smaller than the (US) top 100 in many cases. So, it may often make sense to go for a postdoc if you think it'll increase your odds of getting a job at a top-10 or top-50 institution.

Possible EA intervention: just like the EA Forum Prizes, but for the best Tweets (from an EA point-of-view) in a given time window.

Reasons this might be better than the EA Forum Prize:

1) Popular tweets have greater reach than popular forum posts, so this could promote EA more effectively

2) The prizes could go to EAs who are not regular forum users, which could also help to promote EA more effectively.

One would have to check the rules and regulations.

The Emergent Ventures Prize is an example of a prize scheme that seems good to me: giving $100k prizes to great blogs, wherever on the internet they're located.

I read every Tweet that uses the phrase "effective altruism" or "#effectivealtruism". I don't think there are many EA-themed Tweets that make novel points, rather than linking to existing material. I could easily be missing Tweets that don't have these keywords, though. Are there any EA-themed Tweets you're thinking of that really stood out as being good?

Tom Inglesby on nCoV response is one recent example from just the last few days. I've generally known Stefan Schubert, Eliezer Yudkowsky, Julia Galef, and others to make very insightful comments there. I'm sure there are very many other examples.

Generally speaking, though, the philosophy would be to go to the platforms that top contributors are actually using, and offer our services there, rather than trying to push them onto ours, or at least to complement the latter with the former.

I agree with this philosophy, but remain unsure about the extent to which strong material appears on various platforms (I sometimes do reach out to people who have written good blog posts or Facebook posts to send my regards and invite them to cross-post; this is a big part of why Ben Kuhn's recent posts have appeared on the Forum, and one of those did win a prize).

Aside from 1000-person-plus groups like "Effective Altruism" and "EA Hangout", are there any Facebook groups that you think regularly feature strong contributions? (I've seen plenty of good posts come out of smaller groups, but given the sheer number of groups, I doubt that the list of those I check includes everything it should.)

*****

I follow all the Twitter accounts you mentioned. While I can't think of recent top-level Tweets from those accounts that feel like good Prize candidates, I think the Tom Inglesby thread is great!

One benefit of the Forum Prize is that it (ideally) incentivizes people to come and post things on the Forum, and to put more effort into producing really strong posts. It also reaches people who deliberately worked to contribute to the community. If someone like Tom Inglesby was suddenly offered, say, $200 for writing a great Twitter thread, it's very unclear to me whether this would lead to any change in his behavior (and it might come across as very odd). Maybe not including any money, but simply cross-posting the thread and granting some kind of honorary award, could be better.

Another benefit: The Forum is centralized, and it's easy for judges to see every post. If someone wants to Tweet about EA and they aren't already a central figure, we might have a hard time finding their material (and we're much more likely to spot, by happenstance, posts made by people who have lots of followers).

That said, there's merit to thinking about ways we can reach out to send strong complimentary signals to people who produce EA-relevant things even if they're unaware of the movement's existence. Thanks for these suggestions!